❌ — AMD Instinct MI300A Can't Run Llama 3.1 405B
The AMD Instinct MI300A has only 128GB of unified HBM3 memory, while Llama 3.1 405B needs roughly 222.8GB. Time to rent or upgrade.
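The ~222.8GB figure is consistent with holding 405B parameters as 4-bit quantized weights plus roughly 10% overhead for KV cache and activations. A minimal sketch of that arithmetic (the 10% overhead factor and the function name are illustrative assumptions, not the site's exact formula):

```python
def estimate_vram_gb(params_billions: float, bits_per_weight: float,
                     overhead: float = 1.10) -> float:
    """Rough VRAM needed to serve a model, in GB.

    params_billions: parameter count in billions (405 for Llama 3.1 405B).
    bits_per_weight: precision of the stored weights (16 = FP16, 4 = 4-bit).
    overhead: multiplier for KV cache / activations (10% is an assumption).
    """
    # 1B parameters at 8 bits each is ~1GB of weights.
    weight_gb = params_billions * (bits_per_weight / 8)
    return weight_gb * overhead

# Llama 3.1 405B at 4-bit with ~10% overhead:
print(round(estimate_vram_gb(405, 4), 1))   # ≈ 222.8 GB
```

At full FP16 precision the same model would need well over 800GB, so even multi-GPU single-node setups typically rely on quantization for 405B-class models.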
Need more VRAM?
Rent cloud GPUs with 222.8GB+ VRAM to run Llama 3.1 405B
Browse Cloud GPUs