❌ — GeForce RTX 4090 D Can't Run Llama 3.1 70B
The GeForce RTX 4090 D has only 24 GB of VRAM, while Llama 3.1 70B needs roughly 38.5 GB. To run this model, rent a cloud GPU or upgrade your hardware.
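The ~38.5 GB figure appears consistent with a common rule of thumb: parameter count × bytes per parameter (4-bit quantization here), plus roughly 10% overhead for activations and runtime context. A minimal sketch of that estimate, assuming the 10% overhead factor:

```python
def estimate_vram_gb(params_billion: float, bits: int, overhead: float = 1.10) -> float:
    """Rough VRAM estimate in GB: weights at the given precision plus
    an assumed ~10% overhead for activations and runtime context."""
    weight_gb = params_billion * bits / 8  # e.g. 70B at 4-bit -> 35 GB of weights
    return weight_gb * overhead

# Llama 3.1 70B at 4-bit quantization under these assumptions:
needed = estimate_vram_gb(70, 4)   # ≈ 38.5 GB
fits_on_4090d = needed <= 24       # False: 24 GB is not enough
```

Actual usage also grows with context length (KV cache), so treat this as a lower bound rather than an exact requirement.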
Need more VRAM?
Rent a cloud GPU with at least 38.5 GB of VRAM to run Llama 3.1 70B.