❌ — RTX 4090 Can't Run Llama 3.2 90B Vision
The RTX 4090 has only 24GB of VRAM, while Llama 3.2 90B Vision needs roughly 49.5GB. To run it, you'll need to rent a cloud GPU or upgrade your hardware.
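As a rough sanity check, the ~49.5GB figure lines up with 90B parameters stored at 4-bit precision (0.5 bytes per parameter) plus about 10% overhead for activations and the KV cache. The exact quantization and overhead factor here are assumptions, not figures published by the calculator:

```python
def required_vram_gb(params_billion: float,
                     bytes_per_param: float = 0.5,  # assumes 4-bit quantized weights
                     overhead: float = 1.10) -> float:  # ~10% for activations/KV cache (assumed)
    """Estimate VRAM needed to load a model, in GB."""
    return params_billion * bytes_per_param * overhead

needed = required_vram_gb(90)   # Llama 3.2 90B Vision -> ~49.5 GB
fits = needed <= 24             # RTX 4090 VRAM -> False
```

Swapping in 2 bytes per parameter (FP16) pushes the requirement near 200GB, which is why even the quantized model far exceeds a single 24GB card.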
Need more VRAM?
Rent cloud GPUs with 49.5GB+ VRAM to run Llama 3.2 90B Vision
Browse Cloud GPUs