The H100 NVL tops out at 188GB of VRAM, while Llama 3.1 405B needs ~222.8GB. Time to rent or upgrade.
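The arithmetic behind that verdict can be sketched in a few lines. This is a rough back-of-the-envelope estimate, not the site's actual calculator: the `estimate_vram_gb` helper and its 10% overhead factor are assumptions, chosen because 405B parameters at 4-bit weights (0.5 bytes/param) plus ~10% overhead lands near the ~222.8GB figure quoted above.

```python
def estimate_vram_gb(params_billion: float,
                     bytes_per_param: float,
                     overhead: float = 1.10) -> float:
    """Rough VRAM estimate: weight memory times a flat overhead
    factor for KV cache and activations (assumed ~10%)."""
    return params_billion * bytes_per_param * overhead

# Llama 3.1 405B with 4-bit quantized weights (0.5 bytes per parameter)
needed = estimate_vram_gb(405, 0.5)
h100_nvl = 188.0  # H100 NVL total VRAM in GB

print(f"need ~{needed:.1f} GB, have {h100_nvl:.0f} GB, "
      f"fits: {needed <= h100_nvl}")
```

Swapping in 2 bytes/param (FP16) or 1 byte/param (8-bit) shows why quantization matters: the same model ranges from roughly 890GB down to the ~223GB quoted here.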
Need more VRAM?
Rent cloud GPUs with 222.8GB+ VRAM to run Llama 3.1 405B
Check GPU compatibility for any AI model
Compare pricing across 24+ providers
Side-by-side GPU specs and benchmarks
GPU × Model compatibility matrix