TPU v6e Can Run Llama 3.1 405B

The TPU v6e offers 256 GB of high-bandwidth memory (HBM). Llama 3.1 405B needs ~222.8 GB at Q4 quantization. You're good to go.

✅ Plenty of headroom
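
As a rough sanity check, the ~222.8 GB figure can be reproduced from the parameter count: 4-bit weights are about 0.5 bytes per parameter, and adding roughly 10% for KV cache and runtime buffers lands on the same number. The 10% overhead factor in the sketch below is an assumption for illustration, not a published spec.

```python
# Rough memory estimate for running Llama 3.1 405B at Q4 quantization.
# The ~10% overhead factor (KV cache, activations, runtime buffers) is an
# assumption chosen to illustrate how the ~222.8 GB figure can arise.

PARAMS = 405e9               # parameter count
BYTES_PER_PARAM_Q4 = 0.5     # 4-bit weights ~= 0.5 bytes per parameter
OVERHEAD = 1.10              # assumed overhead for KV cache / activations

TPU_V6E_MEMORY_GB = 256      # memory listed in the hardware specs below

def q4_footprint_gb(params: float) -> float:
    """Approximate memory needed to hold the model at Q4."""
    return params * BYTES_PER_PARAM_Q4 * OVERHEAD / 1e9

needed = q4_footprint_gb(PARAMS)
print(f"Model footprint: ~{needed:.1f} GB")                      # ~222.8 GB
print(f"Headroom on 256 GB: ~{TPU_V6E_MEMORY_GB - needed:.1f} GB")
```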

Hardware Specs

Accelerator: TPU v6e
Memory (HBM): 256 GB
Memory bandwidth: 4,500 GB/s
MSRP: N/A (cloud-only; not sold at retail)

Model Requirements

Model: Llama 3.1 405B
Parameters: 405B
Memory required (Q4): ~222.8 GB
Context: N/A

Estimated Performance

Estimated inference speed (Q4 quantization): ~17 tok/s
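
The ~17 tok/s number is consistent with a memory-bandwidth-bound decode estimate: generating each token requires streaming the quantized weights once, so tokens/s ≈ bandwidth / model size, scaled by an efficiency factor. The 85% efficiency factor below is an assumption for illustration.

```python
# Bandwidth-bound decode estimate: each new token streams the full set of
# quantized weights from memory, so throughput is roughly bandwidth divided
# by model size. The 85% efficiency factor is an assumption for illustration.

BANDWIDTH_GB_S = 4500        # memory bandwidth from the hardware specs above
MODEL_SIZE_GB = 222.8        # Q4 footprint from the model requirements above
EFFICIENCY = 0.85            # assumed fraction of peak bandwidth achieved

tokens_per_second = BANDWIDTH_GB_S / MODEL_SIZE_GB * EFFICIENCY
print(f"Estimated decode speed: ~{tokens_per_second:.0f} tok/s")  # ~17 tok/s
```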