H100 NVL Can Run Llama 3.2 1B

The H100 NVL packs 188 GB of VRAM. Llama 3.2 1B needs only ~0.6 GB at Q4 quantization. You're good to go.

✅ Plenty of headroom: the Q4 weights occupy well under 1% of the card's VRAM.

Hardware Specs

GPU: H100 NVL
VRAM: 188 GB
Bandwidth: 3350 GB/s
MSRP: N/A

Model Requirements

Model: Llama 3.2 1B
Parameters: 1B
VRAM (Q4): ~0.6 GB
Context: 128K tokens
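
As a sanity check on the fit verdict above, here is a minimal sketch of how a quantized model's weight footprint can be estimated and compared against the card's VRAM. The 4-bit weight size, the 1.2x overhead factor, and the helper name are illustrative assumptions, not measured values; real usage also includes the KV cache, which grows with context length and batch size.

```python
# Rough footprint of quantized weights vs. available VRAM (a sketch, not a benchmark).
# The 4-bit weight size and 1.2x overhead factor are assumptions; real usage also
# includes the KV cache, which grows with context length and batch size.

def estimate_weight_vram_gb(params_billions: float, bits_per_weight: int = 4,
                            overhead_factor: float = 1.2) -> float:
    """Approximate VRAM needed to hold quantized weights, in GB."""
    bytes_per_weight = bits_per_weight / 8
    return params_billions * 1e9 * bytes_per_weight / 1e9 * overhead_factor

if __name__ == "__main__":
    required = estimate_weight_vram_gb(1.0)  # Llama 3.2 1B at Q4
    available = 188.0                        # H100 NVL VRAM, GB
    print(f"needs ~{required:.1f} GB of {available:.0f} GB -> fits: {required < available}")
```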

Estimated Performance

~5025 tok/s estimated inference speed (Q4 quantization). This is a rough, memory-bandwidth-bound figure (approximately memory bandwidth divided by the quantized model's in-memory size); real throughput also depends on the inference stack, batch size, and context length.
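
The figure above can be reproduced with a one-line calculation. The sketch below assumes decoding is purely memory-bandwidth bound, with every generated token streaming the full set of quantized weights from VRAM once; compute, KV-cache traffic, and batching are ignored, and the function name and the ~0.67 GB in-memory size are assumptions for illustration.

```python
# A minimal sketch of the bandwidth-bound decode estimate, assuming each
# generated token reads all quantized weights from VRAM exactly once
# (compute, KV cache, and batching ignored).

def estimate_tokens_per_second(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Rough upper bound on single-stream decode speed, in tokens/second."""
    return bandwidth_gb_s / model_size_gb

if __name__ == "__main__":
    # H100 NVL bandwidth (3350 GB/s) over Llama 3.2 1B Q4 weights (~0.67 GB in memory)
    print(f"{estimate_tokens_per_second(3350, 0.667):.0f} tok/s")  # ~5000 tok/s
```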