B100 Can Run Llama 3.2 1B

The B100 packs 192 GB of VRAM. Llama 3.2 1B needs only ~0.6 GB at Q4. You're good to go.

✅ Plenty of headroom

Hardware Specs

GPU: B100
VRAM: 192 GB
Memory bandwidth: 3350 GB/s
MSRP: N/A (not publicly listed)

Model Requirements

Model: Llama 3.2 1B
Parameters: 1B
VRAM (Q4): ~0.6 GB
Context: N/A
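
As a sanity check on the ~0.6 GB figure: 4-bit quantization stores roughly half a byte per parameter, so a 1B-parameter model needs about 0.5 GB for weights plus a little overhead for runtime buffers and KV cache. A minimal sketch of that fit check is below; the 20% overhead factor is an assumption used for illustration, not the exact formula behind the number above.

```python
# Rough fit check: does a Q4-quantized model fit in a GPU's VRAM?
# Assumption (not from the page): 4-bit weights = 0.5 bytes/param,
# plus ~20% overhead for runtime buffers and KV cache.

def q4_vram_gb(params_billion: float, overhead: float = 1.2) -> float:
    """Approximate VRAM footprint of a model at 4-bit quantization."""
    bytes_per_param = 0.5                      # 4 bits = 0.5 bytes
    return params_billion * bytes_per_param * overhead

gpu_vram_gb = 192.0        # B100
need_gb = q4_vram_gb(1.0)  # Llama 3.2 1B (listed as 1B parameters)
print(f"needs ~{need_gb:.1f} GB of {gpu_vram_gb} GB -> fits: {need_gb <= gpu_vram_gb}")
# -> needs ~0.6 GB of 192.0 GB -> fits: True
```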

Estimated Performance

Estimated inference speed (Q4 quantization): ~5025 tok/s
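
Single-stream decoding of a model this small is memory-bandwidth-bound: every generated token requires streaming the full quantized weight set out of VRAM, so tokens per second is roughly bandwidth divided by the weight footprint. The sketch below applies that roofline reasoning; the 0.9 sustained-bandwidth efficiency is an assumption chosen because it reproduces the figure above, not a documented part of the estimate.

```python
# Bandwidth-bound ("roofline") estimate of single-stream decode speed.
# Assumption (not from the page): each token reads all weights once and the
# GPU sustains ~90% of its peak memory bandwidth.

def decode_tok_per_s(bandwidth_gb_s: float, weights_gb: float,
                     efficiency: float = 0.9) -> float:
    """Approximate tokens/s when decode is limited by streaming the weights."""
    return bandwidth_gb_s * efficiency / weights_gb

print(f"~{decode_tok_per_s(3350.0, 0.6):.0f} tok/s")  # B100 bandwidth, Q4 footprint -> ~5025
```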