✅ — AWS EC2 P5 Instance (8x H100) Can Run Llama 3.1 8B
The AWS EC2 P5 Instance (8x H100) packs 640 GB of VRAM, while Llama 3.1 8B needs only ~4.4 GB at Q4. You're good to go.
✅ Plenty of headroom
Hardware Specs
- GPU: AWS EC2 P5 Instance (8x H100)
- VRAM: 640 GB
- Memory bandwidth: 26,400 GB/s (aggregate across 8 GPUs)
- MSRP: N/A (cloud instance; billed hourly)
Model Requirements
- Model: Llama 3.1 8B
- Parameters: 8B
- VRAM (Q4): ~4.4 GB
- Context: N/A
Estimated Performance
~4,950 tok/s
Estimated inference speed (Q4 quantization)
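Single-stream decode is typically memory-bandwidth bound: each generated token requires reading the full weight set once, so throughput is capped at bandwidth divided by model size, scaled by a hardware efficiency factor. A rough sketch of that estimate (the efficiency factor is an assumption, not a measured value):

```python
def decode_tok_s(bandwidth_gb_s: float, model_gb: float,
                 efficiency: float = 1.0) -> float:
    """Bandwidth-bound decode estimate: tok/s <= bandwidth / model size,
    scaled by an assumed efficiency factor (1.0 = theoretical ceiling)."""
    return bandwidth_gb_s / model_gb * efficiency

# Ceiling for this pairing: 26,400 GB/s / 4.4 GB = 6,000 tok/s.
# The ~4,950 tok/s estimate above implies ~82.5% effective efficiency.
print(f"{decode_tok_s(26400, 4.4):.0f} tok/s ceiling")
```

Real-world throughput depends on batching, tensor-parallel overhead across the 8 GPUs, and the serving stack, so treat this as an order-of-magnitude estimate.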