The AMD Instinct MI355X carries 288GB of HBM3E memory. Code Llama 34B needs roughly 18.7GB in its default 4-bit quantization, so it fits with ample headroom. You're good to go.
ollama run codellama:34b
Requires Ollama to be installed on your machine
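The fit check above boils down to comparing the model's memory footprint (plus some runtime overhead for the KV cache and activations) against the GPU's VRAM. A minimal sketch of that logic, assuming a hypothetical `fits_in_vram` helper and an illustrative 2GB overhead figure:

```python
def fits_in_vram(model_gb: float, vram_gb: float, overhead_gb: float = 2.0) -> bool:
    """Return True if the model plus runtime overhead fits in GPU memory.

    model_gb    -- on-disk size of the (quantized) model weights in GB
    vram_gb     -- total GPU memory in GB
    overhead_gb -- illustrative allowance for KV cache and activations
    """
    return model_gb + overhead_gb <= vram_gb


# Code Llama 34B (~18.7GB, 4-bit) on an MI355X (288GB): fits easily.
print(fits_in_vram(18.7, 288.0))   # True
# The same model on a 16GB consumer card would not fit.
print(fits_in_vram(18.7, 16.0))    # False
```

Real headroom requirements grow with context length, since the KV cache scales with the number of tokens, so treat the fixed overhead here as a rough floor rather than a guarantee.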
MSRP: $15,000