The GB200 NVL72 packs 13,824 GB of VRAM (72 GPUs × 192 GB each). Vicuna 33B needs only ~18.2 GB in its default quantization, so it fits with enormous headroom. You're good to go.
ollama run vicuna:33b
Requires Ollama to be installed on your machine.
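The ~18.2 GB figure is consistent with a roughly 4-bit quantized 33B model plus some runtime overhead. A back-of-envelope sketch (the 4-bit assumption and the 1.10 overhead factor for KV cache and runtime buffers are illustrative guesses, not figures from Ollama):

```python
# Rough VRAM estimate for a quantized LLM.
# Assumption: weights dominate; overhead covers KV cache and runtime buffers.
def estimate_vram_gb(n_params: float, bits_per_weight: float, overhead: float = 1.10) -> float:
    weights_gb = n_params * bits_per_weight / 8 / 1e9  # bytes -> GB (decimal)
    return weights_gb * overhead

# 33B parameters at ~4 bits per weight lands near the quoted ~18.2 GB.
print(f"{estimate_vram_gb(33e9, 4):.1f} GB")
```

Longer context windows grow the KV cache, so treat the overhead factor as a floor, not a ceiling.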