The RTX 4090 packs 24GB of VRAM, and Qwen3 32B needs roughly 17.6GB, leaving about 6.4GB of headroom for context and overhead. You're good to go.
```shell
ollama run qwen3:32b
```

Requires Ollama installed on your machine.
MSRP: $1,599
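The fit check above is simple arithmetic: weights plus a working budget for the KV cache and runtime overhead must stay under total VRAM. Here is a minimal sketch using the figures from the text; the `fits` helper and the 2GB overhead budget are illustrative assumptions, not part of any real tool:

```python
GPU_VRAM_GB = 24.0   # RTX 4090
MODEL_GB = 17.6      # Qwen3 32B weight footprint, per the text

def fits(gpu_vram_gb: float, model_gb: float, overhead_gb: float = 2.0) -> bool:
    """True if model weights plus a rough KV-cache/overhead budget fit in VRAM.

    The overhead figure is a placeholder; real usage scales with context length.
    """
    return model_gb + overhead_gb <= gpu_vram_gb

print(fits(GPU_VRAM_GB, MODEL_GB))  # True: 17.6 + 2.0 <= 24.0
```

Swapping in a 16GB card (e.g. `fits(16.0, 17.6)`) returns False, which is the whole point of checking before you buy.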