The GeForce GTX 1060 5 GB has 5 GB of VRAM, and Llama 3.1 8B needs about 4.4 GB at the default 4-bit quantization. It fits, but the leftover VRAM goes to the KV cache, so expect to run with a short context window.
ollama run llama3.1:8b
(Requires Ollama installed on your machine.)
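To see why context length is the constraint, here is a back-of-the-envelope sketch of the memory math. The figures are assumptions, not measurements: ~4.4 GB for 4-bit weights (from the text above), an fp16 KV cache, Llama 3.1 8B's published shape (32 layers, 8 KV heads, head dim 128), and a hypothetical ~0.3 GB runtime overhead.

```python
def kv_cache_bytes(tokens, layers=32, kv_heads=8, head_dim=128, dtype_bytes=2):
    # K and V tensors, per layer, per token (fp16 = 2 bytes per element)
    return 2 * layers * kv_heads * head_dim * dtype_bytes * tokens

def fits(vram_gb, weights_gb, tokens, overhead_gb=0.3):
    # Rough fit check: weights + KV cache + assumed runtime overhead
    need_gb = weights_gb + kv_cache_bytes(tokens) / 1e9 + overhead_gb
    return need_gb <= vram_gb

print(fits(5.0, 4.4, 2048))  # → True: a 2K context squeezes in
print(fits(5.0, 4.4, 8192))  # → False: an 8K context spills past 5 GB
```

Under these assumptions the KV cache costs about 128 KiB per token, which is why a 2K context fits on 5 GB while an 8K context does not.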