The TPU v5p packs 95GB of HBM per chip. Llama 3.3 70B Instruct needs ~42GB at 4-bit quantization. You're good to go.
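That ~42GB figure follows from a common rule of thumb: parameter count times bytes per parameter, plus headroom for the KV cache and activations. A minimal sketch (the 20% overhead factor is an assumption, not a measured value):

```python
def estimate_model_memory_gb(params_billion: float,
                             bytes_per_param: float,
                             overhead: float = 1.2) -> float:
    """Rough accelerator-memory estimate for serving an LLM.

    params_billion  -- model size in billions of parameters
    bytes_per_param -- 2.0 for FP16/BF16, 1.0 for INT8, 0.5 for 4-bit
    overhead        -- assumed multiplier for KV cache / activations
    """
    return params_billion * bytes_per_param * overhead

# Llama 3.3 70B at 4-bit (0.5 bytes/param):
print(round(estimate_model_memory_gb(70, 0.5), 1))  # → 42.0
```

The same call with `bytes_per_param=2.0` shows why the unquantized model (~168GB) does not fit on a single chip.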
Check GPU compatibility for any AI model
Compare pricing across 24+ providers
Side-by-side GPU specs and benchmarks
GPU × Model compatibility matrix