Built for r/LocalLLaMA, r/homelab & AI Communities

Stop Guessing.
Know What You Can Run.

The AI hardware compatibility platform built by the community, for the community. Get an instant answer to "Will model X fit on my GPU?"
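The fit question boils down to back-of-envelope arithmetic: weight memory is roughly parameter count times bytes per parameter, plus some headroom for the KV cache and runtime. A minimal illustrative sketch (the constants and helper names here are assumptions for demonstration, not the platform's actual model):

```python
def estimate_vram_gb(params_b: float, bits: int = 16, overhead_gb: float = 1.5) -> float:
    """Rough VRAM estimate in GB: weights (params * bytes/param) plus a
    fixed overhead allowance for KV cache and runtime context.
    Illustrative only; real calculators account for context length,
    architecture, and quantization format."""
    weight_gb = params_b * (bits / 8)  # params in billions -> GB of weights
    return weight_gb + overhead_gb

def fits(params_b: float, vram_gb: float, bits: int = 16) -> bool:
    """True if the rough estimate fits in the given VRAM budget."""
    return estimate_vram_gb(params_b, bits) <= vram_gb

# A 7B model at 4-bit quantization on a 24 GB RTX 3090:
print(fits(7, 24, bits=4))    # ~3.5 GB weights + overhead -> True
# A 70B model at 16-bit precision on the same card:
print(fits(70, 24, bits=16))  # ~140 GB weights -> False
```

This is the kind of estimate the tool automates, with real per-model and per-GPU data instead of a flat overhead guess.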

Reddit Community Special

1 Month Free Pro Access

All premium tools unlocked • No credit card required • Cancel anytime

Or try the quick tool without signing up

What's Included in Your Free Month

No credit card required for the trial
All premium tools unlocked
Shareable configuration URLs
Physics-based performance predictions
Real-time hardware pricing data
Multi-GPU setup planning

What the Community Says

"Finally, a tool that actually tells me if I can run a model before I download 100GB"

r/LocalLLaMA user

RTX 3090 owner

"The inference simulator saved me from buying the wrong GPU. Worth it."

r/homelab member

Building AI server

"Best VRAM calculator I've found. Shareable URLs are perfect for Reddit discussions."

Community contributor

Helping others daily

Common Questions

Do I need a credit card?

Nope! The 1-month trial is completely free, no credit card required. If you like it after the month, you can upgrade to Pro for $9.99/month.

What happens after the free month?

Your account reverts to the free tier. You'll still have access to basic tools, but premium features like the Inference Lab will require a Pro subscription.

Can I share my configurations on Reddit?

Absolutely! All our tools have shareable URLs. Just click the "Share Config" button and paste the link in your Reddit comments.

Is this actually built by the community?

Yes! We're active in r/LocalLLaMA and r/homelab. The tool was built to solve the exact problems we saw people asking about daily. We also accept benchmark contributions from the community.

Ready to Stop Guessing?

Join thousands of AI enthusiasts who already know what they can run

Start Your Free Month

No credit card • Cancel anytime • Full access to all tools

Questions? Find us on r/LocalLLaMA or r/homelab