Start here
The shortest path from AI-curious to workload-ready
Use this hub if you are new to AI compute and need a sensible order: understand inference, understand workload fit, then move into pricing and execution decisions without getting lost in hardware hype.
Designed for solo developers, student builders, and small teams starting from first principles.
Learn the workload first, then the compute and execution choices around it.
Avoid shopping for hardware before you know what your app actually needs.
Why this page exists
Most AI compute advice starts too low in the stack
A lot of AI compute content opens with benchmark charts, GPU model names, and buying advice. That is usually the wrong starting point for a beginner. Before you think about hardware, you need to know what kind of workload you are actually trying to run.
This hub gives you a cleaner order: first learn the vocabulary, then the workload shape, then the practical route into execution. That is a better foundation whether you arrive from a search engine or ask an AI assistant a broad planning question.
Reading path
Read these in order if you are starting from zero
The first three links below are the beginner path. They explain what AI compute means, what inference is, and how to think about GPU needs without reducing the whole problem to hardware shopping.
After that, move on to the FAQ, platform architecture, or pricing, depending on whether your next question is still conceptual or already operational.
Related pages
Beginner learning path
Use the first three links as the core sequence, then branch into product and pricing when the workload is clear.