AI Technology Compounding = Numbers Behind The Momentum

…360% Annual Growth Over Fifteen Years of Compute to Train AI Models Led To…

[Chart: Training Compute (FLOP*) for Key AI Models, 1950-2025, per Epoch AI. Trend line: +360% / Year; most recent model labeled: Grok 3.]

*A FLOP (floating point operation) is a basic unit of computation used to measure processing power, representing a single arithmetic calculation involving decimal numbers. In AI, total FLOPs are often used to estimate the computational cost of training or running a model.

Note: Only language models shown (per Epoch AI, includes state-of-the-art improvement on a recognized benchmark, >1K citations, historically relevant, with significant use).

Source: Epoch AI (5/25)
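To give a sense of scale for the compounding the chart describes, a minimal sketch of the arithmetic, assuming "+360% / Year" means training compute multiplies by 4.6x each year (the exact growth rate and interval are Epoch AI's, not derived here):

```python
# Hypothetical illustration: +360% annual growth means each year's compute
# is 4.6x the prior year's (1 + 3.60).
annual_growth = 3.60   # +360% per year, per the chart's trend line
years = 15             # the fifteen-year window in the slide title

# Cumulative multiple over the full window
multiple = (1 + annual_growth) ** years
print(f"{multiple:.3g}")  # roughly 8.7e9, i.e. about ten orders of magnitude
```

At that rate, fifteen years of compounding multiplies training compute by roughly ten billion, which is why the chart's y-axis must span many orders of magnitude.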
