April 8, 2026
Mustafa Suleyman: AI development won’t hit a wall anytime soon—here’s why
The compute explosion is the technological story of our time. And it is still only just beginning.

TL;DR
- AI training data and compute have grown by a factor of 1 trillion since 2010.
- Three key advances enable exponential growth: faster processors (e.g., Nvidia chips, Maia 200), high-bandwidth memory (HBM3), and interconnected GPUs forming warehouse-scale supercomputers (NVLink, InfiniBand).
- Language-model training time has fallen dramatically: a benchmark run that took 167 minutes on eight GPUs in 2020 now completes in under four minutes on modern hardware, roughly a 50x speedup.
- Software advances are also crucial, with the compute required for a fixed performance level halving every eight months, making AI deployment radically cheaper.
- AI compute is projected to grow another 1,000x by the end of 2028, with significant new compute capacity coming online annually.
- These advances are expected to yield near-human-level AI agents capable of completing complex tasks.
- Energy consumption is a challenge, but falling solar and battery costs offer a path to clean scaling.
- Massive AI infrastructure projects are underway globally, signaling a shift towards cognitive abundance.
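The growth figures above compound: raw hardware compute is projected to grow ~1,000x in roughly three years, while the compute needed for a fixed performance level halves every eight months. A minimal sketch of that arithmetic (only the two rates are taken from the summary; the function and its parameters are illustrative, not from the article):

```python
# Illustrative arithmetic combining the two rates quoted in the TL;DR:
# ~1,000x hardware growth per ~36 months, and software efficiency
# halving the compute needed for fixed performance every 8 months.

def effective_capability_multiplier(months: float,
                                    hardware_growth_per_3y: float = 1000.0,
                                    efficiency_halving_months: float = 8.0) -> float:
    """Multiplier on 'effective' compute after `months`, combining
    raw hardware growth with algorithmic efficiency gains."""
    hardware = hardware_growth_per_3y ** (months / 36.0)      # ~1,000x per 36 months
    efficiency = 2.0 ** (months / efficiency_halving_months)  # halves every 8 months
    return hardware * efficiency

# After 36 months: 1,000x hardware * 2^(36/8) ≈ 22.6x efficiency
print(round(effective_capability_multiplier(36)))  # → 22627
```

Under these assumed rates, the effective gain over three years is not 1,000x but closer to ~22,000x, which is why hardware and software progress together matter more than either alone.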