Mustafa Suleyman: AI development won’t hit a wall anytime soon—here’s why
What Happened
We evolved for a linear world. If you walk for an hour, you cover a certain distance. Walk for two hours and you cover double that distance. This intuition served us well on the savannah. But it fails catastrophically when confronting AI and the core exponential trends at its heart.
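The gap Suleyman describes is easy to see with toy numbers. A minimal sketch (the quantities are illustrative, not real estimates of AI progress):

```python
# Linear growth: constant increment per step (like distance walked per hour).
# Exponential growth: constant multiplier per step (like a capability
# doubling each period). Values here are purely illustrative.
steps = range(11)
linear = [1 * t for t in steps]        # 0, 1, 2, ... 10
exponential = [2 ** t for t in steps]  # 1, 2, 4, ... 1024

# After 10 steps, linear intuition predicts ~10x; the exponential is ~1000x.
print(linear[10], exponential[10])  # → 10 1024
```

Ten steps in, the two curves differ by two orders of magnitude, which is exactly why linear intuition misjudges exponential processes.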
Our Take
He's right, but so what? Exponential trends breaking linear intuition isn't exactly a revelation at this point. We've known scaling works—every LLM lab is built on this assumption.
The actual walls we're hitting? Training-data exhaustion and compute costs getting absurdly expensive. Scaling curves flatten once you've vacuumed up the internet. Those are the real constraints, not some theoretical plateau.
This feels like explaining exponentials to people who just discovered logarithms. True, but we moved past this debate in 2023.
What To Do
Stop talking about scaling walls and start modeling training data depletion curves.
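A depletion-curve model can be as simple as a fixed stock of usable tokens against exponentially growing training demand. A toy sketch, with entirely made-up numbers (the stock size, starting demand, and growth factor are assumptions for illustration, not real estimates):

```python
# Hypothetical training-data depletion model: a fixed stock of usable tokens
# is consumed by training runs whose demand grows by a constant factor per
# period. All parameter values below are illustrative assumptions.

def periods_until_depleted(stock_tokens: float,
                           initial_demand: float,
                           growth: float) -> int:
    """Count full periods of training before cumulative demand exceeds the stock."""
    consumed, demand, period = 0.0, initial_demand, 0
    while consumed + demand <= stock_tokens:
        consumed += demand      # spend this period's tokens
        demand *= growth        # next run demands `growth`x more data
        period += 1
    return period

# Made-up example: 3e14-token stock, 1e12 tokens in the first run, 2x growth.
print(periods_until_depleted(3e14, 1e12, 2.0))  # → 8
```

With demand doubling each period, the stock lasts only a handful of doublings regardless of how large it looks at the start; that is the shape of the constraint the "What To Do" line points at.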