An AI rumor you won’t want to miss - by Gary Marcus

2024-11-10 22:00:05

As recently as a few days ago, Sam Altman was still selling scaling as if it were infinite, in an interview with the CEO of Y Combinator. “When we started, the core beliefs were that deep learning works and it gets better with scale… predictably… A religious level belief… was… that that wasn’t going to stop… Then we got the scaling results… At some point you have to just look at the scaling laws and say we’re going to keep doing this… There was something really fundamental going on. We had discovered a new square in the periodic table.”

But, as I have been saying since 2022’s “Deep Learning is Hitting a Wall”, “scaling laws” are not physical laws. They are merely empirical generalizations that held for a certain period of time, when there was enough fresh data and compute. Crucially, there has never been any principled argument that they would solve hallucinations, reasoning, or edge cases in open-ended worlds, or that synthetic data would suffice indefinitely to keep feeding the machine.
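To see why an empirical generalization is not a physical law, consider how a scaling “law” is actually obtained: you fit a straight line to observed loss-versus-compute points on a log-log plot. A minimal sketch (with entirely hypothetical numbers, not anyone’s real training data) shows that the fit describes the points you have, and says nothing about whether the trend continues past them:

```python
import numpy as np

# Hypothetical (compute, loss) observations that happen to follow
# a power law: loss = 2.5 * compute^(-0.05). Real curves are noisier.
compute = np.array([1e18, 1e19, 1e20, 1e21, 1e22])
loss = 2.5 * compute ** -0.05

# A power law is a straight line in log-log space, so fit one there.
slope, intercept = np.polyfit(np.log(compute), np.log(loss), 1)

# The fit recovers the exponent and prefactor of the observed range...
print(f"exponent ~ {slope:.3f}, prefactor ~ {np.exp(intercept):.2f}")

# ...but extrapolating to 10x more compute is pure assumption: nothing
# in the regression guarantees the trend holds outside the fitted data.
predicted = np.exp(intercept) * (1e23) ** slope
print(f"extrapolated loss at 1e23 FLOPs: {predicted:.3f}")
```

The extrapolation step is exactly where the “religious level belief” comes in: the line is drawn from past data, and whether fresh data, compute, or synthetic substitutes keep it on trend is an empirical bet, not a law.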

As I wrote yesterday (and first noted back in April), I strongly suspect that, contra Altman, we have in fact reached a point of diminishing returns for pure scaling. Just adding data and compute and training longer worked miracles for a while, but those days may well be over.
