The chipmaker, now the most valuable public company in the world, said strong demand for its chips should continue this quarter.
The power usage is massively overstated, and a meme perpetuated by Altman so he’ll get more money for ‘scaling’. And he’s lying through his teeth: there literally isn’t enough silicon capacity in the world for that stupid idea.
GPT-5 is already proof that scaling with no innovation doesn’t work. So are the open-source models, trained and running on peanuts, nipping at its heels.
And tech in the pipeline like BitNet is coming to disrupt that even more; the future is small, specialized, augmented models, mostly running locally on your phone/PC, because it’s so cheap and low-power.
There’s tons of stuff to worry about with LLMs and other generative ML, but future power usage isn’t one of them.
Except none of these companies are making money. Like almost literally none. We’re about three years into the LLM craze, and nobody has figured out how to turn a profit. Hell, forget profit: just not bleeding prodigious piles of cash would be a big deal.
Nods vigorously.
The future of LLMs is basically unprofitable for the actual AI companies. We are in a hell of a bubble, which I can’t wait to pop so I can pick up a liquidation GPU (or at least rent one for cheap).
That doesn’t mean power usage is an existential issue. In fact, it seems like the sheer inefficiency of OpenAI, Grok, and the like is one of the nails in their coffins.
Power usage is what’s sucking the cash. What else could it be? Not all of these companies are building out lots of datacenters the way OpenAI is. They built what they have, and are now trying to make money on it.
The companies that are charging for AI are charging about as much as buyers are willing to pay, but that’s orders of magnitude short of covering their costs. The big cost is power usage.
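For what it’s worth, the revenue-vs-cost claim can at least be framed as arithmetic. Here’s a minimal sketch where every parameter is a hypothetical placeholder I made up (subscription price, queries per month, GPU-seconds per query, power draw, electricity rate, all-in GPU-hour rate) — whether revenue covers cost swings entirely on what you plug in:

```python
# Back-of-envelope per-user serving cost. ALL numbers below are hypothetical
# placeholders, not figures from this thread; swap in your own estimates.

SUBSCRIPTION_USD_PER_MONTH = 20.0   # hypothetical consumer plan price
QUERIES_PER_MONTH = 1500            # hypothetical usage: ~50 queries/day
GPU_SECONDS_PER_QUERY = 5.0         # hypothetical inference time per query
GPU_POWER_KW = 0.7                  # hypothetical draw of one inference GPU
USD_PER_KWH = 0.10                  # hypothetical industrial electricity rate
ALL_IN_USD_PER_GPU_HOUR = 3.0       # hypothetical rate incl. capex, cooling

# Total GPU time this user consumes in a month, in hours.
gpu_hours = QUERIES_PER_MONTH * GPU_SECONDS_PER_QUERY / 3600

# Raw electricity cost vs. the all-in cost of keeping GPUs running.
electricity_cost = gpu_hours * GPU_POWER_KW * USD_PER_KWH
all_in_cost = gpu_hours * ALL_IN_USD_PER_GPU_HOUR

print(f"GPU-hours/user/month:  {gpu_hours:.2f}")
print(f"electricity only:     ${electricity_cost:.2f}")
print(f"all-in serving cost:  ${all_in_cost:.2f}")
print(f"subscription revenue: ${SUBSCRIPTION_USD_PER_MONTH:.2f}")
```

Notably, with placeholders like these the raw electricity line comes out tiny next to the all-in cost, which is why any serious version of this argument has to pin down the parameters rather than hand-wave them.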