

It’s frequency based. Voltage can’t be used to measure time in any way I can think of.
Entirely agree with that. Except to add that so is Dario Amodei.
I think it’s got potential, but the cost and the accuracy are two pieces that need to be addressed. DeepSeek is headed in the right direction, only because they didn’t have the insane dollars that Microsoft and Google throw at OpenAI and Anthropic respectively.
Even with massive efficiency gains, though, the hardware market is going to do well if we’re all running local models!
No, that’s the thing. There’s still significant expenditure to simply respond to a query. It’s not like Facebook, where it costs $1 million to build and $0.10/month for every additional user. It’s $1 billion to build and $1 per query. There’s no recouping the cost at scale like with previous tech innovations. The more use it gets, the more it costs to run, in a straight line, not asymptotically.
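To make the scaling point concrete, here’s a rough back-of-envelope sketch in Python using the illustrative figures above. The build costs, the $0.10/user/month hosting number, the $1/query figure, and the queries-per-user assumption are all hypotheticals from the comment, not real unit economics.

```python
# Back-of-envelope comparison of two cost models (illustrative numbers only):
#   "classic web" product: big one-time build cost, tiny marginal cost per user
#   "LLM" product: huge build cost, plus a dollar of compute for every query

def classic_annual_cost(users, months=12):
    # $1M to build, $0.10 per user per month to serve
    return 1_000_000 + 0.10 * users * months

def llm_annual_cost(users, queries_per_user_per_month=100, months=12):
    # $1B to build, $1 per query served (assumed usage rate is hypothetical)
    return 1_000_000_000 + 1.0 * users * queries_per_user_per_month * months

for users in (10_000, 1_000_000, 100_000_000):
    classic = classic_annual_cost(users)
    llm = llm_annual_cost(users)
    print(f"{users:>11,} users | classic per-user: ${classic / users:>10,.2f}"
          f" | llm per-user: ${llm / users:>10,.2f}")
```

Under these assumptions, the classic product’s per-user cost collapses toward its $1.20/year marginal cost as users grow, while the LLM product’s per-user cost never falls below roughly $1,200/year, because every query keeps costing money. That’s the “straight line, not asymptotic” point.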
AI is a commodity but the big players are losing money for every query sent. Even at the $200/month subscription level.
Tech valuations are based on scaling: ARPU grows with every user added, and it costs roughly the same to serve 10 users as 100. ChatGPT, Gemini, Copilot, and Claude all cost more the more they’re used. That’s the bubble.
There’s OpenAI, Google, and Meta (American), Mistral (French), and Alibaba and DeepSeek (Chinese). Many more smaller companies either make their own models or further fine-tune specialized models from the big ones.
Which ones are not actively spending an amount of money that scales directly with the number of users?
I’m talking about the general-purpose LLM AI bubble, wherein people are expected to return tremendous productivity improvements by using an LLM, thus justifying the obscene investment. Not ML as a whole. There’s a lot there, such as the work your colleagues are doing.
But it’s being treated as the equivalent of electricity, and it is not.
ChatGPT loses money on every query their premium subscribers submit. They lose money when people use Copilot, which they resell to Microsoft. And it’s not like they’re going to make it up on volume - heavy users are significantly more costly.
This isn’t unique to ChatGPT.
Yes, it has its uses; no, it cannot continue in the way it has so far. Is it worth more than $200/month to you? Microsoft is tearing up datacenter deals. I don’t know what the future is, but this ain’t it.
ETA: I think that management gets the most benefit, by far, and that’s why there’s so much talk about it. I recently needed to lead a meeting and spent some time building the deck with an LLM; it took me 20 min to do something that otherwise would have taken over an hour. When that is your job alongside responding to emails, it’s easy to see the draw. Of course, many of these people are in Bullshit Jobs.
I’ve seen it grow low. Like thyme.
Midwestern edition, made into squares with marshmallows perhaps?
I in no way can relate to this. In fact, the microwave is the most visible clock in the house, given its location. As such, it always gets set again if the power blinks out or whatever.