You can't average out the userbase like that because the individual usage of the service varies wildly, and advertising revenue is directly tied to amount of usage.
Especially because OpenAI highly inflates user figures.
> It's not orders of magnitude more expensive
This too is skewed by averaging with users who barely use the service.
>You can't average out the userbase like that because the individual usage of the service varies wildly
Yes, you can. This is how Meta, Google, et al. report their numbers. Obviously I'm not expecting each user to bring in exactly $8. The point is that the value they need to extract from their free users to be profitable is very small and very achievable. You, and many people here, have completely incorrect notions about how expensive inference is. Inference is cheap, and has been for some time now.
>and advertising revenue is directly tied to amount of usage.
OpenAI, with 800M weekly active users, processes 2.6B messages per day. Google, with ~5 billion users, processes ~14 billion searches per day.
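To make the comparison concrete, here is a quick back-of-the-envelope calculation from those two quoted figures (note the slight mismatch in time bases: OpenAI's denominator is weekly active users, Google's is total users):

```python
# Per-user daily usage, based on the figures quoted above.
openai_msgs_per_day = 2.6e9
openai_weekly_users = 800e6
google_searches_per_day = 14e9
google_users = 5e9

openai_per_user = openai_msgs_per_day / openai_weekly_users  # ~3.25 messages/day
google_per_user = google_searches_per_day / google_users     # ~2.8 searches/day
```

So by these numbers, usage per active user is in the same low-single-digits-per-day ballpark for both services.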
>This too is skewed by averaging with users who barely use the service.
No, it's not. Inference is just not that expensive. Model costs have fallen by several orders of magnitude in the last few years. Sure, in 2020 this would have been a very serious concern. In 2025, it just isn't.
My point is that for these purposes, users are not fungible. You can't just divide the cost-revenue equation by the number of users N on both sides.
> No it's not.
If you add a pile of fictitious users to the user count, the apparent average cost per user drops, because the fictitious users don't use the service and don't add costs of their own. This lowers the apparent per-user revenue you need.
However, since fictitious users also don't generate revenue, this is all smoke and mirrors.
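A toy sketch of the dilution effect being described here (all numbers are hypothetical, chosen only to illustrate the mechanism):

```python
# Illustrative only: hypothetical figures, not anyone's actual costs.
total_inference_cost = 1_000_000   # $ per day, assumed
real_users = 10_000_000
padded_users = 40_000_000          # 30M added "users" who never send a message

cost_per_real_user = total_inference_cost / real_users      # $0.10/day
cost_per_padded_user = total_inference_cost / padded_users  # $0.025/day
# The apparent per-user revenue target drops 4x, but the added users
# contribute zero revenue, so the actual profitability gap is unchanged.
```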
>My point is that for these purposes, users are not fungible. You can't just divide the cost-revenue equation by the number of users N on both sides.
Again, yes you can, if you're simply trying to gauge the relative level of value you need to extract from your users. It's not a complicated idea. $8 is well below the per-user revenue Google and Meta report. You were wrong. They don't need to reach a high bar. End of story.
>If you add a pile of fictitious users to the user count, the apparent average cost per user drops, because the fictitious users don't use the service and don't add costs of their own. This lowers the apparent per-user revenue you need.
As always, nonsensical hypotheticals are just that. Nonsensical.
Not only could the users you're describing not exist in reality, the numbers being thrown around are literally based on OpenAI's weekly active users.
There are multiple ways to do the computation. All of them show LLMs having unit economics at least an order of magnitude better than search engines for the search-engine use case[0], not multiple orders of magnitude worse, as you claim. You're off by at least three orders of magnitude.
Ad-supported LLM Chatbots will be one of the most lucrative businesses ever.