
A bunch of this week’s OpenAI announcements address monetization, actually.


If they charged users what it actually costs to run their service, almost nobody would use it.


It's this, and it's really funny to see users here argue about how the revenue is really good and whatnot.

OpenAI is only alive because it's heavily subsidizing the actual cost of the service they provide using investor money. The moment investor money dries up, or the tech industry stops trading money to artificially pump the market, or people realize they've hit a dead end, it crashes and burns with the intensity of a large bomb.


> The moment investor money dries up, or the tech industry stops trading money to artificially pump the market, or people realize they've hit a dead end, it crashes and burns with the intensity of a large bomb.

You have hit the nail on the head. To me, it is natural for investor money to dry up, since nobody should believe that things will always go the right way, yet it seems that OpenAI and many others are right on the edge... so really it's a matter of when, not if.

So in essence this is a time bomb. Tick tock, the clock starts now, and they might be desperate because of it, as the article notes.


Inevitably it will get jammed with ads until barely profitable. Instead of being able to just cut and paste the output into your term paper, you're going to have to comb through it to remove every instance of "Mountain Dew is for me and you!"


Open-weights models exist too, and there is no moat if OpenAI does this. Then again, maybe most people couldn't figure out any AI product other than ChatGPT, but for plenty of them it could simply mean switching to any other provider that isn't enshittified, if that ever happens.

It would also lose them the API business, though I assume you are saying they would keep good ad-free models on the API and put ad-riddled models in the free tier.

The funny thing is that maybe we already have it, just more subtly. Who knows, food for thought :)


I thought it was starting when Ilya said, about a year ago, that scaling had plateaued. Now confirmed with GPT-5. Now they'll need to sell a pivot from AGI to productization of what they already have, at a valuation that implies reaching AGI?


Reminds me of MoviePass or early-stage Uber. Everything is revolutionary and amazing when VCs are footing the bill. Once you have to contend with market pricing, things tend to change.


MoviePass had to close, but Uber's running a profit these days. The days of $1 Ubers were clearly unsustainable, but unless you're inside OpenAI or Anthropic, we're all just guessing how much inference actually costs them to run. Some people have more detailed analyses than others. Most aren't quite so rude and angry as Mr Zitron, though, preferring to let their work speak for itself rather than winning you over by telling you what you want to hear while yelling at you.


I don't believe this for a second. Inference margins are huge; if they stopped R&D tomorrow they would be making an incredible amount of money, but they can't stop investing because they have competitors.

It's all pretty simple.


Do we have any evidence that inference costs are less than the subscription price?


No, and nobody claiming this seems to be posting evidence when confronted.

I guess we'll find out in 2-3 years.


I don't know if the site is just broken for me at the moment, but it used to track how much people were costing the companies (based on tokens per dollar, if I remember correctly): https://www.viberank.app/

One user in particular ran up a $50k bill for 1 month of usage, while paying $200 for the month: https://bsky.app/profile/edzitron.com/post/3lwimmfvjds2m

Plenty of people are able to blow through resources pretty quickly, especially when you have non-deterministic output and have to hit "retry" a few times, or go back and forth with the model until you get what you want, where each request adds to the total tokens used in the interaction.
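
For a rough sense of how that compounds, here's a back-of-the-envelope sketch in Python. The per-token prices are assumptions for illustration only, not anyone's actual costs:

    # Rough back-of-the-envelope: API-equivalent cost of one chat session.
    # Per-token prices here are assumptions for illustration, not real figures.
    PRICE_IN = 2.50 / 1_000_000    # assumed $ per input token
    PRICE_OUT = 10.00 / 1_000_000  # assumed $ per output token

    def session_cost(turns):
        """turns: list of (input_tokens, output_tokens), one per request, retries included."""
        return sum(i * PRICE_IN + o * PRICE_OUT for i, o in turns)

    # Each retry or follow-up re-sends the growing context, so input tokens keep climbing:
    print(f"${session_cost([(2_000, 800), (3_000, 900), (4_200, 1_000)]):.2f}")  # ~$0.05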

AI companies have been trying to clamp down, so far unsuccessfully, and it may never be completely possible without alienating all of their users.


This is unrelated to the original assertion: "If they charged users what it actually costs to run their service, almost nobody would use it."

5 million paying customers out of 800 million overall active users is an absolutely abysmal conversion rate (roughly 0.6%). And that's counting the bulk deals with extreme discounts (like 2.50 USD/month/seat), which can only be profitable if a significant number of those seats never use ChatGPT at all.


You're assuming that if ChatGPT suddenly cost $20 a month to access at all, 795 million people would never talk to ChatGPT again?


They won’t just charge users. They will charge brands and retailers.



