The value I get for that $20/month is astonishing. It's by far the best discretionary subscription I've ever had.
That scares me. I hate moats and actively want out. Running the uncensored 70B parameter Llama 2 model on my MacBook is great, but it's just not a competitive enough general intelligence to entirely substitute for GPT-4 yet. I think our community will get there, but the surrounding water is deepening, and I'm nervous...
tentatively called “Claude-Next” — that is 10 times more capable than today’s most powerful AI, according to a 2023 investor deck TechCrunch obtained earlier this year.
this is the thing that scares me.
when do these models stop getting smarter? or at least slow down?
I have been struggling to understand the use case. Would you still be using it to ask questions and receive answers, just with the benefit of not having to go to their website and being able to use the GPT-4 model? Or is there another use case where the API returns specific data?
Your comment reminds me a lot of how I felt when I was given my first email address in 1991 and didn't have anyone to send email to. Little did we all know how important email would become. =)
I'm forming a new company. I was asked for a single sentence to describe the company as well as a longer paragraph. I wrote the single sentence and then fed it into chat.openai.com ChatGPT 3.5 (before I paid) and asked for a longer version.
What it came up with was a bit too heavy on the adjectives (the tone read too much like marketing copy), but wow, it nailed the general concepts of what I'm building. This was all without knowing anything about the business. I can easily edit it back down to a tone that's easier for people to digest.
When I fed the same input into ChatGPT4, the results were 100x better.
I also like to use it to summarize my thoughts. I can write down a bunch of unfiltered gibberish about how I'm feeling today and what I'm thinking. Feed it in and it'll give me a great summary of what I just said in a non-threatening and neutral tone. Having gone through a lot of professional therapy (and even being married to one in a past life), it feels a lot like that to me... except a lot less expensive.
Unless you're an extremely heavy user, it's cheaper to just use the API. I've been tempted to do that, but OpenAI doesn't have a free trial for me to see the quality of GPT-4 first.
I'm astonished how often this comes up and also how wrong it is.
The GPT-4 API costs ballpark $0.05 per 1,000 tokens. If you include a rolling context window (which you basically HAVE to do if you want to maintain a persistent conversation), you will easily meet or exceed 1,000 tokens per query.
ChatGPT Pro gives you 50 GPT-4 queries every three hours. If you're using it all day, you might average about 100 queries daily. The GPT-4 API would run you approximately five dollars a day for the same thing; that's $150 a month as opposed to the flat $20.
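The arithmetic above can be sketched as a quick cost model. This is a back-of-envelope estimate, not a quote: the $0.05/1K blended rate, 100 queries/day, and ~1,000 tokens per query are the figures from this comment, not measured usage.

```python
# Rough monthly cost of heavy GPT-4 API use with a rolling context window.
# PRICE_PER_1K is the blended ballpark figure from the comment above;
# actual pricing bills prompt and completion tokens at different rates.
PRICE_PER_1K = 0.05  # $ per 1,000 tokens (assumed blended rate)

def monthly_cost(queries_per_day, tokens_per_query, days=30):
    """Estimated monthly API spend for a given usage pattern."""
    return queries_per_day * (tokens_per_query / 1000) * PRICE_PER_1K * days

# 100 queries/day, each carrying ~1,000 tokens of rolling context:
print(f"${monthly_cost(100, 1000):.2f}/month")  # $150.00/month
```

The $150 figure is sensitive to the context-window assumption: halve the tokens carried per query and the estimate halves with it, which is exactly the point the reply below makes.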
I think you're being really wasteful with that kind of context window, which is why it's a bit apples and oranges. I keep stats on this: my average message is around 50 tokens, the average individual response is around 200 tokens, and my average conversation length is 1.2 messages (only counting my messages). 50 convos/day * 30 days is about $21, and I don't come close to that usage. Hell, most of the time I don't even turn on GPT-4 because 3.5-instruct is plenty good.
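Running the same estimate with these lighter stats, but splitting prompt and completion pricing (GPT-4 8K billed prompt and completion tokens at different rates in 2023), lands in the same ballpark as the ~$21 claimed. The usage numbers are this commenter's self-reported stats, and the split rates are an assumption about which tier they were on.

```python
# Light-usage estimate with separate prompt/completion rates
# (2023 GPT-4 8K list prices, assumed; usage stats from the comment above).
PRICE_IN = 0.03 / 1000   # $ per prompt token
PRICE_OUT = 0.06 / 1000  # $ per completion token

def light_usage_cost(convos_per_day=50, msgs_per_convo=1.2,
                     prompt_tokens=50, completion_tokens=200, days=30):
    """Monthly cost with no rolling context carried between messages."""
    per_msg = prompt_tokens * PRICE_IN + completion_tokens * PRICE_OUT
    return convos_per_day * msgs_per_convo * per_msg * days

print(f"${light_usage_cost():.2f}/month")  # roughly $24, vs the flat $20 plan
```

The gap between this and the $150 estimate isn't the price table; it's whether every query re-sends a 1,000-token context window.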
Right but the API is so unbelievably cheap in comparison. I couldn't spend $20 if I tried using it constantly. My bill is a few bucks every month and you don't have to deal with "As a large language model…"
I don't know if it's entirely uncensored, but it doesn't have the moderation API in front of it or the other manual tweaks OpenAI added to ChatGPT, so it will largely just do whatever you ask without paragraphs of disclaimers.