Hacker News

Way cheaper? I thought 1K tokens (input + output) cost about $0.04 with GPT-4 Turbo, which is roughly one larger chat response (two screens). To reach parity with ChatGPT Plus pricing, you would thus need fewer than 500 such responses per month via the API.

For GPT-4, the pricing is more than double that ($0.09 per 1K), so only about 220 larger interactions to reach the $20 cost.

Or am I wrong?
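A quick sketch of the break-even arithmetic above (the per-1K rates are the ones assumed in this comment, not guaranteed current prices):

```python
# Back-of-envelope parity check: how many "larger responses" per month
# equal the $20 ChatGPT Plus subscription? Rates below are the
# approximate combined (input + output) cost per 1K tokens assumed above.
PLUS_MONTHLY_USD = 20.00

cost_per_response_usd = {
    "gpt-4-turbo": 0.04,  # ~$0.01/1K in + ~$0.03/1K out
    "gpt-4":       0.09,  # ~$0.03/1K in + ~$0.06/1K out
}

for model, usd in cost_per_response_usd.items():
    parity = PLUS_MONTHLY_USD / usd
    print(f"{model}: break even at ~{parity:.0f} responses/month")
```

So at these rates, Plus only wins if you exceed roughly 500 large GPT-4 Turbo responses (or ~220 GPT-4 responses) per month.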



It depends on your usage; for me the plus sub is much cheaper than if I use the api directly, but I use it a lot for everything I do.


In my experience, each message with the 1106 preview model costs me about $0.006, which is acceptable. Most importantly, the API has higher availability (no "you have reached your message limit"), and I feel more comfortable using proprietary data with it, as data sent through the API won't be used to train the model.

Now, if the chat gets very long or is heavy on high-token strings (especially code), those costs can balloon up to the 9-12 cent region. I think this is because chatgpt-web loads all the prior messages in the chat into the context window, so if you create a new chat for each question you can lower costs substantially. Most often I don't need much prior context in my chat questions anyway, as I use ChatGPT more like StackOverflow than a conversation buddy.
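To illustrate why long chats balloon in cost: if the client resends the full history on every turn, input tokens grow quadratically with chat length, not linearly. A toy model (the per-turn token count and rate are made up for illustration):

```python
# Toy model: cumulative input-token cost when the entire chat history
# is resent with each new message. Assumes every turn adds
# `turn_tokens` tokens and input costs `rate_per_1k` USD per 1K tokens.
def chat_input_cost(turns, turn_tokens=500, rate_per_1k=0.01):
    total_input_tokens = 0
    history = 0
    for _ in range(turns):
        history += turn_tokens          # new message joins the history
        total_input_tokens += history   # whole history is resent as input
    return total_input_tokens * rate_per_1k / 1000

print(chat_input_cost(5))   # short chat
print(chat_input_cost(50))  # 10x longer chat costs ~85x more, not 10x
```

Starting a fresh chat per question resets `history` to zero each time, which is exactly why it lowers costs so much.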

Also, it's a lot easier to run company subscriptions this way, as we don't have to provision a new card for each person to use the web version. I believe there is an Enterprise version of ChatGPT, but chatgpt-web is functionally equivalent and I'm sure it costs less.



