
How does the cost compare, though? From my understanding, o3 is pretty expensive to run. Is GPT-5 less costly? If the performance is close to o3 but the price is lower, it may still be a good improvement.


I find it strange that GPT-5 is cheaper than GPT-4.1 on input tokens and only slightly more expensive on output tokens. Is that marketing, or does it actually reflect the underlying compute cost?
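For concreteness, here's a back-of-envelope comparison. The per-million-token prices below are assumptions based on launch-time pricing pages, not figures from this thread; verify against the current published rates before relying on them.

```python
# Rough per-request cost comparison. Prices are USD per 1M tokens and are
# ASSUMED launch-time rates -- check the provider's pricing page for current values.
PRICES = {
    "gpt-5":   {"input": 1.25, "output": 10.00},
    "gpt-4.1": {"input": 2.00, "output": 8.00},
}

def request_cost(model, input_tokens, output_tokens):
    """Cost in USD of one request under the assumed per-million-token prices."""
    p = PRICES[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000

# Example: a request with 10k input tokens and 1k output tokens.
for model in PRICES:
    print(model, round(request_cost(model, 10_000, 1_000), 4))
```

With those assumed rates, input-heavy workloads come out cheaper on GPT-5 even though its output tokens cost more, which is consistent with the pricing pattern described above.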


Very likely an actual reflection. That's probably the real achievement here, and the key reason they're publishing it as GPT-5: at or near the best on everything, as a single model, while being substantially cheaper than the competition.


But it can’t do audio in/out or image out. Feels like an architectural step back.


My understanding is that image output is handled pretty separately, and if it doesn't seem that way, it's because they're abstracting several models behind one name.


Maybe with the router mechanism (routing to mini or the standard model) they estimate the average cost for ChatGPT will be a lot lower, because the most capable model won't be answering easy questions, and they pass those savings on to devs?


I think the router applies to the ChatGPT app. The developer APIs expose manual control: you select the specific model and the level of reasoning yourself.
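As a sketch of what that manual control looks like: the request body below follows my understanding of the OpenAI Responses API, but the exact field names (`reasoning`, `effort`) and the model name are assumptions; check the API reference before using them.

```python
# Hypothetical request payload: the developer picks the model and reasoning
# level explicitly instead of relying on ChatGPT's automatic router.
# Field names are assumptions modeled on the OpenAI Responses API.
def build_request(model, effort, prompt):
    """Build a request body with an explicit model and reasoning effort."""
    return {
        "model": model,                   # e.g. a smaller model for easy queries
        "reasoning": {"effort": effort},  # e.g. "low", "medium", "high"
        "input": prompt,
    }

req = build_request("gpt-5-mini", "low", "Summarize this changelog in one sentence.")
print(req["model"], req["reasoning"]["effort"])
```

The point is just that in the API the routing decision is yours, so the cost trade-off discussed above is under the developer's control rather than the router's.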



