
I don't think it makes sense to say ChatGPT is "hallucinating" when it returns wrong facts. Hallucination implies a subject who can otherwise distinguish reality from the hallucinated and fails to in a particular instance. ChatGPT cannot distinguish fact from fiction in the first place.

