There's nothing weird about so-called hallucination ("confabulation" would be a better term); it's the expected behavior. If your use case can't tolerate it, it's not a good use case for these models.

And yes, if you concluded from this that these models are commonly misapplied, you'd be correct. This will continue until the bubble bursts.
