
Apparently you can avoid hallucination by, in effect, reading the model's mind (probing its internal activations) instead of asking it questions.

https://arxiv.org/abs/2212.03827
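
For reference, the linked paper is Burns et al., "Discovering Latent Knowledge in Language Models Without Supervision." The "mind reading" there is Contrast-Consistent Search (CCS): take a frozen model's hidden activations for contrast pairs (the same statement phrased as true and as false) and train an unsupervised probe whose loss makes the two predicted probabilities sum to one while penalizing the uninformative 0.5/0.5 solution. A minimal sketch of that loss in PyTorch, with random tensors standing in for activations actually extracted from a language model:

    import torch
    import torch.nn as nn

    class CCSProbe(nn.Module):
        """Linear probe mapping a hidden state to a pseudo-probability."""
        def __init__(self, hidden_dim: int):
            super().__init__()
            self.linear = nn.Linear(hidden_dim, 1)

        def forward(self, h):
            return torch.sigmoid(self.linear(h))

    def ccs_loss(p_pos, p_neg):
        # Consistency: p(x+) should equal 1 - p(x-) for a contrast pair.
        consistency = (p_pos - (1 - p_neg)) ** 2
        # Confidence: discourage the degenerate p(x+) = p(x-) = 0.5 solution.
        confidence = torch.minimum(p_pos, p_neg) ** 2
        return (consistency + confidence).mean()

    hidden_dim = 768
    probe = CCSProbe(hidden_dim)
    opt = torch.optim.Adam(probe.parameters(), lr=1e-3)

    # Stand-ins for activations of "X? Yes" / "X? No" prompts from a frozen LM.
    # (The paper also normalizes activations per phrasing; omitted here.)
    h_pos = torch.randn(256, hidden_dim)
    h_neg = torch.randn(256, hidden_dim)

    for _ in range(100):
        opt.zero_grad()
        loss = ccs_loss(probe(h_pos), probe(h_neg))
        loss.backward()
        opt.step()

The probe is trained with no labels at all; the paper's claim is that the direction it finds tracks the model's internal "belief" better than the model's own generated answers.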

Raises some ethical issues…


