Doesn't seem like much of a hallucination, then. Maybe tampering with its system context would better fit the claim?


LLM "hallucination" is a pretty bullshit term to begin with
