xkcd1963 on May 27, 2023 | on: Lawyer cites fake cases invented by ChatGPT, judge...
I don't think it makes sense to say ChatGPT is hallucinating when it returns wrong facts. Hallucination implies that the subject can otherwise distinguish reality from what it hallucinated; ChatGPT cannot distinguish fact from fiction at all.