
I’ve said this many times: stop using ChatGPT as a database. It does not contain facts.

It may appear to contain some facts. Some may also be actually true.

The truly useful use case is as a reasoning engine. You can paste in a document and ask questions about the facts in that document. Then it does a much better job, enough to be actually useful.
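A minimal sketch of that pattern, sometimes called "context stuffing": put the facts in the prompt so the model reasons over supplied text rather than recalling from its weights. The function name and prompt wording here are illustrative, not from any particular library:

```python
def build_prompt(document: str, question: str) -> str:
    # Ground the model in the supplied document; instruct it not to
    # fall back on whatever it "remembers" from training.
    return (
        "Answer the question using only the document below. "
        "If the document does not contain the answer, say so.\n\n"
        f"Document:\n{document}\n\n"
        f"Question: {question}\n"
        "Answer:"
    )

prompt = build_prompt("The capital of Freedonia is Fredville.",
                      "What is the capital of Freedonia?")
print(prompt)
```

The resulting string is what you would send to the completion endpoint; the model's job shifts from recall to reading comprehension.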



To some extent it does contain facts, but those facts are just a small proportion of the dataset compared to everything else, and they are indistinguishable from non-factual information.

E.g. using text-davinci-003 (this is GPT-3, not ChatGPT), "The moon is made of" completes to: "cheese" 48.74%, "rock" 31.66%, "green" 4.09% (which is 98.75% of the time followed by "cheese"), "rocks" 3.86%, and several other lower-probability tokens.
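The completion API reports these as log probabilities; the percentages above come from exponentiating them. A small sketch of that conversion, using made-up logprob values chosen to roughly match the percentages quoted (not real API output):

```python
import math

# Hypothetical top-token logprobs for the prompt "The moon is made of".
# Real values would come from the API's `logprobs` field.
logprobs = {
    " cheese": -0.7188,
    " rock": -1.1501,
    " green": -3.1966,
    " rocks": -3.2545,
}

# A logprob is ln(p), so p = exp(logprob).
probs = {tok: math.exp(lp) for tok, lp in logprobs.items()}
for tok, p in sorted(probs.items(), key=lambda kv: -kv[1]):
    print(f"{tok!r}: {p:.2%}")
```

The point the percentages make survives the conversion either way: the model samples from a distribution over plausible continuations, with no internal flag marking which one is true.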

I wonder if there eventually will be a type of model that incorporates the ability to simultaneously do text completion while adhering to facts at the model level (rather than having to bolt it on top via context).



