
Right: the one thing an LLM will never be able to do is stake its credibility on the quality or accuracy of its output.

I want another human to say "to the best of my knowledge this information is worth my time". Then if they waste my time I can pay them less attention in the future.



This is a highly underrated point. It's the same reason AI might replace paralegals but won't replace lawyers.


They do this all the time. There's a reason we have the term "AI slop". LLM output definitely has a reputation.



