
In order to tell an LLM to "do better", someone (a human) needs to know that it can be done better, and also needs to be able to decide what "better" means.

