danielmarkbruce on Jan 13, 2025 | on: AI founders will learn the bitter lesson
It's hilarious that people don't see this. The UX of an "LLM product" is the quality of the text in, text out. An "aligned model" is one with good UX. Instruct tuning is UX. RLHF is UX.