At least part of it is that the capex for LLM training is so high. It used to be that compute was extremely cheap compared to staff, but that's no longer the case for large-model training.
