
> But at the same time technofeudalism, dystopia, etc.

You could run LLMs locally to mitigate this. Granted, running large models like GLM-4.6 isn't feasible for most people, but smaller models run even on MacBooks and sometimes punch well above their weight.


