Hacker News

Where is my model that I can run locally and offline?

That's when the LLM stuff is going to take off for me.



Haven't you checked out Ollama yet?
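For context, Ollama is exactly that: a CLI that downloads model weights once and then runs inference entirely on the local machine, no network required afterwards. A minimal sketch of its usage (the model name `llama3` is just an example; any model from the Ollama library works):

```shell
# One-time download of the model weights (requires network).
ollama pull llama3

# Runs fully on the local machine; works offline after the pull.
ollama run llama3 "Why is the sky blue?"
```

Ollama also exposes a local HTTP API on port 11434, so other programs on the machine can query the model without any cloud dependency.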



