
Maybe we could train a simpler model to come up with the correct if/else statements for the prompt. Like a tugboat.


Hobbyists (people who run LLMs locally for roleplay) have already figured out how to "soft-prompt".

This is when you use gradient descent to optimize an embedding vector to serve as your system prompt, instead of guessing and writing it out by hand like a caveman.
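The idea can be sketched in a few lines of PyTorch. This is a toy stand-in, not any particular hobbyist tool: the "model" here is just a frozen embedding plus a linear head, and all the names (`soft_prompt`, `n_soft`, the toy token ids and target) are made up for illustration. The one real detail it shows is the core trick: freeze every model weight, prepend a small matrix of learnable embedding vectors to the input, and optimize only those vectors.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
vocab, dim, n_soft = 100, 16, 4  # toy sizes; a real LM would be far larger

# Frozen toy "model": an embedding table and an output head.
embed = nn.Embedding(vocab, dim)
head = nn.Linear(dim, vocab)
for p in [*embed.parameters(), *head.parameters()]:
    p.requires_grad_(False)  # the model itself is never updated

# The soft prompt: n_soft learnable vectors in embedding space.
# These are the ONLY trainable parameters.
soft_prompt = nn.Parameter(torch.randn(n_soft, dim) * 0.02)
opt = torch.optim.Adam([soft_prompt], lr=0.1)

tokens = torch.tensor([[5, 7, 9]])  # toy user input (token ids)
target = torch.tensor([42])         # toy desired output token

def loss_fn():
    # Prepend the soft prompt to the input embeddings, exactly where
    # a hand-written system prompt's embeddings would sit.
    x = torch.cat([soft_prompt.unsqueeze(0), embed(tokens)], dim=1)
    logits = head(x.mean(dim=1))  # crude pooling, stands in for the transformer
    return nn.functional.cross_entropy(logits, target)

initial = loss_fn().item()
for _ in range(200):
    opt.zero_grad()
    loss = loss_fn()
    loss.backward()   # gradients flow only into soft_prompt
    opt.step()
final = loss_fn().item()
print(f"loss: {initial:.3f} -> {final:.3f}")
```

The resulting `soft_prompt` vectors usually don't correspond to any real tokens, which is the point: the optimizer can find "prompts" in continuous embedding space that no hand-written text could express.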

I don't know why the big cloud LLM providers don't do this.



