
You are working against LLM attention. An LLM looks at a conversation and concentrates its attention on a few points, usually the start and the end. Your previous work falls into the out-of-attention space and gets nuked.

If you're asking how to keep everything in attention: we currently can't.
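The practical workaround is to put what matters where the model actually attends. A minimal sketch of that idea (purely illustrative, not any model's real API): pin a running summary at the start of the prompt, keep the most recent turns at the end, and drop the middle when over budget.

    # Hypothetical mitigation: middle tokens tend to get the least
    # attention ("lost in the middle"), so keep the important bits
    # at the start and end. All names here are illustrative.

    def build_prompt(summary: str, history: list[str],
                     budget_chars: int = 8000) -> str:
        """Pin a running summary up front, keep the newest turns at
        the end, and elide the middle when over budget."""
        head = f"Summary of work so far:\n{summary}\n"
        tail: list[str] = []
        used = len(head)
        # Walk history newest-first so the freshest turns survive.
        for turn in reversed(history):
            if used + len(turn) > budget_chars:
                tail.append("[...older turns elided...]")
                break
            tail.append(turn)
            used += len(turn)
        return head + "\n".join(reversed(tail))

    if __name__ == "__main__":
        history = [f"turn {i}: ..." for i in range(1000)]
        print(build_prompt("Refactoring the parser; tests 3 and 7 "
                           "still fail.", history)[:200])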



Damn...

So you're saying I need some adderral.ai



