Hacker News

ChatGPT (and all of its competitors) are trivially sticky products: I have a lot of ongoing conversations in there that I pick up all the time. Add more long-term memory features (a direction I am sure they will keep pushing) and all of a sudden there is a lot of personal data that you rely on it having, data that makes the product better and that most people will never bother to replicate or transfer. Just being the product that people use makes you the product that people will use. "The other app doesn't know me" is the moat. The data that people put into it is the moat.


This. I am not sure why or how this gets missed, but because you cannot easily port context (maybe not yet), the stickiness increases with every conversation, assuming your questions are not encyclopedia-type questions that need no follow-up.


Do you actually curate your contexts? I did in the early days, but I just create new ones now.


Hmm. You got me thinking. I rarely delete conversations, but I don't randomly engage either, unless I am curious how the LLM will respond in a given scenario. For example, last time I was comparing how my output stacked up against some of the rest of the online community. Maybe "curate" is too strong a word? Maybe I select for specific desired paths?


Technically you can export any single conversation and try to continue it in another LLM…

But migrating all of this personal knowledge and context en masse is not convenient.

And I'm sure OpenAI won't make it easy to escape the little labyrinth they're building for us.
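For what it's worth, the per-conversation route is scriptable. Below is a minimal sketch of flattening an exported conversation into the role/content message list that most chat-completion APIs accept. The `{"messages": [{"author", "text"}]}` shape here is an assumption for illustration; a real export (e.g. ChatGPT's `conversations.json`) nests turns inside a node "mapping" and needs more unpacking first.

```python
import json

def to_chat_messages(exported):
    """Flatten a simplified conversation export into role/content pairs.

    Assumed input shape (hypothetical, for illustration):
        {"messages": [{"author": "user"|"assistant"|"system", "text": "..."}]}
    """
    role_map = {"user": "user", "assistant": "assistant", "system": "system"}
    out = []
    for m in exported.get("messages", []):
        role = role_map.get(m.get("author"), "user")  # default unknown authors to user
        text = (m.get("text") or "").strip()
        if text:  # drop empty turns
            out.append({"role": role, "content": text})
    return out

# Example: a tiny exported conversation as JSON text.
exported = json.loads("""
{"messages": [
  {"author": "user", "text": "Summarize our plan."},
  {"author": "assistant", "text": "1. Export. 2. Re-import."}
]}
""")

messages = to_chat_messages(exported)
# `messages` can now be sent as prior context to another provider's
# chat endpoint -- which is exactly the "it doesn't know me" gap:
# the transcript moves, but memories, files, and custom instructions don't.
```

This is the easy half; the hard half is the accumulated long-term memory the parent comments describe, which has no export at all.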




