A replica of what? Of your words? Your thoughts? How would it access those thoughts? What about your feelings? What about the things that actually go on in your physical body outside the brain?
Encoding the human experience will probably get much better over time. There's already experimentation with audio and video interfaces, and I've seen products that combine Fitbit-like sensors with an LLM.
I don't know what combining all that data with an LLM will produce, but I expect recording our experience for use by an AI will only get better.
If it hears what you hear, hears what you say, reads what you write, sees what you see, and notes what you do, it will end up knowing more about you than you know yourself. And the way you can interact with it would be the same way others could.
Also, sensors like heart-rate monitors or activity trackers could give it insight into your emotional state.
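To make that concrete, here's a toy sketch of how wearable readings might be summarized into extra context for an LLM prompt. The function name, inputs, and thresholds are all made up for illustration; real emotional-state inference would need far more than a crude heart-rate heuristic:

```python
from statistics import mean

def sensor_context(heart_rates, step_count):
    """Summarize hypothetical wearable readings into a text snippet
    that could be prepended to an LLM prompt as extra context."""
    avg_hr = mean(heart_rates)
    # Crude illustrative thresholds -- not medically meaningful.
    if avg_hr > 100 and step_count < 500:
        state = "possibly stressed (elevated heart rate while sedentary)"
    elif avg_hr > 100:
        state = "physically active"
    else:
        state = "calm"
    return (f"Sensor context: avg heart rate {avg_hr:.0f} bpm, "
            f"{step_count} steps today; user seems {state}.")

print(sensor_context([104, 110, 98], 200))
# -> Sensor context: avg heart rate 104 bpm, 200 steps today;
#    user seems possibly stressed (elevated heart rate while sedentary).
```

The interesting part isn't the heuristic itself but the pattern: raw sensor streams get distilled into natural language the model can condition on, the same way it conditions on your words.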