It is very concerning how people anthropomorphize ChatGPT. It will get a lot of people into trouble, and the media is largely to blame. Never mind the gushing stories about AI; even the few stories criticizing AI treat it as a human. ChatGPT is racist, they say, or sexist, or a liar.
Well, it is none of these things, because all of the above require consciousness and intent, and it has neither. It is not human, it is not any kind of conscious being, and it should not be treated as one.
It sticks sentences together based on existing language scanned in from the internet and millions of other sources. What it says depends on what someone else said some time ago on some random forum, or in some book, or in some other source stored in an available database. It is also programmed to sound extremely sure of itself, unless you flat out tell it that it is incorrect, in which case it will immediately admit fault and apologize. Asking it whether it is sure is therefore pointless.
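The "stitching sentences together from seen text" idea is easier to grasp in a toy version. To be clear, this is a crude bigram sketch I am adding purely for illustration; a real LLM is a neural network predicting the next token, not a lookup table like this. But the basic flavor of generating text from statistics over what was seen before looks like this:

```python
from collections import defaultdict, Counter
import random

# Tiny corpus standing in for "existing language scanned in from the internet".
corpus = (
    "the cat sat on the mat . "
    "the dog sat on the rug . "
    "the cat chased the dog ."
).split()

# Count which word follows which word (a bigram model).
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def generate(start, length=6, seed=0):
    """Stitch together text by repeatedly picking a statistically likely next word."""
    random.seed(seed)
    words = [start]
    for _ in range(length):
        counts = following[words[-1]]
        if not counts:
            break  # dead end: this word never had a successor in the corpus
        # Choose the next word in proportion to how often it followed this one.
        choices, weights = zip(*counts.items())
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("the"))
```

The output is always fluent-looking locally, because every two-word pair really did occur somewhere in the corpus, yet the model has no idea what any of it means and can never say "I don't know".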
Let me tell you a less disastrous and quite a bit funnier story. A friend of mine used ChatGPT for coding. He became so trusting of ChatGPT's coding prowess that he asked whether it could just send him the code as files, so he would not have to worry about copying and pasting, which apparently mangled the formatting somehow. ChatGPT helpfully told him that it could upload the code to GitHub, and he could download the files from there. My friend said that was perfect.
So ChatGPT gave him a GitHub account name and said the files it had created for him could be found there. My friend looked, but GitHub said that account had been closed long ago. He tried variations of the account name with different capitalizations and so on, but found nothing.
He went back and complained to ChatGPT. ChatGPT dutifully apologized and sent him another account name. He again spent time looking for the account and trying variations. Again, the account had been closed.
This happened a couple more times, and in the end my friend gave up and complained to me: "Why is ChatGPT doing this to me? Is it mocking me? Is it getting its kicks from sending me on random wild goose chases?"
I had to explain to him that no, ChatGPT is not human, and it is not mocking him. What probably happened is that someone on some forum once asked someone else to provide code as files. The responder offered to put the files on GitHub and gave an account name. When my friend asked a similar question, ChatGPT matched it up and produced a similar answer. When my friend said a particular account did not work, ChatGPT drew on other mentions of GitHub account names in its training data and offered some of those.
So whenever you use ChatGPT, remember that it is mostly a glorified search engine: it will spit out information it has found somewhere that it calculates as matching your question. Do not attribute intent, feelings, or any kind of consciousness to it.
I think it's worth noting here that, without plugins active, ChatGPT doesn't 'find' anything at all: everything it 'knows' is baked in as a single giant blob of vector data produced during training. That's why it has a specific cutoff date for what it 'knows'.
It's because technological progress has outpaced our ability to process it, so we're like medieval peasants discovering a "Hello World" program and assuming the computer is literally greeting the world. Hopefully, people will learn to view LLMs as they really are before doing what the lawyer did en masse.