> Animal brains such as our own have evolved to compress information about our world to aid in survival.
A key question: what are the "selection pressures" that drive the "evolution" of LLMs? In robotics, there's a "survival of task completion," which usually has some physical goal, like assembling a part correctly or scoring a goal on a soccer field. For LLMs, one selection pressure is the dual imperative of always answering with something AND keeping the conversation going (engagement). You can imagine how those two pressures yield outcomes that don't represent the world in a "real" sense.