You can apply the same reductionism to claim that the mesh of charged gel tubes in our brains is just spasming muscles when a human types words on a computer.
Whether LLMs are good or not, liars or not, hardly depends on their being implemented as opaque black-box algorithms, because you could say the same of our brains.
The point is that the statement "LLMs should just cite their sources, what's the problem" is nonsensical, and the reason it's nonsense has to do with how LLMs actually work.
Citing sources is not magic that makes what you say true; it just makes a statement more easily falsifiable.
LLMs can cite sources about as well as any human, that is, with a non-trivial error rate.
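To make "falsifiable" concrete, here's a rough Python sketch of the cheapest possible citation check (the claims and URLs are made up for illustration): just verify that each cited URL resolves at all. That catches the most common failure mode, fabricated references, though it says nothing about whether the source actually supports the claim.

```python
# Minimal sketch: a citation doesn't make a claim true, but it gives you
# something to check. Claims and URLs below are invented examples.
import urllib.request

cited_claims = [
    ("Transformers were introduced in 2017.",
     "https://arxiv.org/abs/1706.03762"),
    ("Example of a fabricated reference.",
     "https://example.org/paper-that-does-not-exist"),
]

def url_resolves(url: str, timeout: float = 10.0) -> bool:
    """Return True if a HEAD request to the URL succeeds (2xx/3xx)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except OSError:  # URLError, HTTPError, and timeouts are all OSErrors
        return False

for claim, url in cited_claims:
    status = "checkable" if url_resolves(url) else "SUSPECT (dead link)"
    print(f"{status}: {claim} -> {url}")
```

A dead-link check like this is exactly as useful against a sloppy human author as against an LLM, which is the point: the citation's value is that it can be checked, not who produced it.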
LLMs are shit for a lot of things, but the problems are with the quality of the output; whether they work by magic, soul-bending, matrix multiplication, or whatever is irrelevant.