Right: the one thing an LLM will never be able to do is stake its credibility on the quality or accuracy of its output.
I want another human to say "to the best of my knowledge this information is worth my time". Then if they waste my time I can pay them less attention in the future.