Current AI is like a first grader pretending to write newspaper copy.
It has all the inherent problems of someone writing an informative piece without actually knowing the subject.
AI may become a great search engine.
AI will become a great personal secretary.
But it really doesn't help when all the "experts" are wrong, and so the AI regurgitates the wrongness.
That's just the problem: A.I. doesn't know. It regurgitates whatever data was used to train it. The old "garbage in, garbage out" problem is combined with an unthinking algorithm generating a facsimile of information. It hallucinates because it knows what form the requested response should probably take, but not what content that response needs to have, and it has no deep analytical tools or verification processes to fall back on. That's why lawyers relying on A.I. assistants keep submitting briefs that cite nonexistent cases, and why journalists infamously published a "summer reading" list including books that were never written.
Damn! This looks like a problem for AI!
If "summer reading" contains books that don't exist, create those books, publish on Amazonium!