MouthBQ98 said:
I'm going to guess it didn't outright fabricate, but rather found someone else's lie that suited its purpose and ran with it. I'd also guess most AIs don't attempt to sanity-check, cross-reference, or validate source material for plausibility. That would be a very advanced reasoning task.
It's called a "hallucination" in the parlance of the AI people...
The interesting thing is that it was "partly right".
The document numbers on both were ALMOST correct. The last 2 digits represent the year the latest revision came out.
No revisions came out in those years.
And the subject of the documents was "somewhat" related...
My point was to make sure you double-check EVERYTHING. Because if it can produce THAT hallucination, it can produce a worse one...