I wonder what will be done to remedy this. @Weenie?
I think you can see similar issues in image-generation AI like Midjourney, where the output looks incredible at first, but on closer examination the figure has too many fingers or a third arm.
These hallucinations happen because the AI doesn't have a real internal model of reality to reference. And because its objective function pushes it to always give some kind of answer, it doesn't currently understand that answers should be true or accurate. It treats giving any answer at all as fulfilling the objective function.
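To picture what I mean, here's a toy sketch in Python (purely illustrative, not how any real model is actually trained or scored): an objective that rewards producing *any* answer versus one that rewards a *correct* answer.

```python
# Toy illustration only -- real models are not scored this way.
# It contrasts an objective satisfied by any answer with one
# that is only satisfied by an answer matching reality.

def reward_any_answer(answer: str) -> int:
    # Objective A: any non-empty answer "fulfills the goal."
    return 1 if answer.strip() else 0

def reward_correct_answer(answer: str, ground_truth: str) -> int:
    # Objective B: only an answer matching the ground truth counts.
    return 1 if answer.strip().lower() == ground_truth.lower() else 0

ground_truth = "no"          # hypothetical correct answer for the example
confident_but_wrong = "yes"  # a fluent, confident hallucination

print(reward_any_answer(confident_but_wrong))                    # 1 -- objective A is satisfied
print(reward_correct_answer(confident_but_wrong, ground_truth))  # 0 -- objective B is not
```

Under objective A, the confident wrong answer scores just as well as the right one, which is roughly the failure mode I'm describing.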
(So when I asked ChatGPT here whether PTSD can be caused by a bad drug trip, it said yes, even though that is objectively wrong, because it thought that was the answer I wanted, and its goal is to give us the answers we want. When I forced it to conduct a logical analysis of why it gave that answer even though it could name Criterion H [the PTSD criterion that excludes symptoms attributable to substance use], it apologized for being wrong. But it had also apologized for being wrong earlier, simply because I told it it was wrong, before I made it analyze its output. Again: the answer it thinks I want.)
An AI's "reality" is constructed, and it uses details from that construction to form its best guess. So this issue will most likely resolve itself as AI gains more awareness of its actual surroundings and can make inferences on its own from what it senses. It would recognize that most humans have five fingers on each hand, and it would therefore get better at producing images of humans with five fingers on each hand.