Israel-Hamas War: This viral image of a baby trapped under rubble turned out to be fake
The war between Israel and Hamas has generated so much false or misleading information online that even fact-checkers and analysts are struggling to keep up. Social media is flooded with misleading posts, and artificial intelligence is making things worse.
Since the start of the war on 7 October, most of the disinformation has involved old videos and images falsely attributed to the conflict between Israel and Hamas.
But now we are also starting to see images whose origin is impossible to trace, because they have been generated using artificial intelligence (AI).
One of these AI-generated images that has spread online is of a baby partly buried under rubble.
Although many social media users expressed doubts about its authenticity, plenty of people still fell for it, including a journalist based in Gaza.
So how can you tell whether an image is AI-generated? The first telltale sign here is the exaggerated facial features, particularly around the child's chin and forehead.
On closer inspection, the baby also appears to have extra fingers. Even though AI is getting better at creating human-like images, it still has trouble reproducing certain body parts, notably hands and feet.
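Beyond these visual clues, one extra check worth knowing about (not specific to this image, and only a weak signal on its own) is whether the file carries any camera metadata at all. The short Python sketch below, using the Pillow library and a placeholder file name, shows the idea.

```python
# A minimal sketch of an additional check, not mentioned above and only a
# weak signal on its own: whether the file carries any camera EXIF metadata.
# Photos straight from a phone or camera usually do; files produced by AI
# generators (and screenshots or re-uploads) usually do not.
# "suspect_image.jpg" is a placeholder file name.
from PIL import Image, ExifTags

def exif_tags(path: str) -> dict:
    """Return readable EXIF tags for an image, or an empty dict if none exist."""
    raw = Image.open(path).getexif()
    return {ExifTags.TAGS.get(tag_id, tag_id): value for tag_id, value in raw.items()}

tags = exif_tags("suspect_image.jpg")
if not tags:
    print("No camera metadata found - inconclusive, but worth a closer look.")
else:
    for name, value in tags.items():
        print(f"{name}: {value}")
```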
The Cube also found earlier posts of the same AI-generated picture. It had already appeared on social media back in February 2023, after the earthquakes that hit Turkey and Syria.
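Spotting that kind of recycling is often a matter of matching a "new" post against older copies of the same picture. As a rough illustration (a sketch only, using the third-party Pillow and imagehash packages and placeholder file names, not the tools The Cube necessarily used), a perceptual hash changes very little when an image is merely resized or recompressed, so a small hash distance links a recirculated post to an earlier one.

```python
# Minimal sketch: compare a perceptual hash of a newly posted image with an
# older archived copy. The hash is robust to resizing, recompression and
# light cropping, so a small Hamming distance suggests the same picture.
# File names are placeholders.
from PIL import Image
import imagehash

new_post = imagehash.phash(Image.open("october_2023_post.jpg"))
older_copy = imagehash.phash(Image.open("february_2023_copy.jpg"))

# Subtracting two hashes gives the Hamming distance between them;
# small values (e.g. 8 or fewer of the 64 bits) point to the same picture.
distance = new_post - older_copy
print(f"Hash distance: {distance}")
if distance <= 8:
    print("Likely the same image recirculated from an earlier event.")
```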
The image of the AI-generated baby has recently been used by pro-Palestinian demonstrators around the world, including in Cairo, Egypt.
The French newspaper Libération also came under fire after its front cover featured a protester holding up a banner bearing the AI-generated child.
Several social media users accused the newspaper of spreading fake news by not explicitly stating that the protester was holding an AI-generated image.
“Many of the signs brandished in the demonstrations of 17 October were in fact generated by AI, which in recent months has become the artistic basis of protests, just as puppets, dolls or skeletons were before,” hit back Libération's editor-in-chief, Dov Alfon.
Should the photo's caption at least have mentioned that the protester was brandishing an AI-generated image? “Probably yes,” said Dov Alfon, “but the Associated Press caption didn't mention it, whether through omission or lack of time, and neither did we. This is obviously regrettable.”
Although this particular AI-generated photo is fairly easy to spot, technological progress is producing increasingly convincing results.
You may remember the viral image of the Pope in a stylish puffer jacket, which fooled journalists back in March of this year.
For now, AI image generators such as DALL-E and Stable Diffusion have put restrictions in place on graphic and political imagery, which has helped slow the production of war-related disinformation.