Whether they show destroyed houses, injured people or even the dead, images of the war in the Middle East are flooding the internet. But they do not always depict reality: artificial intelligence (AI) can now generate images that are barely distinguishable from real photos. And some artists apparently profit from this.
As the “NZZ” reports, an image of a huge cloud of smoke above a city closely resembling Gaza circulated on the internet. Several smaller online portals used the image.
But neither the city nor the cloud is real; an AI created the image. None of the portals pointed this out, even though the image was most likely obtained from the Adobe Stock image database, where it is clearly marked as AI-generated.
No labels for AI images
However, images on Adobe Stock are not always correctly labeled. As the “NZZ” discovered, numerous images by the Iranian artist Meysam Azarneshin are for sale on the site. One shows two children running through rubble, and nothing indicates that it was created by AI. Enter “Gaza” into the search and the image appears among the first results, even though Gaza is not mentioned anywhere in its description: “Two homeless girls walk in a destroyed city, soldiers, helicopters and tanks are still attacking the city.”
Another image shows a soldier carrying a child in his arms, with a cloud of smoke in the background. It is for sale at the photo agency Alamy, listed under the keywords “Palestine Orphan”. According to the agency, as cited by the “NZZ”, the image is a “composite”: not a pure AI creation, but a composition of several images.
Agencies deny responsibility
Alamy told the newspaper that AI images are removed from the site, but that Azarneshin’s image is not a pure AI work and is therefore tolerated. Alamy did not comment on the problems such false images could cause in connection with the war. Among other things, the agency sells press photos, which means media companies could end up spreading fake news.
Adobe Stock was also approached by the newspaper. The image database said that all AI images on its site must be labeled. Why Adobe is not enforcing this policy in the artist’s case is unclear; the company has not yet addressed the question. One thing is certain, however: the images mentioned above are not the only ones that look AI-generated but lack the corresponding label.
The artist himself did not respond to the newspaper’s request. The business with fake images apparently pays off for him: creating images with AI is less time-consuming than shooting and editing them yourself. At the same time, AI-generated images on Adobe Stock reportedly earn about four times as much as real images on average, according to the paper.
The undeclared distribution of such images on these sites causes a variety of problems. The images help obscure reality and can contribute to the spread of fake news. Moreover, they are generated from real images used to train the AI, which means photographers who risk their lives to document the war are being exploited. (Mrs)
Source: Blick

I am Amelia James, a passionate journalist with a deep-rooted interest in current affairs. I have more than five years of experience in the media industry, working both as an author and editor for 24 Instant News. My main focus lies in international news, particularly regional conflicts and political issues around the world.