Right now, if you type “Israel Kamakawiwoole” into Google search, you won’t see any of the singer’s famous album covers or a photo of him performing on his iconic ukulele. The first thing you see is an image of a smiling man sitting on a beach, but it is not an actual photograph of the singer. It is a fake image created by artificial intelligence. In fact, when you click on the image, you are taken to the Midjourney subreddit where the series of images was originally posted.
This was first pointed out by Ethan Mollick, a Wharton professor who studies artificial intelligence, in a post on X (formerly known as Twitter).
Taking a closer look at the photograph, it is not difficult to spot the telltale signs of AI. The fake depth-of-field effect is applied unevenly, the texture of his shirt is confusing, and, of course, he is missing a finger on his left hand. None of this is surprising: as good as AI-generated images have gotten over the past year, they are still fairly easy to spot if you look closely.
The real problem, however, is that these images appear as the first result for a well-known and beloved public figure, with no watermark or any indication that they were created by artificial intelligence. Google has never guaranteed the authenticity of image search results, but there is something deeply worrying about this.
There are several possible explanations for why this happened in this particular case. The Hawaiian singer, popularly known as Iz, passed away in 1997, and Google always wants to show users the most recent content. Since there have not been many new articles or discussions about Iz since then, it is not hard to see why the algorithm picked up this Reddit post. And while this may not seem especially consequential in Iz’s case, it is easy to imagine examples that would be far more problematic.
Even if we don’t see this happening on a large scale in search results, this is a great example of why Google needs rules about it. At the very least, AI-generated images need to be clearly labeled somehow before things get out of hand, or users should be given the ability to automatically filter out AI images. However, given Google’s own interest in AI-generated content, there is reason to believe that the company may instead want to find ways to surface AI-generated content in its results without clearly labeling it.
Source: Digital Trends

I am Garth Carter and I work at Gadget Onus. I specialize in writing for the Hot News section, focusing on topics that are trending and highly relevant to readers. My passion is presenting news stories accurately and in an engaging manner that captures the attention of my audience.