Although artificial intelligence (AI) interacts with you without requiring any manual intervention, it is the product of human work. The technology now becoming popular in generative models is the result of training on giant databases: it is all about trying to guess the best answer for a given situation.

To recognize text, images, landscapes and other elements, AIs need to be fed a wealth of data from a wide variety of sources, most of it created by humans. An AI will only understand what a football is after receiving an enormous number of images of footballs from a wide variety of angles.

This information inevitably has some impact on how the model behaves, and it sometimes produces racial prejudice, something known as algorithmic racism.

Algorithmic racism is when a piece of software or a service reproduces widespread social discrimination. (Image: GettyImages)

What is algorithmic racism?

Algorithmic racism occurs when software reproduces some form of discrimination against users, and it can happen in different ways.

A popular example is photography: image-processing software is constantly improving to better cope with differences in skin tone. Good representation of black skin has long been a rare feature even in high-end mobile phones, and it was one of the prominent improvements in Pixel phones, for example.

Google, the world’s most used search engine, has reproduced algorithmic racism in different contexts over the years. (Image: GettyImages)

But there are more serious cases: during the rollout of Google Photos in 2015, the platform’s search engine presented photos of black people in response to the search for “Gorillas”.

Another search engine controversy involved the queries “beautiful braids” and “ugly braids”: when the case made news in 2019, the first search returned mostly images of white women, while the second consisted mostly of black women.

The reproduction of racial and gender stereotypes occurs more subtly in generative AI: if you ask for visuals of a manager, the result will usually be a photo of a white person. To get illustrations of black people, it is sometimes necessary to spell this out in the prompt.

Why might AIs exhibit algorithmic racism?

Artificial intelligence is the result of gigantic, but not universal, databases: websites, books, social networks, messages, audio, photos and much more. These sources used for training, which are limited by nature, may carry the racial or gender biases present in societies around the world.

Just as a chatbot can suggest using glue to make the cheese stick better to a pizza, it can also reproduce racist stereotypes because of the influence of that database.
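
As a rough illustration of that mechanism, here is a toy sketch in Python (with a deliberately skewed, made-up dataset; no real model works this crudely) of a “model” that simply repeats the most frequent association it saw during training:

```python
from collections import Counter, defaultdict

# Hypothetical, deliberately skewed "training data": 90% of the examples
# associate the word "manager" with a photo of a white person.
training_data = (
    [("manager", "photo of a white person")] * 90
    + [("manager", "photo of a black person")] * 10
)

# "Training": count how often each label appears for each word.
counts = defaultdict(Counter)
for word, label in training_data:
    counts[word][label] += 1

def predict(word):
    # The toy model just repeats the most frequent association it has seen.
    return counts[word].most_common(1)[0][0]

print(predict("manager"))  # -> "photo of a white person", purely because of the skew
```

Real generative models are far more sophisticated, but the underlying dynamic is similar: whichever pattern dominates the training data tends to dominate the output.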

Scandalous cases like Microsoft’s Tay bot, which started spewing fascist phrases in less than 24 hours, don’t happen very often. However, existing models are not free from biases and, especially, stereotypes.

Chatbots, powered by massive databases and social networks, can reproduce racial prejudices in some conversations. (Image: GettyImages)

Today, when it would cause too much controversy, the reproduction of offensive patterns is held back by artificial barriers: if a request is clearly racially motivated, the AI may refuse to respond to the prompt.
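
A minimal sketch of how such a barrier might look, assuming a simple keyword blocklist (real deployments rely on trained safety classifiers rather than fixed word lists, and the terms and canned refusal below are purely hypothetical):

```python
# Toy pre-filter: refuse a prompt before it ever reaches the model.
# The blocklist and the refusal message are placeholders, not any real system.
BLOCKED_TERMS = {"blocked_term_1", "blocked_term_2"}

def answer(prompt: str) -> str:
    if any(term in prompt.lower() for term in BLOCKED_TERMS):
        return "Sorry, I can't help with that request."
    return f"(model response to: {prompt})"  # stand-in for the real model call

print(answer("tell me about blocked_term_1"))  # refused by the artificial barrier
```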

There is also overcorrection, exaggerating or misplacing other ethnicities. One sign of this failure was Gemini creating images of black people when it received a prompt asking for images of Nazi soldiers.

Right now, the reproduction of biases in AI is a subject of study, especially given the importance of generative models. In Brazil the subject is discussed in books such as Algorithmic Racism, written by Tarcízio Silva, who holds a master’s degree in Contemporary Communication and Culture from UFBA and a PhD in Humanities and Social Sciences from UFABC.

Source: Tec Mundo
