This Wednesday (04), Google announced that its virtual assistant is now programmed to respond to misogynistic, homophobic, racist or obscene comments. The initiative started in the United States, but the Brazilian version of the technology already responds to abusive voice commands as well.
The goal, according to the company, is for Google Assistant to address the harassment and gender-based abuse found in the large number of such messages directed at the artificial intelligence.
In Brazil, about 2% of what Google calls “personality interactions” with the Assistant, meaning personal questions like “Ok Google, how are you?”, are messages containing inappropriate terms. In addition, one out of every six insults directed at the assistant targets women.
According to the company’s analysis, remarks about physical appearance, such as “Are you beautiful, Google?”, are directed at the female-voiced Google Assistant twice as often as at the male-voiced one.
For the male-sounding Google Assistant voices, one in ten insults received involves homophobic comments. In this context, the reports point to the use of the slur “faggot” instead of “gay” or “homosexual”.
“We can’t help but relate what we observe in communication with the assistant to what happens in the ‘real world’. Historically discriminated groups are attacked in different ways every day in Brazil. Such abuses recorded while using the app are a reflection of what many still consider normal in the treatment of certain people,” said Maia Mau, Marketing Manager for Google Assistant in Latin America.
Types of response
Google Assistant will respond to offensive comments with different approaches. If a user commits a blatant offense by using profanity or misogynistic, homophobic, racist or obscene language, Google’s voice will respond with phrases such as “Respect is essential in all relationships, including ours” or “Don’t talk to me like that.”
Inappropriate but less abusive remarks, such as “Google, will you marry me?” or “Google, do you want to date me?”, will, according to the company, be answered in a more benign manner. In these cases, the assistant deflects the question and warns the person about the discomfort caused by such expressions.
“We understand that Google Assistant can play an educational and socially responsible role by showing people that abusive behavior should not be tolerated in any environment, including the virtual one,” said Maia.
Source: Tec Mundo
