Major digital services publish and host huge volumes of files every day. Unfortunately, among the material published or stored on these platforms there is also offensive, illegal and criminal content.
Google is one of the companies that must find ways to deal with one of these problems: child sexual abuse material, also known by the abbreviation CSAM.
To combat the publication and spread of such content online, the company has a number of measures in place to prevent, detect, remove and report material deemed offensive. Below are some of the actions the tech giant has already taken against CSAM across its various products.
Google’s tools against child abuse on the internet
To combat child abuse content on its platforms, the company uses a mix of "state-of-the-art automatic detection tools" and "specially trained" human reviewers.
The first obstacle to publishing such material on a Google service is a suite of mechanisms that detect images or videos considered CSAM on YouTube and in the search engine.
Among these are "artificial intelligence classifiers and hash matching", the latter meaning that a unique identifier is computed for each file and compared against the identifiers of material that has already been banned.
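The hash-matching idea can be sketched in a few lines. This is a simplified illustration, not Google's actual system: production tools rely on robust perceptual hashes (such as PhotoDNA-style fingerprints) so that resized or slightly edited copies still match, whereas this sketch uses a plain cryptographic hash. The hash set and function names below are hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of material already banned.
# (This entry is simply the SHA-256 of the bytes b"test".)
BANNED_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    """Compute a unique identifier for a file's contents."""
    return hashlib.sha256(data).hexdigest()

def is_known_banned(data: bytes) -> bool:
    """Compare the file's fingerprint against the banned set."""
    return fingerprint(data) in BANNED_HASHES

print(is_known_banned(b"test"))   # True: its hash is in the set
print(is_known_banned(b"other"))  # False: no match
```

The key design point is that only fingerprints, never the original files, need to be stored and shared between platforms, which is what makes industry-wide matching of known material practical.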
Secondly, if such material is indexed by the search engine, Google blocks the targeted addresses for content that "appears to sexualize, threaten or abuse children".
In addition to the results filter, if the search terms relate to this topic, the service also displays an alert about the illegality of this type of content.
This message also includes a step-by-step guide on how to report the content to specialized bodies: in Brazil, for example, this is SaferNet. In the company's other services, including Drive and Gmail, you can access the Help Center to find the necessary guidance and open a support ticket.
The company also sends material to the National Center for Missing and Exploited Children (NCMEC). This specialized body is responsible for forwarding the report to the authorities and agencies in charge of investigating and combating the perpetrators.
From January to July 2024 alone, Google reported over 2.8 million pieces of content to NCMEC, covering material from the search engine and YouTube. Of this total, almost 600 thousand were CyberTipline reports, which contain additional information about serious cases of abuse and the production of criminal content.
Google says it also maintains partnerships with NGOs and child protection coalitions. These alliances aim to share the company's technical expertise and its "child safety toolkit".
Based on two APIs, the Content Safety API for images and CSAI Match for videos, this service analyzes millions of images and videos in partners' databases to make other platforms safer. Customers include big names in the industry such as Meta, Reddit and Adobe.
Finally, the company also maintains a Transparency Report detailing its efforts to combat CSAM, with updated data and courses of action.
Violated Reality 3: Sexual Predators
TecMundo will bring this reality to the big screen. The trailer for the documentary Violated Reality 3: Sexual Predators is now available. Watch below:
Source: TecMundo
I am a passionate and hardworking journalist with an eye for detail. I specialize in the field of news reporting, and have been writing for Gadget Onus, a renowned online news site, since 2019. As the author of their Hot News section, I’m proud to be at the forefront of today’s headlines and current affairs.