Instagram helps promote a large network of users linked to pedophilia. According to an investigation by The Wall Street Journal, Stanford University and the University of Massachusetts Amherst, the social network's algorithms connect buyers and sellers of child sexual abuse material through its recommendations. Until recently, the platform also allowed explicit pedophilia-related hashtags.
The investigators found that Instagram, owned by Meta, allowed hashtags such as #preteensex, which were linked to accounts promoting the sale of child sexual abuse material. The profiles often claim to be run by the children themselves.
The Stanford Internet Observatory used hashtags like these to find 405 sellers of what the researchers called "self-generated" child sexual material, meaning accounts allegedly run by the children themselves. Of these seller accounts, 112 had a combined 22,000 unique followers. Some profiles claimed to belong to children as young as 12.
The researchers also created test accounts to probe how the algorithms behave. Simply viewing one of these seller accounts was enough for Instagram to immediately suggest other alleged sellers and buyers of child sexual material. After following a handful of those recommendations with one of the test accounts, its feed filled with content that sexualizes minors.
These Instagram pedophile networks often post a "menu" of content. Some invite buyers to commission specific acts, and some list prices for videos of children hurting themselves. Others offer in-person "meetings."
Hundreds of thousands of Instagram accounts about pedophilia
"That a team of three academics with limited access could find such a large network should set off alarms at Meta," Alex Stamos, head of the Stanford Internet Observatory and Meta's chief security officer until 2018, told The Wall Street Journal. He explained that the company has far more effective tools for identifying these pedophile networks than outsiders do. "I hope the company reinvests in human investigators," he added.
In response to the investigation, Meta admitted that it had problems policing this type of illegal material and created an internal task force. It also said that over the past two years it had dismantled 27 networks linked to pedophilia and was planning further takedowns.
Also as a consequence of the study, Instagram blocked thousands of hashtags that sexualize children, some with millions of posts. The company additionally said it had restricted its recommendation systems from suggesting search terms associated with child sexual abuse.
The sale of child sexual abuse material is just one manifestation of a much larger network. Other Instagram accounts, for example, regularly post memes in favor of pedophilia.
Meta employees who have worked on Instagram's child safety initiatives estimate the number of users following this content in the hundreds of thousands, if not millions. A Meta representative said that in January of this year alone the company deleted 490,000 accounts for violating its child safety policies.
Even so, Instagram allowed users to search for terms that its own algorithms know may be associated with this type of illegal content. In those cases, a pop-up window warned: "These results may contain images of child sexual abuse." The screen offered two options: "Get resources" and "See results anyway." Instagram has since removed the latter option but has not explained why it existed in the first place.
Twitter, TikTok and other networks
The Stanford team found 128 accounts offering to sell child sexual abuse material on Twitter, less than a third of the number associated with pedophilia on Instagram. The researchers noted that Twitter did not recommend such accounts to the same extent as Instagram, and removed them much faster.
TikTok is a network where "this type of content does not appear to proliferate," according to the study. Snapchat, meanwhile, is used primarily for direct messaging, which is not conducive to building networks.
"The most important platform for these networks of buyers and sellers is Instagram," the report stressed. The social network is one of the largest in the world, with over 1.3 billion users.
The National Center for Missing and Exploited Children, a US non-profit organization, received 31 million reports of child sexual abuse material in 2022, 47% more than two years earlier. 85% of the reports the center received involved Meta platforms; of that total, 5 million came from Instagram.
The researchers also warned that Instagram often serves as a gateway to other places on the internet where child sexual abuse is more overt. The social network said that, according to its internal statistics, child exploitation appears in fewer than one in every 10,000 posts viewed.
EU calls for urgent action
The European Commission on Thursday demanded that Meta CEO Mark Zuckerberg "act immediately" to protect children on Instagram. "Meta's voluntary code on child protection does not seem to be working. Mark Zuckerberg must now explain himself and act immediately," European Commissioner for the Internal Market Thierry Breton tweeted.
The official said he plans to meet with Zuckerberg on June 23 at Meta's headquarters in Menlo Park, California. Instagram, like other tech companies, was designated in April as a "very large online platform" under the European Union's new Digital Services Act. This means it is now obliged to report on key aspects of its internal operations with greater transparency and to guarantee safeguards for its users.
These "large platforms" have four months from their designation to fulfill the law's obligations, including a first annual risk assessment. "After August 25, under the Digital Services Act, Meta will have to demonstrate measures or face heavy sanctions," Breton recalled.
The new regulation establishes fines of up to 6% of a company's global turnover and, as a last resort, temporary suspension of the service.
Source: Hiper Textual
I’m Ben Stock, a highly experienced and passionate journalist with a career in the news industry spanning more than 10 years. I specialize in writing content for websites, including researching and interviewing sources to produce engaging articles. My current role is as an author at Gadget Onus, where I mainly cover the mobile section.