Meta's algorithm connected pedophiles and guided them to like-minded content.
A comprehensive investigation by the Wall Street Journal and the Stanford Internet Observatory reveals that Meta-owned Instagram has been home to a massive, organised network of pedophiles.
The WSJ highlights that Instagram’s own algorithms were actively promoting pedophile content to other pedophiles, while the pedophiles themselves used coded emojis as signals, such as a picture of a map or a slice of cheese pizza.
"Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests", the Journal and the academic researchers found.
The pedophilic accounts on Instagram mix brazenness with superficial efforts to veil their activity, researchers found. Certain emojis function as a kind of code, such as an image of a map—shorthand for “minor-attracted person”—or one of “cheese pizza,” which shares its initials with “child pornography,” according to Levine of UMass. Many declare themselves “lovers of the little things in life.” -WSJ
According to the researchers, Instagram allowed pedophiles to search for content with explicit hashtags such as #pedowhore and #preteensex, which were then used to connect them to accounts that advertise child-sex material for sale from users going under names such as “little slut for you.”
Sellers of child porn often convey the child’s purported age, saying they are “on chapter 14,” or “age 31,” with an emoji of a reverse arrow.
“That a team of three academics with limited access could find such a huge network should set off alarms at Meta,” said Alex Stamos, the head of the Stanford Internet Observatory and Meta’s chief security officer until 2018, adding that the company has far more effective tools to ‘map’ its pedophile network than outsiders do. “I hope the company reinvests in human investigators,” he added.
Researchers investigating the network set up test accounts within the pedophile network, which were immediately inundated with “suggested for you” recommendations of child-sex content, as well as accounts linking to off-platform trading sites. Through its algorithms, Instagram was effectively encouraging pedophiles to view more and more illegal content.
'Underage-sex-content creators and buyers are just a corner of a larger ecosystem devoted to sexualized child content. Other accounts in the pedophile community on Instagram aggregate pro-pedophilia memes, or discuss their access to children. Current and former Meta employees who have worked on Instagram child-safety initiatives estimate the number of accounts that exist primarily to follow such content is in the high hundreds of thousands, if not millions'. -WSJ
What’s more, Meta accounted for 85% of child pornography reports filed with the National Center for Missing & Exploited Children, according to the report. However, Meta has struggled with these efforts more than other platforms both because of weak enforcement and design features that promote content discovery of legal as well as illicit material, Stanford found.
“Instagram’s problem comes down to content-discovery features, the ways topics are recommended and how much the platform relies on search and links between accounts,” said David Thiel, chief technologist at the Stanford Internet Observatory. “You have to put guardrails in place for something that growth-intensive to still be nominally safe, and Instagram hasn’t.”
Meta acknowledged to the Journal that it had received a flood of reports of child sexual exploitation and failed to act on them, blaming a software glitch that prevented a substantial portion of user reports from being processed.
However, whilst Meta struggles with the banning of pedophilic content, it has no such trouble with the banning of news content on lockdowns, vaccine deaths, fake pandemics or climate change hoaxes. Odd that.