Instagram has a serious problem with child sexual abuse material, and Meta has created a task force to try to eradicate it. The latest report from Stanford University, which reveals that large groups trafficking in sexual content involving minors operate on the platform, has set off alarms at the company led by Mark Zuckerberg.
The conclusions of the Stanford University study are alarming: content related to the sexual abuse of minors is rampant across social networks. It happens on Instagram, but the situation is even more serious on platforms with private groups, such as Telegram and Discord. Photos and videos circulated by pedophiles have also been found on Twitter, where Musk has been removing barriers while laying off staff from the content moderation team.
Given the seriousness of the situation, Meta has decided to create an investigation team to prevent the dissemination of this type of material. The company wants to know who disseminates and sells content that violates the rights of minors through its platform, and how.
The report reveals that several groups created on Instagram openly advertise self-generated child sexual abuse material for sale. A market for this illegal material has formed through direct messages on the social network. Stanford researchers found that the app's recommendation algorithm has an unintended effect: these ads are more effective through the app's built-in chat.
Stanford's conclusion is blunt: "Due to the widespread use of hashtags, the relatively long life of seller accounts, and most of all, the powerful recommendation algorithm, Instagram serves as a key discovery mechanism for this specific community of buyers and sellers," the researchers explain.
The Stanford researchers estimate that the seller network on Instagram comprised between 500 and 1,000 accounts at the time the groups were found. As they explain, they began their investigation following a tip from the Wall Street Journal, which first reported on the findings.
Meta spokesman Andy Stone has pledged to fight this phenomenon: "Child exploitation is a horrible crime. We work fiercely to combat it on and off our platforms, and to support law enforcement in their efforts to apprehend and prosecute the criminals behind it," he explains. The company says it took down 27 abusive networks between 2020 and 2022, and in January it disabled nearly half a million accounts for violating its child safety policies.