Meta Launches Investigation into Instagram's Promotion of Child Sexual Abuse Material

TEHRAN (Tasnim) – Technology company Meta has initiated an investigation into reports suggesting that Instagram's algorithm is promoting child sexual abuse material.

Instagram's parent company formed a task force to examine the allegations after the Stanford Internet Observatory (SIO) revealed the existence of "large-scale communities" sharing pedophilic content on the platform.

The SIO, prompted by a tip from the Wall Street Journal, uncovered instances of child sexual abuse material (CSAM) on Instagram. The Journal's report, published on Wednesday, outlined how Instagram's recommendation algorithm facilitated connections within a vast network of sellers and buyers of the illegal material.

According to the SIO, Instagram's "suggested for you" feature also directed users to off-platform content sites. The SIO referred to Instagram as the "most important platform" for these networks.

In a blog post, Stanford's Cyber Policy Center stated, "Instagram has emerged as the primary platform for such networks, providing features that facilitate connections between buyers and sellers. Instagram's popularity and user-friendly interface make it a preferred option for these activities."

Instagram users were able to discover child abuse content through explicit hashtags, which Instagram has since blocked.

A spokesperson from Meta commented, "Child exploitation is a horrific crime. We're continuously investigating ways to actively defend against this behavior, and we set up an internal task force to investigate these claims and immediately address them."

Meta said it had dismantled 27 pedophile networks on Instagram over the past two years and removed 490,000 accounts for violating child safety policies in January alone.

The SIO also identified other social media platforms hosting similar content, although to a lesser extent. The organization called for a collective industry effort to restrict the production, discovery, and distribution of CSAM, and urged companies to allocate more resources towards proactively identifying and preventing abuse.

"Given the multi-platform nature of the problem, addressing it will require better information sharing about production networks, countermeasures, and methods for identifying buyers," stated the SIO.

The organization hopes that its research will assist both the industry and non-profit organizations in their efforts to eradicate child sexual abuse material from the internet.