The numbers are among revelations contained in a newly-unsealed court filing
Roughly 100,000 children on Facebook and Instagram are sexually harassed on the platforms every day, according to a legal filing reportedly unsealed Wednesday from a lawsuit the state of New Mexico filed last month against the platforms’ parent company, Meta.
The unsolicited messages received by these underage users include “pictures of adult genitalia,” The Guardian, TechCrunch and other outlets revealed. The reports cited internal company documents included in the filing, showing multiple Meta employees raised concerns about child exploitation on their platforms.
The social media giant has long been aware of sexual predators targeting its youngest users. One senior executive even testified before Congress last year about how his daughter was sexually solicited on Instagram – and about his own ultimately futile efforts to end the phenomenon.
Some employees worried the predator problem could have serious consequences for the company too, according to the filing, which describes a 2020 incident in which an Apple executive’s 12-year-old daughter received lewd messages on IG Direct – Instagram’s private-messaging platform analogous to Facebook Messenger.
“This is the kind of thing that pisses Apple off to the extent of threatening to remove us from the App Store,” a Meta employee wrote in one document.
While Meta acknowledged the risks posed by allowing children to receive unsolicited direct messages on Facebook and Instagram, the company declined to implement proposed safeguards and even stood in the way of adopting child-safety fixes because there was no money in it, according to New Mexico Attorney General Raul Torrez.
“For years, Meta employees tried to sound the alarm about how decisions made by Meta executives subjected children to dangerous solicitations and child exploitation,” Torrez told TechCrunch on Wednesday, accusing management of habitually making “decisions that put growth ahead of children’s safety.”
“While the company continues to downplay the illegal and harmful activity children are exposed to on its platforms, Meta’s internal data and presentations show the problem is severe and pervasive,” he continued, referring to the newly-unsealed filing.
The state’s lawsuit accused Meta of allowing Facebook and Instagram to devolve into “a marketplace for predators in search of children upon whom to prey,” and of failing to remove even reported child pornography and other abusive content. Some child exploitation material is “over ten times more prevalent on Facebook and Instagram than it is on Pornhub and OnlyFans,” the suit claimed.
While Meta responded to the suit by claiming it had spent “over a decade working on these issues,” the newly public documents show the company deliberately limited child-safety features even as it tried to lure more children and teens to its direct-messaging function. Executives internally admitted they were losing youth market share to platforms like Snapchat, and argued against scanning messages for “harmful content” lest kids view the practice as an invasion of their privacy.