The New Mexico attorney general said Tuesday that internal documents from Snap released as part of an ongoing lawsuit show how the company failed to address sextortion on its platform or to warn users about it.
The suit, which Attorney General Raúl Torrez originally filed in September, includes the text of some of Snap’s internal correspondence. He claims the correspondence, which NBC News has not seen firsthand, shows the company “ignoring reports of sextortion, failing to implement verifiable age-verification, admitting to features that connect minors with adults.”
“It is disheartening to see that Snap employees have raised many red flags that have continued to be ignored by executives,” Torrez said in a statement. “What is even more disturbing is that unredacted information shows that the addicting features on Snapchat were blatantly acknowledged and encouraged to remain active on the platform. I will always work to hold companies, like Snap and Meta, accountable to create a safer user experience.”
According to the suit, in November 2022, “Snap employees were discussing 10,000 user reports of sextortion each month,” which is most likely “a small fraction of this abuse.”
In addition, the suit alleges internal documents show that reports of grooming and sextortion made about specific users largely went unanswered or unaddressed. In January 2022, the suit says, an employee disclosed over Slack that “over 90% of account-level reports are ignored today and instead we just prompt the person to block the other person.”
In August 2022, another employee said grooming and sextortion reports were “continuing to fall through the cracks,” according to the suit. In response, another employee said they had identified a user who had been reported 75 times but remained active, the suit says. The account had allegedly mentioned “nudes, minors, and extortion,” yet no action was taken, according to the suit.
The suit also cited internal data collected by Snap that said that in 2022 a third of girls and 30% of boys on its platform had been exposed to unwanted contact. It alleged that last year, over half of Gen Z users said they or a friend had been targeted by people using false identities.
In addition, some former employees expressed frustration that the company “twiddled our thumbs and wrung our hands” rather than take quick action over the sale of drugs or the proliferation of child sexual abuse materials on its platform, according to the suit.
Snapchat and other social media platforms, like Instagram and TikTok, have been under intensified scrutiny from both the federal and state governments. While Instagram and TikTok have faced more criticism about how they affect their users’ mental health, Snapchat, which operates predominantly as a messaging platform, has been scrutinized for allowing minors to interact with adults, who could sexualize and exploit them. Leaders of Snap, TikTok, Meta, Discord and X all testified before Congress this year about child safety.
In a statement on its website Tuesday, Snap said it has built a platform with “safety guardrails” and “deliberate design choices” to protect minors.
“We continue to evolve our safety mechanisms and policies, from leveraging advanced technology to detect and block certain activity, to prohibiting friending from suspicious accounts, to working alongside law enforcement and government agencies, among so much more,” it said. “We care deeply about our work here and it pains us when bad actors abuse our service.”
The complaint and the cited documents echo safety concerns raised about Instagram after leaked internal documents from Meta showed the company knew the platform was dangerous for teen girls’ mental health.