Instagram says it’s rolling out a suite of new settings to fight sextortion, a type of blackmail that uses sexual images or conversations to pressure victims into paying money.
The crime is becoming more common, Instagram owner Meta says, with some organized groups pressuring teenagers into sharing nude photos. Meta said in a statement Thursday that the settings, such as hiding teens' follower lists from potential blackmailers, make it harder for criminals to connect with young people on the photo-sharing app and give teens more resources to learn the signs of a sex scam.
Instagram will also show teens in the United States, United Kingdom, Canada and Australia a video in their feeds about how to “spot sextortion scams,” the release says. The app’s “nudity protection” feature, which blurs potentially nude images and nudges teens before they send one, will be on automatically for teen accounts globally. Users can no longer screenshot ephemeral messages in the app’s chat — one way that scammers get hold of sensitive images, said Meta’s global head of safety, Antigone Davis, in an interview.
The update comes just weeks after Meta rolled out new “teen accounts” with extra protections and more visibility for parents. Facing outcry from parents and legislators about teen safety online, the company says it’s working with child safety experts to continually build out its features. Critics have pushed back, saying Meta uses incremental safety improvements to distract from larger issues — such as Instagram’s alleged negative effects on the health of children and teens.
Davis said sextortion is a growing problem on Instagram. Scammers befriend a victim, often claiming to live in the same country and be the same age, then ask for explicit photos. Once they have the images, they threaten to expose them and demand money to keep them private.
“What we’re seeing is a financial crime now, where people are doing this to extort money,” Davis said, adding that Instagram changes its features in response to evolving threats.
Part of this effort happens behind the scenes: Meta said it’s taken down 63,000 accounts associated with sextortion rings, including thousands of accounts from the same Nigeria-based collective.
Instagram has steadily added what it calls teen safety features over the past three years, but pressure on the company to address online harms, including sexual abuse and bullying, has still mounted. Last year, 41 states and D.C. sued Meta for allegedly building addictive features into its apps. A handful of states, including Florida and Ohio, have passed laws restricting social media use among teens. Some advocates and legislators, meanwhile, argue that tech giants like Meta should have to publicly share their internal research on teen safety and take legal responsibility for the content that appears on their apps.
“Meta has known sextortion is happening at scale for so long,” said Annie Seifullah, a lawyer who’s worked on more than a hundred sextortion cases, about half of which involved Meta apps. “This feels like too little, too late.”
Sextortion is getting easier as criminals use automation to contact many potential victims at once, Seifullah said.
This year, the FBI warned that young men between 14 and 17 are increasingly being targeted by sextortion scammers — many of whom are located outside the United States. Financially motivated sextortion incidents involving minors increased 20 percent between October 2022 and March 2023 compared with the same time period the year before, according to the FBI. Those crimes can have dangerous consequences: From October 2021 to March 2023, federal officials identified at least 12,600 victims. Twenty of the cases were suicides, according to the FBI.
How Meta handles intimate images of teens has been thrust into the spotlight following news reports about how its services have failed to protect vulnerable users. This year, reports in the New York Times and the Wall Street Journal chronicled how men paid to see explicit images of young girls — who were often too young to have their own accounts — through accounts managed by their parents. In 2023, the Stanford Internet Observatory found large networks of accounts on Twitter and Instagram, apparently operated by minors, that openly advertised self-generated child sexual abuse material for sale.
Meta’s newly launched teen accounts on Instagram have faced criticism from activists and regulators who said they did not change fundamental aspects of the photo-sharing app that encouraged younger users to keep checking Instagram in the first place, such as “like” counts on their posts and an endless stream of content in their feeds. While Meta said it would use verification tools to proactively find young users who lie about their age, critics say teens will find ways around Instagram’s new rules.
In 2023, Meta funded Take It Down — a program run by the National Center for Missing & Exploited Children that aims to let young people proactively scan a select group of websites for their images and have them taken down. Instagram has also tried to limit teens' interactions with adults who appear to be engaging in suspicious behavior.