A coalition of more than a dozen states on Tuesday filed individual lawsuits against TikTok, charging that the social media platform is deliberately designed to addict children and is harmful to their mental health.
The bipartisan group, led by the attorneys general for California and New York, comprises 13 states and the District of Columbia. They charge that TikTok violated state laws by falsely claiming its platform is safe for young people.
The suits allege that many of the video app's young users are struggling with poor mental health and body image issues because of the platform's addictive features, and that some are being injured or killed by dangerous TikTok challenges created and promoted on the app.
The lawsuits seek to stop those practices and impose financial penalties on the social media company.
"TikTok claims that their platform is safe for young people, but that is far from true," New York Attorney General Letitia James said in a statement, pointing to the deaths of young people caused by TikTok challenges and the mental-health problems suffered by others.
"Today, we are suing TikTok to protect young people and help combat the nationwide youth mental health crisis," she said. "Kids and families across the country are desperate for help to address this crisis and we are doing everything in our power to protect them."
TikTok released a statement saying that it strongly disagrees with the claims, adding that it believes many of them to be "inaccurate and misleading."
It argued that it has worked hard to protect teens, saying that it has voluntarily put in place "robust safeguards" and proactively removed suspected underage users.
"We've endeavored to work with the attorneys general for over two years, and it is incredibly disappointing they have taken this step rather than work with us on constructive solutions to industry wide challenges," TikTok said.
The suing states say that's not the case and that TikTok's efforts have been more about public relations than protecting kids. Like other social media apps, TikTok is designed to keep users endlessly scrolling, with an algorithm feeding each user exactly what they want to see.
They also noted that the app's highlighted "likes" and comments act as a form of social validation that can affect teens' self-esteem, as can its beauty filters, which alter a user's appearance.
They said the beauty filters have been especially harmful to young girls, who come to believe they don't look good unless they use the filters to edit their features. That belief, the states argue, puts them at risk for body image issues, eating disorders, body dysmorphia and other health problems.
The lawsuits reflect a growing backlash against social media companies over how they market themselves to, and treat, their youngest users.
Last year, a group of more than 30 states filed suit against Meta, the parent company of Facebook and Instagram, claiming that its platforms are also designed to be addictive and are damaging to kids' mental health. That case is still pending, though Meta has since rolled out sweeping safety updates designed to protect teens.
TikTok is also facing a U.S. Department of Justice lawsuit accusing it and its China-based parent company, ByteDance, of violating children's privacy laws. In that suit, the government says the platform knowingly allowed children under 13 to create and use accounts without parental consent, collected "extensive data" from those children, and failed to delete the accounts and data even when parents asked it to.
And TikTok still faces a U.S. ban starting in January unless ByteDance sells it to a buyer deemed fit by U.S. officials.
Lawmakers on both sides of the aisle have said that TikTok could be used by China's government to spy on Americans or otherwise threaten national security. TikTok has denied those allegations.
Some free speech and digital rights groups also oppose the ban, arguing that what's really needed is a set of comprehensive digital privacy laws to protect Americans' personal information, rather than a measure singling out TikTok. If the government prevails, however, TikTok could be removed from U.S. app stores.