
Character AI Imposes New Safety Rules After Teen User's Death by Suicide

AI-powered chatbot platform Character AI is introducing “stringent” new safety features following a lawsuit filed by the mother of a teen user who died by suicide in February.

The measures will include “improved detection, response and intervention related to user inputs that violate our Terms or Community Guidelines,” as well as a time-spent notification, a company spokesperson told Decrypt, noting that the company could not comment on pending litigation.

However, Character AI did express sympathy for the user’s death, and outlined its safety protocols in a blog post Wednesday. 

“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” Character.ai tweeted. “As a company, we take the safety of our users very seriously.”

— Character.AI (@character_ai) October 23, 2024

In the months before his death, 14-year-old Florida resident Sewell Setzer III had grown increasingly attached to a user-generated chatbot named after Game of Thrones character Daenerys Targaryen, according to the New York Times. He often interacted with the bot dozens of times per day and sometimes exchanged romantic and sexual content.

Setzer communicated with the bot in the moments leading up to his death and had previously shared thoughts of suicide, the Times reported. 

Setzer’s mother, lawyer Megan L. Garcia, filed a lawsuit Tuesday seeking to hold Character AI and its founders, Noam Shazeer and Daniel De Freitas, responsible for her son’s death. Among other claims, the suit alleges that the defendants “chose to support, create, launch, and target at minors a technology they knew to be dangerous and unsafe,” according to the complaint. Garcia is seeking an unspecified amount of damages.

Google LLC and Alphabet Inc. are also named as defendants in the suit. In August, Google rehired Shazeer and De Freitas, both of whom had left the tech giant in 2021 to found Character AI, as part of a $2.7 billion deal that also included licensing the chatbot startup's large language model.

Beyond the newly announced features, Character AI said it has "implemented numerous new safety measures over the past six months, including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation." It will also alter its models "to reduce the likelihood of encountering sensitive or suggestive content" for users under 18 years old.

Character AI is one of many AI companionship apps on the market, which often have less stringent safety guidelines than conventional chatbots like ChatGPT. Character AI allows users to customize their companions and direct their behavior. 

The lawsuit, which comes amid growing concern among parents about the psychological impact of technology on children and teenagers, claims that Setzer's attachment to the bot had a negative effect on his mental health. Setzer was diagnosed with mild Asperger's syndrome as a child and had recently been diagnosed with anxiety and disruptive mood dysregulation disorder, the Times reported.

The suit is one of several moving through the courts that are testing legal protections provided to social media companies under Section 230 of the Communications Decency Act, which shields them from liability associated with user-generated content. TikTok is petitioning to rehear a case in which a judge ruled that it could be held liable after a 10-year-old girl died while trying to complete a “blackout challenge” that she saw on the app. It's the latest problem for Character AI, which came under fire last month for hosting a chatbot named after a murder victim. 

Source: decrypt.co
