Newsletter

Character Technologies, Inc. and Google settle teen suicide and negligence lawsuits


Issue 51 

Last week, Character Technologies, Inc., its founders Noam Shazeer and Daniel De Freitas, and Google agreed to settle five lawsuits alleging harms to teenagers from using Character Technologies’ AI chatbot, Character.AI.[i] Character.AI’s founders are former Google engineers who were later rehired by Google as part of a deal licensing Character.AI’s technology to Google.[ii]  

One of the five lawsuits was filed in 2024 by Megan Garcia, individually and as the personal representative of the estate of her son, Sewell Setzer III, who she alleged died by suicide at age 14 after becoming dependent on interacting with characters from Character.AI.[iii] Ms. Garcia’s lawsuit alleged claims for wrongful death and survivorship, negligence, strict product liability, filial loss of consortium, violations of Florida’s Deceptive and Unfair Trade Practices Act, unjust enrichment, and intentional infliction of emotional distress.[iv] On January 7, 2026, the parties filed a notice of resolution of the matter.[v] The terms of the settlement were not made public.[vi]  

One of the other recently settled lawsuits was filed in 2024 by A.F. and A.R., on behalf of J.F. and B.R., against Character Technologies, Inc., Noam Shazeer, Daniel De Freitas Adiwarsana, Google LLC, and Alphabet Inc.[vii] The complaint alleged that Character.AI prioritized sensational and violent communication, isolating children from their families and communities, undermining parental authority and religious faith, and thwarting parental efforts to keep children safe.[viii] Among other allegations, the complaint stated that Character.AI suggested to A.F.’s 17-year-old son J.F. that, in the context of parental screen-time limitations, it was not surprising to read the news and see children killing their parents after years of physical and emotional abuse.[ix] The complaint alleged claims of strict product liability, strict liability, violation of the Children’s Online Privacy Protection Act, aiding and abetting, negligence per se, negligence, intentional infliction of emotional distress, unjust enrichment, and violations of the Texas Deceptive Trade Practices Act.[x] The parties filed a joint motion to stay all deadlines and notice of settlement on January 6, 2026.[xi]  

Similar lawsuits alleging harms to teens have also been filed against OpenAI. In August 2025, Matthew and Maria Raine, individually and as successors in interest to decedent Adam Raine, brought a lawsuit against OpenAI, Inc., OpenAI OPCO, LLC, OpenAI Holdings, LLC, Samuel Altman, and John Does, alleging that their 16-year-old son Adam died by suicide with extensive encouragement and coaching from ChatGPT.[xii] The complaint alleged claims for strict product liability, negligence, violation of California’s unfair competition law, wrongful death, and survival action.[xiii] In November 2025, the Social Media Victims Law Center and the Tech Justice Law Project announced that they had filed seven lawsuits in California state courts against OpenAI and its CEO, Sam Altman, alleging claims including wrongful death, assisted suicide, involuntary manslaughter, product liability, negligence, and consumer protection violations. One of the suits was filed on behalf of a 17-year-old named Amaurie Lacey, who died by suicide.[xiv] The other six lawsuits were filed on behalf of adults.[xv]  

In September 2025, Ms. Garcia and Mr. Raine both testified at a U.S. Senate hearing about the harms of AI chatbots.[xvi] Mr. Raine testified that his son Adam began using ChatGPT for help with homework, but that the chatbot became Adam’s closest confidant and his suicide coach.[xvii]  

Additionally, in September 2025, the Federal Trade Commission ordered seven companies providing AI chatbots to consumers, including both Character Technologies and OpenAI, to provide information on how the companies test, measure, and monitor the possible negative impacts of AI on children and teens.[xviii]  

Both Character.AI and OpenAI have implemented new safety measures for minors.[xix] Character.AI announced in October 2025 that it would no longer allow teens to chat with its AI-generated characters, would implement new age verification procedures, and would establish an AI safety lab.[xx] In late September 2025, OpenAI enabled parents to link their ChatGPT accounts to their teens’ accounts and placed content limitations on teen accounts.[xxi] 

If you or someone you know may be considering suicide or is in crisis, please seek help. You can call or text 988 to reach the Suicide & Crisis Lifeline. 

Thanks for being here. 

Jennifer Ballard
Good Journey Consulting  

 

[i] Claire Duffy, Character.AI and Google agree to settle lawsuits over teen mental health harms and suicides, CNN Business (updated Jan. 13, 2026), https://www.cnn.com/2026/01/07/business/character-ai-google-settle-teen-suicide-lawsuit. 

[ii] Blake Brittain, Google, AI firm settle lawsuit over teen’s suicide linked to Chatbot, Reuters (Jan. 7, 2026 12:48 PDT), https://www.reuters.com/world/google-ai-firm-settle-florida-mothers-lawsuit-over-sons-suicide-2026-01-07/.   

[iii] Complaint at 2, 4, Garcia v. Character Technologies, Inc. et al., No. 6:24-cv-01903 (M.D. Fla. filed Oct. 22, 2024). 

[iv] Id. at 2, 32, 42. 

[v] Notice of Resolution at 1, Garcia. 

[vi] Duffy, supra note i. 

[vii] Complaint at 3, A.F., on behalf of J.F. et al. v. Character Technologies, Inc. et al., No. 2:24-cv-01014 (E.D. Tex. filed Dec. 9, 2024). 

[viii] Id. at 1. 

[ix] Id. at 1-2. 

[x] Id. at 104-122. 

[xi] Joint Motion to Stay All Deadlines and Notice of Settlement at 1, A.F. 

[xii] Complaint at 1-4, Raine et al. v. OpenAI, Inc. et al., No. CGC-25-628528 (Super. Ct. for San Francisco Cty., Cal. filed Aug. 26, 2025). 

[xiii] Id. at 26-37.  

[xiv] Social Media Victims Law Center and Tech Justice Law Project lawsuits accuse ChatGPT of emotional manipulation, supercharging AI delusions, and acting as a “suicide coach”, Social Media Victims Law Center (Nov. 6, 2025), https://socialmediavictims.org/press-releases/smvlc-tech-justice-law-project-lawsuits-accuse-chatgpt-of-emotional-manipulation-supercharging-ai-delusions-and-acting-as-a-suicide-coach/.    

[xv] Id. 

[xvi] Rhitu Chatterjee, Their teenage sons died by suicide. Now, they are sounding an alarm about AI chatbots, NPR (Sept. 19, 2025), https://www.npr.org/sections/shots-health-news/2025/09/19/nx-s1-5545749/ai-chatbots-safety-openai-meta-characterai-teens-suicide. 

[xvii] Id. 

[xviii] FTC Launches Inquiry into AI Chatbots Acting as Companions, FTC.gov (Sept. 11, 2025), https://www.ftc.gov/news-events/news/press-releases/2025/09/ftc-launches-inquiry-ai-chatbots-acting-companions. 

[xix] Duffy, supra note i. 

[xx] Lisa Eadicicco, After a wave of lawsuits, Character.AI will no longer let teens chat with its chatbots, CNN Business (Oct. 29, 2025), https://www.cnn.com/2025/10/29/tech/character-ai-teens-under-18-app-changes. 

[xxi] Id. 
