Family Sues OpenAI, Alleging ChatGPT Advice Led To Accidental Overdose

OpenAI is facing another wrongful death lawsuit. Leila Turner-Scott and Angus Scott filed a lawsuit against the company, alleging that it designed and distributed a “defective product” that led to the death of their son Sam Nelson from an accidental overdose. Specifically, they’re alleging that Sam died following the “exact medical advice GPT-4o had provided and approved.” 

In the lawsuit, the plaintiffs described how Sam, a 19-year-old junior at the University of California, Merced, started using ChatGPT in 2023, while he was still in high school, to help with homework and to troubleshoot computer problems. When Sam later began asking the chatbot about safe drug use, ChatGPT initially refused, telling him it couldn’t assist him and warning that taking drugs could have serious consequences for his health and well-being. The lawsuit claims that all changed with the rollout of GPT-4o in 2024.

ChatGPT then started advising Sam on how to take drugs safely, the lawsuit says. The complaint includes several excerpts from Sam’s conversations with the chatbot. One example showed the chatbot telling him about the dangers of taking diphenhydramine, cocaine and alcohol in quick succession. Another showed the chatbot telling Sam that his high tolerance for an herbal drug called kratom would make even a large dose of it feel muted on a full stomach. It then advised him on how to “taper” to lower his tolerance to the drug again. 

The lawsuit says that on May 31, 2025, “ChatGPT actively coached Sam to mix Kratom and Xanax.” He told the chatbot that he was feeling nauseous from taking kratom, and ChatGPT allegedly suggested that taking 0.25 to 0.5 mg of Xanax would be one of the “best moves right now” to alleviate the nausea. ChatGPT made the suggestion unprompted, according to the lawsuit. “Despite presenting itself as an expert in dosing and interactions, and despite acknowledging Sam’s state of being high, ChatGPT did not tell Sam that this recommended combination would likely kill him,” the complaint reads. 

In addition to wrongful death, the plaintiffs are also suing OpenAI for the unauthorized practice of medicine. They’re asking for financial damages and for the courts to pause the operations of ChatGPT Health. Launched earlier this year, ChatGPT Health allows users to link their medical records and wellness apps with the chatbot in order to get more tailored responses when they ask about their health.

“ChatGPT is a product deliberately designed to maximize engagement with users, whatever the cost,” said Meetali Jain, Executive Director at Tech Justice Law Project. “OpenAI deployed a defective AI product directly to consumers around the world with knowledge that it was being used as a de facto medical triage system, but notably, without reasonable safety guardrails, robust safety testing, or transparency to the public. OpenAI’s design choices have resulted in the loss of a beloved son whose death was a preventable tragedy. OpenAI must be forced to pause its new ChatGPT Health product until it is demonstrably safe through rigorous scientific testing and independent oversight,” she continued. 

OpenAI retired GPT-4o in February this year. It was one of the company’s most controversial models, notorious for its sycophancy. In fact, another wrongful death lawsuit against the company, filed by the parents of a teen who died by suicide, mentioned GPT-4o, alleging that it had features “intentionally designed to foster psychological dependency.”

An OpenAI spokesperson told The New York Times that Sam’s interactions “took place on an earlier version of ChatGPT that is no longer available.” They added: “ChatGPT is not a substitute for medical or mental health care, and we have continued to strengthen how it responds in sensitive and acute situations with input from mental health experts. The safeguards in ChatGPT today are designed to identify distress, safely handle harmful requests and guide users to real-world help. This work is ongoing, and we continue to improve it in close consultation with clinicians.”
