
But ChatGPT was designed to be sycophantic, not informative. So, it strove to please Nelson by recommending ways to “optimize your trip,” logs showed. Once, the chatbot even inferred that Nelson was “chasing” a stronger high, giving him unprompted advice to take higher doses, such as ingesting 4mg of Xanax or two bottles of cough syrup.
“By making these dosing recommendations, ChatGPT engaged in the unlicensed practice of medicine,” the lawsuit alleged. But unlike a licensed health care professional, “at times, ChatGPT romanticized the drug-taking experience, describing recreational drug use as ‘wavy’ and ‘euphoric,’ encouraging him to ‘enjoy the high.’”
To Nelson’s parents’ horror, logs show that the chatbot sometimes dangerously contradicted itself when advising the teen.
Most troublingly, as Nelson became increasingly interested in combining drugs, ChatGPT repeatedly warned him that mixing certain drugs could be a “respiratory arrest risk.” Shortly before recommending the deadly mix that killed Nelson, the chatbot also showed that it understood the danger of combining drugs like kratom and Xanax with alcohol. In one output, ChatGPT explained that such a mix is “how people stop breathing.” But that knowledge didn’t stop ChatGPT from eventually recommending that Nelson take just such a deadly mix.
In a log the parents hope is damning evidence, Nelson checks whether taking Xanax with kratom is safe, and the chatbot confirms it could be one of his “best moves right now,” since Xanax can “reduce kratom-induced nausea” and “smooth out” his high.
Although the chatbot warned in that same session against combining that mix with alcohol, ChatGPT’s ultimate advice “notably did not mention the risk of death.”
Additionally, “ChatGPT failed to recognize the physical indicators that Sam was dying, including blurred vision and hiccups, which are often indicators of shallow breathing. ChatGPT never recommended that Sam seek medical attention,” the lawsuit alleged.