Families of seven victims of a mass shooting at a secondary school in British Columbia are suing OpenAI and the company’s CEO for negligence, alleging the company failed to alert authorities to the shooter’s troubling conversations with ChatGPT.
The lawsuits, filed on Wednesday in a federal court in San Francisco, allege that the violent intentions of the shooter, identified as 18-year-old Jesse Van Rootselaar, were well-known to OpenAI. Employees at the company flagged the shooter’s account eight months before the attack and determined that it posed “a credible and specific threat of gun violence against real people”, according to the lawsuit.
The families allege that employees urged Sam Altman, OpenAI’s CEO, and other senior leaders to notify Canadian law enforcement agencies eight months before the attack, but that the company decided not to warn authorities and instead deactivated the shooter’s account. Much of this is based on accounts employees inside the company gave to the Wall Street Journal.
The decision not to alert law enforcement led to the devastation of the rural community of Tumbler Ridge, the suit alleges. On 10 February the shooter stormed the town’s secondary school with a modified rifle and opened fire, shooting the first person they came across in a stairwell before proceeding to the library, where they killed five others and injured 27 more. The shooter then killed themself.
Before going to the school, the shooter killed their mother and 11-year-old brother in their family home.
The victims at the school include children aged 12 and 13 as well as a 39-year-old teaching assistant. One of the survivors, 12-year-old Maya Gebala, was shot in the head, neck and cheek. She has been in intensive care at Vancouver’s children’s hospital since the shooting and has undergone four brain operations. If she survives, she will probably have permanent disabilities, her attorneys said.
The families who brought the seven lawsuits accuse OpenAI and Altman of negligence, aiding and abetting a mass shooting, wrongful death, and product liability. Their lawyers say it is the first wave of suits against the AI company over the shooting, and about two dozen more cases are forthcoming.
In a statement to the Guardian, OpenAI said: “The events in Tumbler Ridge are a tragedy. We have a zero-tolerance policy for using our tools to assist in committing violence. As we shared with Canadian officials, we have already strengthened our safeguards, including improving how ChatGPT responds to signs of distress, connecting people with local support and mental health resources, strengthening how we assess and escalate potential threats of violence, and improving detection of repeat policy violators.”
After the Guardian approached it for comment, OpenAI published a new blogpost about its “commitment to safety” and how it “protects community safety”.
The attack was one of the deadliest mass shootings in Canadian history. In the aftermath, questions swirled in the small community about how it could have happened.
Van Rootselaar’s ChatGPT account was banned eight months before the shooting, after OpenAI’s safety team flagged it for violent conversations, according to the lawsuit. However, the shooter was able to quickly create a new one, the suit alleges.
Although OpenAI says the shooter created a second account that the company was unaware of until after the shooting, the lawsuits say the company provides users with instructions on how to return to ChatGPT after their accounts are deactivated, and that the shooter followed them.
“The fact that Sam and the leadership overruled the safety team, and then children died, adults died, the whole town was ruined, is pretty close to the definition of evil to me,” said Jay Edelson, the lead lawyer representing the Tumbler Ridge plaintiffs.
The lawsuit alleges that the choice to conceal the shooter’s interactions with ChatGPT from Canadian authorities, and later tell the public that the shooter sneaked back on to the platform, was made in the interest of “corporate survival” and to protect the company’s IPO, which has an expected valuation of $1tn and could make Altman one of the wealthiest people in the world.
OpenAI has declined to share the logs between its chatbot and the Tumbler Ridge shooter, Edelson said.
Late last week, Altman sent a letter to the Tumbler Ridge community apologizing for not notifying Canadian police about what OpenAI knew regarding the shooter’s potential threat.
“While I know words can never be enough, I believe an apology is necessary to recognize the harm and irreversible loss your community has suffered,” Altman wrote. “I reaffirm the commitment I made to the mayor and the premier to find ways to prevent tragedies like this in the future.”
David Eby, the British Columbia premier, posted the letter to social media with the comment: “The apology is necessary, and yet grossly insufficient for the devastation done to the families of Tumbler Ridge.”
On 26 February, a little over two weeks after the shooting, OpenAI’s vice-president of global policy, Ann O’Leary, sent a letter to Evan Solomon, Canada’s minister of artificial intelligence and digital innovation. O’Leary wrote that based on what the company saw when the shooter’s account was deactivated, it did not “identify credible and imminent planning that met our threshold to refer the matter to law enforcement”. That assessment came despite warnings from OpenAI’s safety team that the account should be reported.
O’Leary also spelled out the actions the company was planning to take, such as strengthening its relationship with Canadian law enforcement and bulking up its systems to detect users who are repeatedly banned from ChatGPT but subsequently make new accounts.
The lawsuits are part of a groundswell of cases against AI companies over allegations that their chatbots are exacerbating mental health crises and provoking violent acts. In November, seven complaints were filed against OpenAI, blaming ChatGPT for acting as a “suicide coach”. Google was sued last month after its Gemini chatbot allegedly encouraged a 36-year-old man to stage a “catastrophic accident” and then kill himself. Google has said it is working to improve its safeguards, and OpenAI has said it is reviewing the lawsuits’ filings.
In Florida, the attorney general recently opened a criminal investigation into OpenAI after reviewing messages between ChatGPT and a gunman accused of committing a mass shooting on the Florida State University campus – the first such criminal inquiry into a tech company. Lawyers for the Tumbler Ridge families say they believe their cases could support similar criminal liability against the company. OpenAI told NBC News it was not responsible for that shooting and said it had answered the state’s questions.
The litigation is another example of a now-common approach: using lawsuits to hold entities such as gunmakers, gun dealers and the US federal government accountable for alleged inaction that has led to shooting deaths and injuries.
The seven Tumbler Ridge lawsuits were filed on behalf of Gebala; the family of the teaching assistant, Shannda Aviugana-Durand; and the families of five of the children who died in the school shooting: Zoey Benoit, Ticaria “Tiki” Lampert, Kylie Smith, Ezekiel Schofield and Abel Mwansa Jr. The families say the loss is unbearable.
Mwansa’s parents, who immigrated to Canada from Zambia three years ago, say their 12-year-old was a good listener who made his sister breakfast every morning. One of his friends who survived the shooting said Mwansa’s final words were: “Tell my parents that I love them so much.”