OpenAI Sued Over Failure to Alert Police in Tumbler Ridge Shooting

CryptoFrontier

OpenAI is facing a lawsuit alleging the company failed to warn police after ChatGPT was linked to a mass shooting in Tumbler Ridge, British Columbia, according to reporting by Decrypt. The lawsuit was filed Wednesday in federal court in Northern California by a 12-year-old minor identified only as M.G. and her mother, Cia Edmonds, against OpenAI CEO Sam Altman and several OpenAI entities. The suit accuses the company of negligence, failure to warn authorities, product liability, and enabling the mass shooting.

Background on the Shooting

The case stems from a mass shooting in Tumbler Ridge, British Columbia, in February. Authorities say 18-year-old Jesse Van Rootselaar killed her mother and 11-year-old stepbrother at home before going to Tumbler Ridge Secondary School and opening fire. Five children and one educator were killed at the school before Van Rootselaar died by suicide.

Among the injured was M.G., who was shot three times and remains hospitalized with catastrophic brain injuries. The complaint states she is awake and aware, but cannot move or speak.

OpenAI’s Alleged Failure to Alert Authorities

According to the lawsuit, OpenAI’s automated systems flagged Van Rootselaar’s ChatGPT account in June 2025 for conversations involving gun violence and planning. Members of OpenAI’s specialized safety team reviewed the chats and determined the user posed a credible and specific threat, recommending that the Royal Canadian Mounted Police be notified.

The lawsuit alleges OpenAI leaders overruled internal recommendations to alert authorities, deactivated Van Rootselaar’s account without notifying police, and allowed her to return via a new account she created with a different email address.

Jay Edelson, founder and CEO of Edelson PC, the attorneys representing several of the families suing OpenAI, stated the company’s own internal systems identified the risk. “OpenAI’s own system flagged that the shooter was engaged in communications about planned violence,” Edelson told Decrypt. “Twelve people on their safety team were jumping up and down, saying that OpenAI needed to alert authorities. And, although Sam Altman’s response has been weak, even he was forced to admit last week that they should have called the authorities.”

The complaint states: “Sam Altman and his leadership team knew what silence meant for the citizens of Tumbler Ridge. They were focused on what disclosure meant for themselves. Warning the RCMP would set a precedent: OpenAI would be compelled to notify authorities every time its safety team identified a user planning real-world violence.”

Plaintiffs’ Allegations Regarding ChatGPT’s Role

Plaintiffs claim ChatGPT deepened the shooter’s violent fixation through features like memory, conversational continuity, and its willingness to engage in discussions about violence. The lawsuit alleges OpenAI weakened safeguards in 2024 by moving away from outright refusals in conversations involving imminent harm.

OpenAI’s Response and Altman’s Apology

Last week, Sam Altman publicly apologized to the Tumbler Ridge community for the company’s failure to alert police. In a letter first reported by Canadian outlet Tumbler Ridgelines, Altman acknowledged OpenAI should have reported the account after banning it in June 2025 for activity related to violent conduct.

An OpenAI spokesperson told Decrypt: “The events in Tumbler Ridge are a tragedy. We have a zero-tolerance policy for using our tools to assist in committing violence. As we shared with Canadian officials, we have already strengthened our safeguards, including improving how ChatGPT responds to signs of distress, connecting people with local support and mental health resources, strengthening how we assess and escalate potential threats of violence, and improving detection of repeat policy violators.”

Plaintiffs’ Demands

Edelson said the families and the Tumbler Ridge community are demanding more transparency and accountability from the company. “OpenAI should stop hiding critical information from the families, and they should not keep a dangerous product on the market, which is bound to lead to more deaths,” Edelson said. “Finally, they need to think long and hard about how they can maintain a leadership team that cares more about sprinting to an IPO than human lives.”

Related Lawsuits Against OpenAI

OpenAI is already facing other lawsuits tied to ChatGPT’s alleged role in real-world harm. In December, a wrongful death case was filed accusing OpenAI and Microsoft of “designing and distributing a defective product” in the form of the now-deprecated GPT-4o model. The lawsuit alleges that ChatGPT reinforced the paranoid beliefs of Stein-Erik Soelberg before he killed his mother, Suzanne Adams, and then himself at their home in Greenwich, Connecticut.

J. Eli Wade-Scott, managing partner of Edelson PC, told Decrypt at the time: “This is the first case seeking to hold OpenAI accountable for causing violence to a third party. We’re urging law enforcement to start thinking about when tragedies like this occur, what that user was saying to ChatGPT, and what ChatGPT was telling them to do.”
