Families of Canada school shooting victims sue OpenAI over shooter’s use of ChatGPT
The families of victims of a school shooting in a Canadian Rockies town are suing artificial intelligence company OpenAI in US federal court, seeking to hold the ChatGPT maker responsible for failing to alert police to the shooter’s alarming interactions with the chatbot.
A lawsuit filed Wednesday on behalf of 12-year-old Maya Gebala, who was critically injured in the February shooting, is among the first of dozens of cases that families in Tumbler Ridge, British Columbia, plan to bring, with claims alleging wrongful death, negligence and product liability.
Plaintiffs’ attorney Jay Edelson said in an interview that decisions made by OpenAI and its CEO Sam Altman “have destroyed the town. The people are really resilient, but what happened is unimaginable.”
Altman sent a letter last week formally apologizing to the community that his company did not notify law enforcement about the shooter’s online behaviour.
Authorities have said the shooter killed her mother and 11-year-old stepbrother in their home on February 10 before opening fire at the nearby Tumbler Ridge Secondary School, killing five children and an educator before killing herself. Twenty-five people were also injured in the attack, Canada’s deadliest mass shooting in years.
The case highlights concerns about the harms posed by overly agreeable AI chatbots and what obligations the tech industry has to control them or notify authorities about planned violence by chatbot users.
This month, prosecutors investigating the deaths of two University of South Florida doctoral students said that the suspect asked ChatGPT about body disposal in the lead-up to the students’ disappearance.
In response to the lawsuit, OpenAI said in a written statement that the “events in Tumbler Ridge are a tragedy. We have a zero-tolerance policy for using our tools to assist in committing violence.”
“As we shared with Canadian officials, we have already strengthened our safeguards, including improving how ChatGPT responds to signs of distress, connecting people with local support and mental health resources, strengthening how we assess and escalate potential threats of violence, and improving detection of repeat policy violators,” the company said.