By AP and Reuters
Published On 29 Apr 2026

The families of victims of a school shooting in a remote Canadian Rockies town are suing artificial intelligence company OpenAI in a United States federal court, alleging that the ChatGPT maker failed to alert police to the shooter’s alarming interactions with the chatbot.

A lawsuit filed on Wednesday on behalf of 12-year-old Maya Gebala, who was critically injured in the February shooting, is among the first of more than two dozen cases from families in Tumbler Ridge, British Columbia, in what their lawyers say represents “an entire community stepping forward to hold OpenAI accountable”.

Six other lawsuits filed in a San Francisco federal court allege wrongful death claims on behalf of five children and an educator killed in Canada’s deadliest mass shooting in years.

The cases represent the families of the five slain children targeted in the school shooting: Zoey Benoit, Abel Mwansa Jr, Ticaria “Tiki” Lampert and Kylie Smith, all 12, and Ezekiel Schofield, 13, as well as education assistant Shannda Aviugana-Durand.

Jesse Van Rootselaar, whose interactions with ChatGPT are at the centre of the lawsuits, shot her mother and stepbrother at home before killing an educational assistant and five students aged 12 to 13 at her former school on February 10, according to police. Van Rootselaar, who was 18, then died by suicide. Twenty-five people were also injured in the attack.

An OpenAI spokesperson called the shooting “a tragedy” and said the company has a zero-tolerance policy for using its tools to assist in committing violence.

“As we shared with Canadian officials, we have already strengthened our safeguards, including …