Jennifer Gauthier/Reuters

Good morning, everyone.

Wendy Cox is off this week. This is Mark Iype in Alberta filling in.

It’s been nearly three months since the shooting in the northern British Columbia community of Tumbler Ridge left nine people dead, including the perpetrator.

And while victims’ families continue to mourn, there remain many questions about how this tragedy unfolded and who bears responsibility for what transpired on Feb. 10.

But last week, fingers were pointed at a tech giant that some allege played a role in one of the deadliest mass shootings in modern Canadian history.

Seven lawsuits were filed in U.S. federal court in San Francisco against OpenAI and its chief executive officer, Sam Altman, on behalf of victims of the shooting. They allege the company’s negligence and the design defects of its flagship ChatGPT chatbot pushed the shooter toward violence.

The suits all also allege that the company avoided alerting police last year to the shooter’s violent interactions with its program because the tech giant would then have been forced to create an internal system for reporting other violent users to the authorities, a system that, the lawsuits allege, would expose that ChatGPT poses a threat to human life.

None of the allegations have been tested in court.

The victims represented in the suits include a 12-year-old girl who remains hospitalized after getting shot in the head, five students between the ages of 12 and 13 who were killed, and Shannda Aviugana-Durand, an educational assistant who was also killed at the school.

The families are all seeking punitive damages and the recovery of their legal costs, with those who lost loved ones also seeking pre-death economic losses.

Less than a week before the lawsuits were filed, Altman formally apologized for his company’s role in the mass shooting.

“I am deeply sorry that we did not alert law enforcement to the account that was banned in June. While I know words can never be enough, I believe an apology is necessary to recognize the harm and irreversible loss your community has suffered,” Altman wrote in a letter dated April 23, and publicly released the next day.

Jesse Van Rootselaar, 18, killed eight people in Tumbler Ridge before killing herself in the deadly attack. Before she entered the town’s school, she killed her mother and 11-year-old half-brother in their family home.

Van Rootselaar’s conversations on OpenAI’s ChatGPT platform months earlier had raised red flags within the company, but were not reported to law enforcement.

In a statement, OpenAI spokesperson Jamie Radice said the company has already strengthened its safeguards in response to the tragedy. The statement also said OpenAI has improved how it assesses potential threats of violence by its users and is getting better at rooting out people who repeatedly violate its policies.

In response to the suits, B.C. Premier David Eby said two key questions remain: What was in the shooter’s chats that concerned multiple OpenAI staffers enough that they wanted to call police, and why was the decision made not to do so?

Jay Edelson, lead counsel for the plaintiffs in the U.S. lawsuits, said he expects to eventually file more than two dozen suits against Altman and OpenAI on behalf of Tumbler Ridge victims, with these American suits superseding the Canadian lawsuit filed recently by the family of the girl recovering from her gunshot wounds. These lawsuits, he said, will be grouped together as a mass action that allows a few “bellwether” cases to be picked to proceed to trial before the others.

“We’re very eager to put [Mr. Altman] squarely on trial and put the DNA of OpenAI on trial so people understand exactly how these decisions are made, before it’s too late,” Edelson said.

This is the weekly British Columbia newsletter written by B.C. Editor Wendy Cox. If you’re reading this on the web, or it was forwarded to you from someone else, you can sign up for it and all Globe newsletters here.