Several attacks involving OpenAI’s chatbot—including Tumbler Ridge and FSU—raise urgent questions about the technology.

“From the outside, it looks like OpenAI had the opportunity to prevent this horrific loss of life, to prevent there from being dead children,” said BC Premier David Eby after the Journal reported on the shooter’s ChatGPT use. “I’m angry about that. I’m trying hard not to rush to judgment.” Canadian authorities demanded accountability and vowed to create new national requirements for tech companies to report threats brewing on their platforms.

OpenAI told Canadian government leaders in late February that under the company’s newly revised protocols, the shooter’s account from June 2025, if discovered today, would be flagged to law enforcement. “Mental health and behavioural experts now help us assess difficult cases, and we have made our referral criteria more flexible to account for the fact that a user may not discuss the target, means, and timing of planned violence in a ChatGPT conversation but that there may be potential risk of imminent violence,” VP of Global Policy Ann O’Leary stated in an open letter.