A cybersecurity law expert says Canada could introduce laws requiring artificial intelligence companies to notify police of online threats, but the process would not be a simple one, since reporting every suspicion is “just not workable.”
Emily Laidlaw, a Canada Research Chair in cybersecurity law at the University of Calgary, said every AI company sets its own policy around when to inform police about what happens online and that Canada considered laws in the past but did not follow through.
The issue is under scrutiny again in the wake of the mass killings in Tumbler Ridge, B.C., by a shooter who was banned by OpenAI from its ChatGPT platform at least seven months ago.
But the firm did not inform police about the problematic behaviour of Jesse Van Rootselaar until after the Feb. 10 killings, and it has been called to Ottawa to meet with federal Artificial Intelligence Minister Evan Solomon on Tuesday to explain its safety procedures and decisions.
The company banned Van Rootselaar’s account in June, but it said the activities didn’t meet the threshold for informing law enforcement at the time because it didn’t identify credible or imminent planning. The Wall Street Journal reported Friday that Van Rootselaar’s account was banned over troubling posts, including some that described scenarios of gun violence.
Laidlaw said the federal government could introduce legislation that sets a baseline standard for when companies have to report behaviour to police, but added that “the chances are it’s not going to differ very much” from the policy OpenAI says it had in place.
She said there needs to be an assessment of how to draft legislation narrowly so that it protects the privacy of online users but requires police be informed if there is a deep concern about threats to safety.
She said it appears the company was concerned about Van Rootselaar, and while there may have been no indication the threat was imminent, it might have been concerned that there was a threat generally.
“And what we want to see is, if you have real concerns that there is a threat to the safety and security of people, even if it’s not imminent, that’s something law enforcement should investigate,” she said.
“So how do we write that as an appropriate law that doesn’t just open up the floodgates that any possible suspicion is required to be sent to the police, because that’s just not workable.”
In 2021, when the federal government was considering online safety legislation, there was a suggestion of some sort of reporting requirement to law enforcement, but after significant pushback it was not acted on, Laidlaw said.
“You can’t have every possible suspicion for any type of behaviour reported to law enforcement. They don’t have the capacity to receive all of that and it also means you start capturing a whole bunch of behaviour that isn’t necessarily problematic,” she said.
The Liberal government confirmed last month that it was working on new legislation to address online harms.
In 2024, the government introduced rules that would have required social media companies to explain how they planned to reduce the risks their platforms pose to users, and imposed on them a duty to protect children. But the bill never became law before the 2025 election was called.
Solomon said Monday that he was deeply disturbed by the reports of OpenAI’s actions and that the meeting with the company’s “safety team” would be to get an explanation of protocols and the thresholds for escalation to police.
He would not say whether the federal government intended to regulate AI chatbots like ChatGPT but added that all options are on the table.
B.C. Premier David Eby said Monday that it “looks like” OpenAI had the opportunity to prevent the Tumbler Ridge killings, and he supported a consistent national threshold for AI companies to report individuals plotting and threatening violence.
Laidlaw said trying to write some sort of reporting provision should be on the table.
“But what I don’t want is this to be viewed as just something so easy that was overlooked, because in fact it’s really hard to write this appropriately without all kinds of knock-on effects,” she said.
This report by The Canadian Press was first published Feb. 24, 2026.
Ashley Joannou, The Canadian Press