Minister ‘disappointed’ in OpenAI, but why is AI regulation taking years?


Federal ministers who met with representatives of OpenAI expressed disappointment Wednesday that the company did not present steps it will take to improve its safety measures — including when police are warned of a user’s online behaviour.

Experts in the field, however, are questioning why the federal government did not move to regulate artificial intelligence before concerns were raised this month following the Tumbler Ridge, B.C., mass shooting.

Artificial Intelligence Minister Evan Solomon said he is giving the company a chance to update him in the coming days on “concrete” actions before he and other ministers address the issue through legislation, though he noted a series of bills addressing AI safety and privacy are in the works.

“Look, we told this company we want to see some hard proposals, some concrete action,” Solomon told reporters in Ottawa while heading into a Liberal caucus meeting.


“We’re disappointed that by the time they came here, they did not have something more concrete to offer, but we’ll see very shortly what they have,” he added, noting that “all options” were on the table for how the government might act.

Solomon summoned representatives of the company behind ChatGPT to Ottawa after it emerged that the shooter who killed eight people in Tumbler Ridge on Feb. 10 was flagged internally last June for her activity on the AI chatbot.

OpenAI did not alert the RCMP until after the mass shooting occurred, saying the “violent” activity did not meet the internal threshold of an “imminent” threat when the account was flagged and banned over seven months prior.


[Video: AI concerns following Tumbler Ridge shooting]


Justice Minister Sean Fraser, Public Safety Minister Gary Anandasangaree and Culture and Identity Minister Marc Miller — whose ministry is working on new online harms legislation — were also present at the meeting.


Prime Minister Mark Carney told reporters Wednesday he had not yet been briefed on the OpenAI meeting, but suggested he would be open to changes.

“I sat with the families of Tumbler Ridge, met with the first responders, saw the horror that — what happened and the pain that’s been caused,” he said.


“Obviously, anything that anyone could have done to prevent that tragedy or future tragedies must be done. We will fully explore it to the full lengths of the law and we’ll be very transparent about that process.”


Solomon and other ministers who were at the meeting said any action the government takes would focus on the threshold used to escalate concerning behaviour to law enforcement.

“There are issues around the assessment on credibility of a threat and the imminence of a threat that, in my view, if properly administered, could prevent tragedies on a go-forward basis,” Fraser said.

“The message that we delivered, in no uncertain terms, was that we have an expectation that there are going to be changes implemented, and if they’re not forthcoming very quickly, the government is going to be making changes.”

OpenAI told Global News Tuesday evening that the company appreciated the “frank discussion on how to prevent tragedies like this in the future.”


“Over the past several months, we have taken steps to strengthen our safeguards and made changes to our law enforcement referral protocol for cases involving violent activities, but the ministers underscored that Canadians expect continued concrete action and we heard that message loud and clear,” a spokesperson said.

“We’ve committed to follow up in the coming days with an update on additional steps we’re taking, as we continue to support law enforcement and work with the government on strengthening AI safety for all Canadians.”

OpenAI did not detail exactly what changes have been made in recent months, and did not immediately respond to Global News’ request for comment Wednesday.

Why aren’t any new rules in place?

Researchers who study online harms and AI say the Tumbler Ridge incident shows the AI industry shouldn’t be left to regulate itself, and that the government needs to be more proactive.


“The ministers ought to be looking at themselves as the ones who are responsible for undertaking regulation seriously when it comes to ChatGPT and other similar tools,” said Jennifer Raso, an assistant professor in law at McGill University.

“Pulling people up to Ottawa after one of the most horrible mass shootings in Canada to have them account for themselves after the harm’s been done seems to be too little, too late.”

Efforts to regulate the AI industry and address online harms through legislation died in Parliament last year ahead of the federal election.

The Artificial Intelligence and Data Act would have required AI companies to ensure their platforms are monitored for safety concerns and misuse, while enacting “proactive” measures to prevent real-world harm.


[Video: OpenAI summoned to Ottawa over Tumbler Ridge shooting]


Solomon has promised to unveil a new federal AI strategy in the first quarter of this year, delaying its launch from late 2025.


In a speech last year, he said Ottawa would avoid “over-indexing on warnings and regulation,” reflecting the Carney government’s emphasis on AI’s economic benefits and speedy adoption of the technology.

A summary of public comments submitted during consultation on the forthcoming strategy showed Canadians are deeply skeptical of AI and want to see government regulation, particularly addressing online harms and mental health concerns.

While allies like the United Kingdom and European Union have moved to strengthen AI regulation, attempts to do so in the U.S. have been sporadic. U.S. President Donald Trump has ordered states not to pass regulations before a national strategy is in place, but that federal standard has yet to emerge.

Canada’s privacy legislation says private companies “may” — not must — disclose personal information to authorities or another organization if they believe there is a risk of significant harm or that a law will be broken.

Any further decision-making is left to the company itself, leading to internal thresholds like OpenAI’s “imminent” threat standard.

Solomon said Wednesday that work is underway to update the Personal Information Protection and Electronic Documents Act, but did not say when it will be tabled or offer further details.

Anandasangaree expressed confidence that the investigation into the shooting will yield answers, including from OpenAI.


“The number of issues arising around Tumbler Ridge concern me,” he told reporters after Wednesday’s caucus meeting.

“Yesterday’s meeting was a critical first step with OpenAI. There’s still a lot of unanswered questions, and there’s certainly a sense of frustration and, frankly, a sense that tech companies overall are not doing enough to address the issues around information that they hold.”

Solomon emphasized that the government wants to make sure what happened in Tumbler Ridge “does not happen again.”

“Of course a failure occurred here,” he said. “I mean, look what happened.”

—with files from Global’s Touria Izri


