AI meeting tools might be time savers, but beware their risks: privacy experts


Someone may be listening in on your next meeting — and it’s not your micromanaging boss a few desks over, nor your spouse or kids across the room.

Artificial intelligence-based tools like Fireflies.ai, Otter.ai, Trint and Fathom are increasingly being used to record, transcribe and pump out summaries of meetings.

Sometimes their presence is overt — another participant panel on a video conference screen or a flashing recording indicator in the corner — but other times, users have them quietly running in the background.

The wide gamut of tools and lack of transparency sometimes surrounding them are contributing to a privacy minefield, experts say.

“We’re sort of entering this phase where I don’t think you can go a week without hearing a news story around something that’s gone wrong with AI or some data breach somewhere,” said Nicolas Joubert, a Winnipeg-based partner at law firm MLT Aikins.

Still, AI notetakers, transcription and summary tools have exploded in popularity in recent years with boards, doctors and young professionals becoming some of the most fervent adopters.

They like the technology because it takes some of the drudgery out of meetings, allows participants to focus more on the conversation at hand and, later, reduces the time it takes to recap what happened.

Kael Campbell, president of Red Seal Recruiting Solutions Ltd., said his Victoria, B.C.-based hiring firm has used interviewing platform HoneIt for about four years. He likes that the transcripts it produces are often more comprehensive than his own notes.

“I would not make full verbatim notes and now, we have full transcripts, so if I had a client ask about very specific stuff, I was able to go back,” he said.

But the tools have just as many risks as perks. Experts say they often create a huge volume of data — sometimes riddled with mistakes and personal information — and have a whole host of privacy issues.

The risks begin with what gets recorded. These tools don’t know the difference between the typical small talk about the weather, hobbies or politics that punctuates meetings and the actual substance of a conversation. Thus, they capture and summarize unnecessary but often very personal details, Joubert said.

They also can’t tell when a meeting goes “in camera” — a term used to describe private discussions, like some board and executive meetings or portions of court proceedings, that should not be taped — and keep recording.

“All of a sudden, all those in-camera discussions have just been distributed to the entire meeting mailing list and that’s obviously a huge problem,” said Teresa Scassa, Canada Research Chair in information law and policy at the University of Ottawa.

Sometimes the notes aren’t accurate either. When the sound isn’t great or systems aren’t familiar with words, the tools guess what was said, rather than noting that part of the meeting was inaudible.

They can also hallucinate — a term used to describe when AI systems generate material that is untrue.

The potential problems stretch beyond the performance of these tools.

Data captured by these tools can be used to train new AI models, and recordings and transcriptions are often stored in the cloud, making them vulnerable to leaks and breaches.

“Is it storing data outside of the country? Is it processing data and selling it to third parties, on a non-aggregated, identified basis? Is it using your data to train other models? Do other parties or customers or subscribers or end users get access to your data when it is incorporated in the system?” said Joubert.

“There’s all sorts of questions you have to ask yourself and get comfortable with the answers before you should really think about implementing these systems.”

When something goes wrong, it can be news to meeting participants that a recording was even made. In some cases, they only find out when their information turns up in an email blast or gets posted online, said Scassa.

She points to a September 2024 example at an Ontario hospital.

A letter the province’s information and privacy commissioner posted online last year highlighted a virtual meeting between doctors at the unnamed hospital to discuss seven patients.

One of the invited doctors had left the hospital in June 2023. Because he was using his personal email for the meeting and had installed Otter.ai on his personal device, the transcription service was able to access the meeting and record it without any attendees noticing.

The inadvertent recording was discovered later when Otter.ai automatically emailed a meeting summary and transcript of the conversation about patients to 65 invitees.

The hospital reported the breach to Ontario’s privacy commissioner, demanded recipients delete the missive and updated its policies around the tools.

Joubert and Scassa agree what happened at the hospital should be a lesson for anyone using AI meeting tools.

They recommend users review their settings carefully so they’re not inadvertently recording and distributing material, and say that whenever someone uses an AI tool, they should notify the other participants and take any of their qualms into consideration.

Red Seal Recruiting Solutions’ clients aren’t bothered by AI tools, though some want the notes deleted afterward. If anyone objected, Campbell said, the company would revert to typing or handwriting notes, even though it would take more time.

His staff notify meeting participants whenever meeting tools are being used and verify any output for accuracy before disseminating it.

Before implementing the software, the company looked closely at the terms and verified they comply with not just Canadian legislation but also European regulations, which are the world’s strictest.

Joubert advises people and companies to pore through these agreements to understand what kind of access and rights they’re granting the software and what can go wrong.

“Perhaps not surprisingly, a lot of the big vendors will tend to want to push a lot of that risk and responsibility to the customer,” said Joubert.

While he sees the value in AI meeting tools and knows a lot of what gets said in meetings may seem innocuous, he reminds people there’s a danger whenever that info can be accessed by someone else or combined with other details from social media or data leaks.

“You probably wouldn’t want to stand on a busy street corner with a sandwich board with that information during rush hour,” he said. “It’s really no different.”

This report by The Canadian Press was first published March 1, 2026.

Tara Deschamps, The Canadian Press


