Otter.ai has 25 million users and has processed over 1 billion meetings. It's one of the most popular transcription tools in the world. But in August 2025, a federal class action lawsuit put Otter.ai's privacy practices under a microscope — and what it revealed should concern anyone who's ever used the service.
If you've been wondering "is Otter.ai safe?" — this article breaks down exactly what happens to your data, the details of the lawsuit, and what alternatives exist for people who want their meetings to stay private.
What Does Otter.ai Do With Your Data?
Before we get into the lawsuit, let's understand how Otter.ai actually works — because evaluating Otter.ai's data security starts with understanding the data flow.
Your Audio Goes to Otter's Servers
When you use Otter.ai, here's what happens:
- You join a Zoom, Google Meet, or Microsoft Teams call
- Otter records the audio and sends it to their cloud servers
- Their AI processes the audio into text on those servers
- Your transcript is stored on Otter's infrastructure
- Your data may be used to train and improve Otter's AI models
That last point is the critical one. Otter's own privacy policy states that users grant Otter and third parties permission to use private conversations "for training and product improvement purposes" — via a checkbox that many users overlook.
Otter Joins Meetings Without Everyone's Consent
Here's where it gets uncomfortable. If someone in your meeting has an Otter account linked to their calendar, Otter's "Notetaker" bot can automatically join the meeting. The bot typically asks the host for permission to record — but it does not ask all the other participants.
That means you could be in a confidential meeting right now, being recorded by Otter, without knowing it. Your words are being transcribed, uploaded to Otter's servers, and potentially used to train AI models — and nobody asked you.
Third-Party Data Sharing
Otter.ai shares user data with third parties. This raised serious concerns when Politico's China correspondent, who had used Otter to interview a Uyghur human rights activist, reported discovering the practice. It creates the theoretical possibility that a foreign government could attempt to access raw transcripts of conversations with dissidents.
Otter has said it does not share data with foreign governments. But the structural risk remains: once your data is on someone else's servers, you no longer control who accesses it.
The Otter.ai Class Action Lawsuit (2025)
In August 2025, NPR reported on a federal lawsuit seeking class-action status against Otter.ai. The case was filed in the U.S. District Court for the Northern District of California.
What the Lawsuit Alleges
The plaintiff, Justin Brewer of San Jacinto, California, alleges that Otter.ai:
- "Deceptively and surreptitiously" recorded private conversations without proper permission from participants
- Used those recordings to train its AI without informed consent from the people being recorded
- Failed to alert meeting participants that their recordings would be shared with Otter to improve its artificial intelligence systems
- Violated state and federal privacy and wiretap laws through these covert recording practices
Brewer alleges his privacy was "severely invaded" upon realizing Otter was secretly recording a confidential conversation. The suit seeks to represent others in California who have had conversations unknowingly shared with Otter.
This Isn't an Isolated Incident
The lawsuit didn't come out of nowhere. Users have been raising Otter.ai privacy concerns for years:
- The investor meeting disaster (2024): An AI researcher reported that Otter recorded a Zoom meeting with investors and sent him a transcription that included "intimate, confidential details" discussed after he had left the meeting. Those details killed the deal, as reported by The Washington Post.
- Automatic meeting recording: Reddit users have complained about Otter joining meetings automatically when linked to workplace calendars, recording conversations without consent.
- The journalist concern: Politico's China correspondent raised alarm about the implications of Otter's data sharing for journalists working with vulnerable sources.
What This Means for You
If you've used Otter.ai — or if anyone in your meetings has — your conversations may have been:
- Recorded without your explicit consent
- Uploaded to Otter's cloud servers
- Used to train AI models
- Potentially accessible to third parties
Even if Otter argues that users consented via a checkbox in their privacy policy, the people on the other end of those meetings never consented to anything.
The Deeper Problem: Cloud Transcription and Privacy
The Otter.ai lawsuit isn't just about one company behaving badly. It highlights a fundamental problem with cloud-based transcription: your audio must leave your device.
Every cloud transcription service — Otter.ai, Fireflies.ai, Rev, Trint — requires your audio to travel over the internet to remote servers. Once it's there:
- You're trusting their security practices
- You're trusting their data retention policies
- You're trusting their employees won't access it
- You're trusting they won't be breached
- You're trusting they won't change their terms of service
That's a lot of trust for a meeting about your company's financial projections or your client's legal case.
Who's Most at Risk?
- Lawyers: Attorney-client privilege means nothing if your meeting audio lives on a third party's server that can be subpoenaed
- Healthcare professionals: HIPAA compliance becomes a nightmare when patient information is processed in the cloud
- Journalists: Source protection is compromised when recordings exist on servers subject to legal process
- Business leaders: NDA-covered discussions, M&A talks, HR conversations — all vulnerable
- Everyone: Your private conversations deserve to stay private, period
The Alternative: Transcription That Never Leaves Your Device
The simplest way to eliminate cloud privacy risks is to remove the cloud entirely. Offline transcription apps process your audio directly on your phone or laptop — nothing is ever uploaded, transmitted, or stored on external servers.
How Offline Transcription Works
Modern AI models like OpenAI's Whisper have been optimized to run on mobile devices. Your phone's neural processing chip handles the entire transcription locally. The result:
- Audio stays on your device
- No internet connection required
- No servers to breach
- No data to subpoena
- No training on your conversations
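To make the distinction concrete, here is a minimal sketch of fully local transcription using the open-source `openai-whisper` command-line tool. This is an illustration of the general on-device approach, not Viska's actual implementation; the file name and model size are examples.

```shell
# One-time setup: install the open-source Whisper CLI.
# Assumes Python 3 and ffmpeg are already installed on this machine.
pip install -U openai-whisper

# Transcribe a recording entirely on this machine.
# "base" is a small example model; larger models are more accurate but slower.
# The audio file and the resulting transcript never touch a network.
whisper meeting.wav --model base --language en --output_format txt

# The transcript is written next to the audio as plain text:
cat meeting.txt
```

Because everything above runs locally, there is no server-side copy of the audio or transcript to breach, subpoena, or train on.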
Viska: The Private Otter.ai Alternative
Viska was built specifically for people who want meeting transcription without the privacy trade-offs. Here's how it compares:
| Feature | Otter.ai | Viska |
|---|---|---|
| Audio processing | Cloud servers | On your device |
| Internet required | Yes | No |
| Data used for AI training | Yes (per privacy policy) | No — impossible by design |
| Who can access your data | Otter, third parties | Only you |
| AI summaries | Yes (cloud) | Yes (on-device via Llama 3.2) |
| Price | $16.99/mo (Pro) | $6.99 once (iOS) / $4.99 once (Android) |
| Class action lawsuits | Yes (2025) | None — no data to misuse |
Viska uses OpenAI's Whisper model running locally on your phone to transcribe meetings, then uses an on-device LLM to generate summaries and action items. Your audio literally cannot leave your device — there's no server to send it to.
It's a one-time purchase ($6.99 on iOS, $4.99 on Android). No subscription. No account required. No data collection.
Download Viska →
How to Protect Your Meetings Right Now
Whether or not you switch apps, here are steps to improve your meeting privacy today:
1. Audit Your Current Tools
Check if anyone on your team uses Otter.ai or similar cloud transcription bots. Review what permissions they've granted.
2. Check Meeting Settings
In Zoom, Google Meet, and Teams, you can restrict who can record meetings and require consent from all participants.
3. Ask About Recording
At the start of any meeting, ask: "Is anyone recording this?" It's a reasonable question that most people will answer honestly.
4. Switch to Offline Transcription
For confidential meetings, use an app that processes audio locally. Viska, WhisperNotes, or other offline transcription apps eliminate the cloud risk entirely.
5. Review Privacy Policies
If you must use cloud services, actually read their privacy policies. Understand what happens to your data and whether it's used for AI training.
Frequently Asked Questions
Is Otter.ai safe to use?
Otter.ai's safety depends on your definition. The app functions well as a transcription tool, but it sends your audio to cloud servers, uses data for AI training (per its privacy policy), and was the subject of a 2025 class action lawsuit alleging it recorded conversations without proper consent. For confidential meetings, offline alternatives like Viska are objectively safer because your data never leaves your device.
What happened in the Otter.ai class action lawsuit?
In August 2025, a federal lawsuit was filed against Otter.ai in the Northern District of California, alleging the company "deceptively and surreptitiously" recorded private conversations and used them to train AI without proper consent from all participants. The suit claims violations of state and federal privacy and wiretap laws.
Does Otter.ai use my recordings to train AI?
Yes. Otter.ai's privacy policy states that users grant permission for their data to be used "for training and product improvement purposes." The 2025 lawsuit alleges that many users — and particularly non-Otter-users in the same meetings — were not properly informed of this.
What is the best private alternative to Otter.ai?
For maximum privacy, use an offline transcription app that processes audio entirely on your device. Viska offers transcription plus AI summaries, all running locally with no internet required. It's a one-time purchase with no subscription, and your data physically cannot leave your device. See our full comparison of meeting transcription apps that don't require a subscription.
Can Otter.ai record me without my permission?
If someone in your meeting has Otter integrated with their calendar, the Otter Notetaker bot can join meetings automatically. While it typically asks the meeting host for permission, it does not by default ask all other participants. This means you could be recorded without your knowledge or explicit consent.
Is Otter.ai HIPAA compliant?
Otter.ai offers a Business plan with a BAA (Business Associate Agreement) for healthcare use. However, the fundamental architecture — sending patient audio to cloud servers — creates inherent risks. Offline transcription apps eliminate the need for a BAA entirely because protected health information never leaves your device.
Keep your meetings private
Viska — Privacy-first transcription with AI summaries. One-time purchase. No subscriptions. Your audio never leaves your device.
Download Viska →