Microsoft Teams Secretly Harvested Voiceprints from Millions of Workers
A federal class action alleges that Microsoft collected biometric voiceprints from Teams meeting participants in Illinois without notice, consent, or a privacy policy, violating the state’s landmark biometric data law.
Microsoft Teams, used by more than 320 million people and generating over $8 billion per year, has been secretly extracting biometric voiceprints from meeting participants since 2021, according to the complaint. Workers in Illinois joined calls, spoke, and had their unique voice data harvested, stored on Azure servers, and linked to their names and email addresses. Microsoft never told them. Microsoft never asked permission. Microsoft has no public policy describing how long this voice data is kept or when it is destroyed. The complaint frames this as a deliberate and ongoing violation of Illinois’s Biometric Information Privacy Act (BIPA), a law passed specifically to stop corporations from treating human bodies as data sources.
Your voice is biometric data. Microsoft took it. Demand accountability.
The Allegations: A Breakdown
| # | Allegation | Confidence |
|---|------------|------------|
| 01 | Microsoft extracted biometric voiceprints from Teams meeting participants during live transcription sessions starting in 2021, capturing each person’s unique vocal pitch, tone, and timbre as mathematical vectors comparable to fingerprints or faceprints. | high |
| 02 | Microsoft stored these voiceprints on Azure cloud servers and linked them to meeting participants’ names, profile pictures, email addresses, and organizational affiliations without consent. | high |
| 03 | At no point did Microsoft inform Teams meeting participants in writing that it was collecting biometric identifiers or biometric information during meetings where transcription was active. | high |
| 04 | Microsoft never obtained written consent or a signed release from meeting participants before capturing their voiceprints, and never satisfied the other prongs of BIPA Section 15(b): written notice of collection and disclosure of the purpose and length of storage. | high |
| 05 | Microsoft failed to publish any publicly available retention schedule or guidelines for the permanent destruction of voiceprints collected from Teams users, violating BIPA Section 15(a). | high |
| 06 | This conduct continued despite BIPA existing since 2008 and Microsoft having attorneys and compliance teams fully capable of identifying the legal obligations BIPA imposes on biometric data collectors. | high |
| # | Allegation | Confidence |
|---|------------|------------|
| 01 | Microsoft maintains a U.S. State Data Privacy Laws Notice addressing state-specific privacy regulations in California and other states, yet published no equivalent notice for Illinois or BIPA despite operating one of the most widely used platforms subject to that law. | high |
| 02 | The complaint alleges Microsoft’s omission of any Illinois-specific privacy policy is reckless, if not intentional, given the company’s size, legal sophistication, and explicit awareness of biometric privacy litigation nationwide. | high |
| 03 | The only voice-related disclosure in Microsoft’s Privacy Statement refers to optional, opt-in review of raw audio clips for AI improvement, an entirely different process that does not address voiceprint collection during live transcription. | medium |
| 04 | Microsoft provides a specific privacy policy for its corporate Azure AI Speech-to-Text clients, but extended no equivalent protections to the individual users and meeting participants whose voice data fuels the same underlying technology. | high |
| 05 | The “Privacy policy” link shown during Teams transcription sessions links to a statement that never mentions voiceprints, leaving participants with no meaningful disclosure of what is actually being collected from their bodies. | high |
| # | Allegation | Confidence |
|---|------------|------------|
| 01 | Microsoft Teams generates over $8 billion per year in revenue. The live transcription feature, which drives this biometric collection, was introduced specifically to deepen market share and competitive advantage against rival platforms. | high |
| 02 | Voiceprints are a uniquely valuable form of biometric data: unlike passwords or usernames, they are permanent and irrevocable. Microsoft extracted this irreplaceable personal data to power a commercial product without compensating or even notifying the people it harvested. | high |
| 03 | Microsoft’s diarization technology, the system that identifies who spoke and when in a transcript, is commercially valuable precisely because it works at scale across millions of meetings. The Illinois workers who powered this system gave Microsoft nothing less than their biological identity. | medium |
| 04 | Obtaining written consent would have cost Microsoft nothing financially but would have required transparency about a practice it chose to conceal, suggesting the company placed commercial convenience above its legal and ethical obligations to users. | medium |
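Diarization, the "who spoke when" system described above, can be sketched in miniature. This is an illustrative toy, not Microsoft's actual pipeline: it greedily assigns each utterance's embedding vector to the nearest known speaker by cosine similarity, or registers a new speaker. All data and the threshold are hypothetical.

```python
import math

def diarize(utterance_embeddings, threshold=0.9):
    """Greedy diarization sketch: match each utterance embedding to the
    first known speaker whose stored embedding is close enough (cosine
    similarity), otherwise register a new speaker. Returns one label
    per utterance."""
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb)

    speakers = []  # one representative embedding per discovered speaker
    labels = []
    for emb in utterance_embeddings:
        for i, rep in enumerate(speakers):
            if cos(emb, rep) >= threshold:
                labels.append(f"Speaker {i + 1}")
                break
        else:
            speakers.append(emb)
            labels.append(f"Speaker {len(speakers)}")
    return labels

# Hypothetical 2-D embeddings: two distinct voices alternating turns
meeting = [[1.0, 0.1], [0.1, 1.0], [0.98, 0.12], [0.12, 0.99]]
print(diarize(meeting))  # ['Speaker 1', 'Speaker 2', 'Speaker 1', 'Speaker 2']
```

The commercial value alleged in the complaint comes from doing this reliably at scale; the point of the sketch is only that the speaker labels are derived directly from each person's stored voice embedding.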
| # | Allegation | Confidence |
|---|------------|------------|
| 01 | Microsoft gave meeting participants no way to know their voiceprints were being collected, no way to refuse collection, and no way to request deletion, stripping them of every meaningful privacy safeguard BIPA was designed to provide. | high |
| 02 | The complaint notes Microsoft’s failure is especially inexcusable because BIPA compliance has been the subject of major litigation and public legal commentary since 2015, giving Microsoft ample opportunity to build compliant systems before launching transcription in 2021. | high |
| 03 | Microsoft cannot confirm whether voiceprints collected during Teams transcription sessions are ever permanently destroyed, or whether destruction follows the same protocols applied to its corporate Azure clients, leaving individual users in total ignorance about the fate of their biometric data. | high |
| 04 | The class may include tens of thousands of Illinois residents whose voiceprints were collected without consent from March 2021 to the present, spanning years of ongoing BIPA violations with no corrective action taken by Microsoft. | medium |
| 05 | Plaintiffs seek injunctive relief to force Microsoft into compliance with BIPA’s mandates, in addition to statutory damages of $1,000 per negligent violation and $5,000 per willful or reckless violation, meaning total liability could reach billions of dollars given the scale of the class. | medium |
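The statutory arithmetic behind the "billions of dollars" figure is easy to check. The per-violation amounts come from the statute (740 ILCS 14/20); the class size and violations-per-person below are hypothetical placeholders, not figures from the complaint.

```python
# BIPA statutory damages: $1,000 per negligent violation,
# $5,000 per willful or reckless violation (740 ILCS 14/20).
NEGLIGENT = 1_000
RECKLESS = 5_000

class_size = 50_000          # hypothetical: "tens of thousands" of Illinois residents
violations_per_person = 10   # hypothetical: Illinois's high court has held BIPA claims
                             # can accrue per collection (Cothron v. White Castle, 2023)

low = class_size * violations_per_person * NEGLIGENT
high = class_size * violations_per_person * RECKLESS
print(f"${low:,} to ${high:,}")  # $500,000,000 to $2,500,000,000
```

Even under these modest assumptions the exposure runs from hundreds of millions into the billions, which is why per-violation accrual matters so much to the scale of BIPA liability.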
| # | Allegation | Confidence |
|---|------------|------------|
| 01 | The five named plaintiffs, Illinois residents who used Teams in transcribed meetings for work or school, represent tens of thousands of individuals who participated in Microsoft Teams meetings unaware their biological voice signature was being captured and stored. | high |
| 02 | Microsoft Teams is used by workplaces, schools, government agencies, families, and friends. This means voiceprint collection extended to teachers, students, government employees, and others who had no choice but to participate in transcribed meetings. | high |
| 03 | Because voiceprints are biologically permanent, unlike a compromised password or credit card number that can be reissued, every affected participant faces a permanent risk: their biometric identity, once exposed or misused, cannot be replaced. | high |
| 04 | Non-Microsoft account holders who joined Teams meetings as guests were also subject to voiceprint capture and identification, with their voiceprints linked to the names they provided at entry and sometimes their email addresses, despite having no account relationship that could create any expectation of such surveillance. | medium |
Direct Quotes from the Legal Record
“Biometrics, however, are biologically unique to the individual; therefore, once compromised, the individual has no recourse, is at heightened risk for identity theft, and is likely to withdraw from biometric-facilitated transactions.”
This is the foundation of why BIPA exists. Microsoft, knowing this language from 17 years of statutory text and litigation, extracted permanent biometric data from millions of people without asking once.
“These voiceprints capture the distinct vocal characteristics of the individual, including for example, their specific pitch, tone, and timbre. This information, stored as a series of numerical vectors, is unique to the individual and is akin to [a] fingerprint or a faceprint.”
Microsoft was not collecting abstract audio data. It was extracting the biological equivalent of your fingerprint from your voice, silently, every time you spoke in a transcribed Teams meeting.
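The "series of numerical vectors" in the quoted complaint can be made concrete. Speaker-recognition systems in general reduce a voice to a fixed-length embedding vector and compare voices by cosine similarity; the sketch below is illustrative only, with hypothetical 4-dimensional vectors (real systems use hundreds of dimensions) and invented names, and is not Microsoft's actual implementation.

```python
import math

def cosine_similarity(a, b):
    """Compare two voice-embedding vectors; values near 1.0 mean
    the two recordings point in nearly the same direction, i.e.
    they likely came from the same voice."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings for two recordings of one speaker and one stranger
alice_monday = [0.90, 0.10, 0.40, 0.20]
alice_friday = [0.85, 0.15, 0.38, 0.22]  # same speaker, different day
bob          = [0.10, 0.80, 0.30, 0.60]

same = cosine_similarity(alice_monday, alice_friday)
diff = cosine_similarity(alice_monday, bob)
print(same > diff)  # True: a voice matches itself far better than a stranger's
```

This is why the complaint's fingerprint analogy holds: the vector is stable across recordings of the same person and distinct between people, so storing it is storing an identifier, not just audio.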
“Microsoft does not, at any time, (1) inform Microsoft Teams meeting participants that it collects, obtains, uses, or generates voiceprints or any other biometrics from them as part of its transcription process.”
Zero disclosure. Zero notice. Every worker, student, and citizen who spoke in a transcribed Teams meeting was subjected to biometric extraction without a word from Microsoft.
“In the linked Microsoft Privacy Statement, however, Microsoft does not disclose that it obtains voiceprints during the live transcription of Microsoft Teams meetings, purport to gain the user’s consent for the same, or even mention voiceprints at all.”
Microsoft’s own privacy policy, the document it offered as a substitute for real disclosure, says nothing about voiceprints. The link shown in the transcription interface led to a document that did not reflect what was actually happening.
“There is no justification for Microsoft’s failure to maintain any privacy policy specific to Illinois, Illinois residents, or BIPA. To the contrary, the omission is reckless, if not intentional.”
The complaint does not use the word “reckless” lightly. Microsoft covers California. It covers other states. Illinois, whose biometric privacy law is the most consequential in the country, got nothing.
“Even though BIPA was created in 2008 and BIPA compliance (or lack thereof) has been the subject of significant litigation and legal commentary, Microsoft continues to violate BIPA by providing live transcription services in its Microsoft Teams platform through the unauthorized collection of Teams meeting participants’ voiceprints.”
This is not an oversight from before the law existed. BIPA is 17 years old. Microsoft chose to build and operate a biometric collection system without complying with a law it had every reason and opportunity to know about.