As reported on Aug. 6, Zoom recently rewrote its Terms of Service with ambiguous language that would have permitted the extraction of user data for the purpose of training AI.
However, after public pushback, Zoom began revising that clause the very next day, and by Aug. 11 it had fully committed to a “no AI training” set of policies.
Even though Zoom backpedalled this time, its drive to gather data highlights the possibility of future hidden data extraction by Zoom and other big tech companies.
More specifically, as a researcher who works with Indigenous communities and their data, I am concerned about the privacy of the valuable data sets produced by Indigenous communities on Turtle Island.
Vulnerable Indigenous Knowledge
Over the past three years, Zoom calls have become a tool for organizing and activism in many Indigenous communities.
For my own work, video and voice chat let us bridge geographical distances to collaborate and share, and to reach communities that are otherwise hard to access. Discussions with queer community members of different Indigenous Nations are often private and sometimes even sacred.
These conversations have public-facing elements, but they also contain wisdom from Elders or Knowledge Keepers who are specifically trained to know what they can and cannot share in particular spaces. Some of this knowledge is sacred and is part of promoting and preserving Indigenous (and sometimes queer) ways of being.
A valuable commodity
This private information is constantly at risk of extraction from companies seeking to monetize or otherwise gain from our data.
Indigenous Knowledge represents a large gap in current big data. AI systems depend on large data sets to make their predictive technologies work.
Because these knowledges are primarily oral, it is difficult to assemble the kinds of data sets that typically come from writing. The prospect of big companies gathering audio and video data could render this oral information legible to machines.
Protecting communities
“Refusing research” has been an important concept for protecting marginalized communities from the extractive practices of researchers aiming to obtain data.
However, a conflict emerges when platforms extract data without our knowledge, or demand our consent as a condition of using a service.
The conflict becomes one of free choice versus free-to-leave: if we do not consent, we simply do not get access to that service. Access to voice and video sharing infrastructure has been a fundamental component of activism and community research, especially since COVID-19.
Can we ‘opt out’?
Can we accept or refuse to be turned into research data?
Even though there is a permissions element, organizations often gather our data in exchange for access to their services. For example, Fitbit gathers massive amounts of health data from users (with permission) that can be used to train AI.
Each individual who opts into nearly any big service is being tracked to some extent. And so, we need to think critically about what is considered private.
Likewise, Zoom has the ability to gather this data, whether or not it uses it for AI and whether or not consent is given. The anxiety is that next time the ambiguity will go unnoticed, or that consent will be forced as the price of accessing a seemingly necessary service.
As someone who studies ethical data collection and mobilization, I believe we all need to be critical of requests for access to our private data when we use these services.

In the future, will we find ourselves agreeing to give up our data just to use video platform software like Zoom? (Unsplash)
Crucial access to data
The relationship between Indigenous communities, their data and the Canadian government has always been fraught. However, after the work of the Truth and Reconciliation Commission in Canada (which concluded in 2015), it became even clearer that access to data and information is crucial to achieving justice and truth in relation to our histories.
For Indigenous peoples whose history has been systematically erased, demanding that organizations return records and data has become an important part of establishing the truth of Indian Residential School survivors’ experiences. Communities have both the desire and the need to have their data returned so that they can maintain ownership and control of their data, and manage access and permissions to that information.
The ease of communicating over Zoom
In-person collaboration between Indigenous communities can be difficult because of geographical distance, the lack of public transportation and interruptions of Indigenous sovereignty. These issues perpetuate the social and political fragmentation caused by settler colonialism, isolating communities from one another.
Information technologies like Zoom have alleviated many of these challenges, and a platform that bridges space in this way can be unifying. However, it could also become a tool that recreates the problem of data extraction in a new form.
We need to be attentive to platforms that position themselves to extract data from users in these ways.
These technological infrastructures may disproportionately harm Indigenous communities by making their private and sacred knowledges legible to AI. Data collection for AI could lead to the commodification of this sacred knowledge for profit.
Protecting this kind of data is not just the responsibility of Indigenous communities but a shared commitment that has a present and future urgency.

