Science

Digital sovereignty meets land rights: the fight behind AI scraping

At UNPFII, Indigenous leaders warned that violence against land defenders is spilling into digital spaces—where AI can extract knowledge without consent. Misryoum reports on the push for Indigenous data sovereignty and stronger protections for Indigenous women

At the UN Permanent Forum on Indigenous Issues this week, Indigenous leaders made one point repeatedly: today’s struggle for health and sovereignty isn’t confined to forests, rivers, or coastlines—it’s moving into digital spaces where knowledge can be copied, sold, and rewritten without permission.

The 25th session of the UNPFII, themed “Ensuring Indigenous Peoples’ Health in the context of conflict,” brought communities together to confront a widening crisis. Indigenous land defenders, speakers said, are being killed, criminalized, and pushed into hiding, while parallel threats are emerging through surveillance, legal pressure, and the growing extraction of Indigenous knowledge online.

Albert K. Barume, the UN Special Rapporteur on the rights of Indigenous Peoples, described a pattern that links weak land protections to escalating violence, often accompanied by detention and criminal charges. Misryoum notes that the forum’s focus on health underscored a familiar reality for many communities: the right to live safely is tied to the right to stay on land that is recognized, protected, and governed by Indigenous nations themselves.

A central theme across the discussions was how disputes over land tenure and natural resources can harden into policy. Speakers argued that when governments or companies pursue projects without consent, Indigenous opposition is frequently treated as criminal behavior rather than as the exercise of political rights or environmental defense. In several regions, this framing helps normalize impunity and discourages reporting and accountability.

Women’s safety, in particular, was repeatedly raised as inseparable from land and conflict. Participants highlighted how displacement, climate stress, and extractive industries compound vulnerabilities, especially for Indigenous women and girls. Misryoum also observes that this is not just a human-rights issue at the level of individual violence; it reverberates through communities’ healthcare access, education pathways, and long-term wellbeing.

Alongside the physical risks, the forum placed new emphasis on digital extractivism. As generative AI expands, Ibrahim, described as a former chair of the forum, warned that AI systems can scrape traditional stories, medicinal knowledge, and cultural motifs from the internet without consent. The concern is twofold: first, that communities lose control of knowledge that carries identity and lineage; and second, that bias can be baked into the systems themselves when Indigenous people are underrepresented in training data.

Misryoum sees this as a modern echo of an older pattern. Where extractive industries have historically treated land as a commodity, AI platforms can treat cultural and linguistic knowledge as raw material to be collected, processed, and monetized, often without collective authorization. Even when the output appears “informational,” the inputs can reflect a failure to respect consent, ownership, and cultural boundaries.

To counter that, leaders discussed the push for Indigenous data sovereignty. Instead of an “open data” model that assumes information should circulate freely, Indigenous frameworks argue that collective rights must govern how knowledge is stored, accessed, and used. The forum pointed to existing examples of community-led approaches, including efforts in Aotearoa New Zealand to build language resources under Māori control.

Speakers also discussed ethical governance frameworks designed for AI-era decisions, including the CARE Principles (Collective Benefit, Authority to Control, Responsibility, and Ethics) and Canada’s OCAP model, which focuses on Ownership, Control, Access, and Possession. Misryoum highlights the key distinction behind these frameworks: they treat data as connected to governance, not as neutral content to be managed purely by technical systems.

In practice, this governance translates into who sets the rules. The forum heard that Indigenous-run programs and Māori data governance models aim to ensure data initiatives are values-led and shaped by community priorities. For many speakers, the principle isn’t merely procedural; it’s spiritual and social. One theme captured the idea that data can function as whakapapa, or lineage, meaning it carries responsibilities that cannot be reduced to a dataset.

The stakes, Indigenous advocates stressed, are rising as AI becomes embedded in services, research, and technology products. If communities lack authority over the data that represents them, they can face digital versions of the same imbalances seen offline: misrecognition, cultural distortion, and economic or political leverage held by outsiders.

The forum also addressed how systemic discrimination affects Indigenous girls and women through education and social services, including barriers to culturally appropriate schooling. For Misryoum, this connection matters: health is often treated as medical care, but at UNPFII it was described as an ecosystem of land, safety, language, education, and governance. Without those supports, risks intensify.

As the session drew attention to CEDAW recommendations specifically concerning Indigenous women and girls, participants argued that international standards only matter when implemented in ways that shift power on the ground. Some speakers urged UN mechanisms to press states to protect Indigenous women’s rights more effectively, while others emphasized the complexity of debates within communities about how colonization shapes gendered harms.

The message coming out of UNPFII was blunt: violence and criminalization are not separate problems from technology governance. Misryoum reports that the fight for health and sovereignty now includes control over knowledge itself, including who can access it, who benefits from it, and whether consent is treated as a requirement or an obstacle.