Infrastructural Fascism:
The Supermarket Panopticon



PARERGON.
ISSUE #01
November 2025
tom wilson

Late techno-capital embeds power within everyday infrastructure. Control now operates through an apparatus that is both technical and totalising. Artificial intelligence, data surveillance, and algorithmic management now constitute the architecture of governance. Democracy erodes not through overt force alone but through the quiet systems that administer the world.

This transformation demands critical urgency. Across the globe, democratic institutions corrode as techno-authoritarian and neo-fascist formations consolidate. The rise of neo-Trumpism in the United States and Israel’s settler-colonial genocide in Palestine exemplify this convergence: data colonialism, militarised xenophobia, and automated surveillance fused into a single infrastructure of domination. In Australia, this logic manifests through the normalisation of colonial data collection, systemic racism, land dispossession, and the state’s complicity with war and genocide—policies that privilege control, extraction, and obedience over transparency or justice. The same infrastructures that enable imperial human rights abuses abroad organise the governance of everyday life at home.

As Hannah Arendt warned, totalitarianism does not arise solely through charismatic tyranny but through the slow, invisible consolidation of control—“the process is much less visible and slower, but not less effective” (Coeckelbergh 2022: 113–114). The global and domestic conditions now unfolding reflect precisely this quiet movement: an organisation of power through data and surveillance. The accumulation of wealth and authority endures through these infrastructures. Corporations and oligarchs extend their reach by converting information into discipline and governance. Technological control no longer merely serves capital—it has become capital’s most efficient form. As Foucault observed, once labour becomes cooperative, capital internalises supervision as its own function. Observation ceases to be a managerial activity and becomes structural to production itself. Visibility becomes power; power becomes profit.

No company more fully embodies the convergence of surveillance, capital, and governance than Palantir Technologies—“among the most secretive and understudied surveillance firms globally” (Iliadis & Acker 2022: 2). Founded in 2003 with early funding from the CIA’s venture arm In-Q-Tel, Palantir has become the paradigmatic firm of the surveillance economy: a private contractor that fuses military intelligence, corporate management, and algorithmic prediction into a single infrastructure of control.

Palantir’s software integrates vast public and private data sources into predictive systems that transform social behaviour into targets of intervention. In the United States, this has taken the form of predictive policing and immigration enforcement. Its platforms are used by police departments accused of racial profiling and by agencies such as the FBI and Immigration and Customs Enforcement (ICE) “to digitally track undocumented immigrants, including children, for deportation” (Iliadis & Acker 2022: 3). A 2019 workplace raid in Mississippi, in which 680 migrant workers were arrested, was “carried out by the unit of ICE that uses Palantir software to investigate potential targets and compile evidence against them” (Iliadis & Acker 2022: 3).

As Mijente’s investigation of Palantir’s contracts with ICE reveals, “the Trump administration’s war on migrants has been turbocharged by big tech,” arming agencies “with military-grade digital tools they need to commit atrocities along the southern border and the interior” (Mijente 2019: 2). Palantir’s Investigative Case Management (ICM) system—described internally as “mission critical” to ICE’s operations—enables agents to target and arrest family members of unaccompanied children (Mijente 2019: 3). The platform’s surveillance capacities are expansive, encompassing “Facebook, WhatsApp, and Twitter activity, as well as historical and live telephone call monitoring, SMS and multimedia text monitoring, [and] web account login information” to physically track individuals (Mijente 2019: 8–10).

While marketed as a “precision tool,” the company’s systems facilitate dragnet operations. During one ICE campaign against so-called “transnational criminal networks,” 634 of 1,416 arrests were for non-criminal immigration violations (Mijente 2019: 10). The racial dimension of this infrastructure is not incidental but structural: the digital continuation of the bureaucratic violence that powered colonial and fascist regimes. “Today, IBM’s work for Nazi Germany is universally condemned,” Mijente writes. “The same is happening now within tech firms across the country. Saying ‘never again’ means recognising historical analogies and opposing human rights abusers—and their enablers—at every step” (Mijente 2019: 5).

The parallels extend outward, linking domestic carcerality to complicit genocide. In 2024, Palantir signed an “upgraded agreement with Israel’s Ministry of Defense” to “harness [its] advanced technology in support of war-related missions” (Business & Human Rights Resource Centre 2024). The company publicly declared, “We stand with Israel … our work in the region has never been more vital, and it will continue.” Through its AI-driven targeting systems, Palantir provides the data architecture that enables the Israeli military’s algorithmic warfare programs—platforms that calculate, identify, and authorise lethal strikes with minimal human oversight. These systems have supplied “advanced and powerful targeting capabilities—the precise capabilities that allowed Israel to place three drone-fired missiles into three clearly marked aid vehicles” (Business & Human Rights Resource Centre 2024).

This is not an abstract ethical dilemma but the automation of war crimes. The capacity to detect, classify, and annihilate has been absorbed into autonomous systems: a technocratic reduction of life and death to data points. Palantir’s chief executive Alex Karp has openly admitted that “our product is used on occasion to kill people,” a statement that distils atrocity into corporate banality (Business & Human Rights Resource Centre 2024).

These infrastructures see, predict, and decide, automating the processes through which life is rendered visible and governable. Palantir materialises what Achille Mbembe, citing Eyal Weizman, identifies as the fusion of surveillance and weaponry within colonial space: “Under the conditions of late modern colonial occupation, surveillance is oriented both inwardly and outwardly, the eye acting as weapon, and vice versa… multiple separations, provisional boundaries, which relate to each other through surveillance and control” (Weizman in Mbembe 2019: 53). In Palestine, this logic manifests as algorithmic ethnic cleansing; in the United States, as automated racial profiling. Both extend the same optical regime of domination—one in which data itself becomes the architecture of power.

As Mark Coeckelbergh warns, “digital technologies offer new means for surveillance and manipulation, which may support, or lead to, totalitarianism. AI is one of these technologies... It can help to create a particular kind of surveillance and control: total surveillance and total control” (Coeckelbergh 2022: 113–114). The totalising scope of Palantir’s infrastructure realises this condition: a system in which “everyone will be fully analysed and accounted for. Their every action monitored, their every preference known, their entire life calculated and made predictable” (Bloom 2019, cited in Coeckelbergh 2022: 113). Palantir thus operationalises what Coeckelbergh calls “a digital version of Big Brother,” extending the panoptic logic of power into a planetary architecture of data (Coeckelbergh 2022: 113–114).

“AI and data science,” Coeckelbergh continues, “may thus become instruments of new forms of totalitarianism, in which AI knows us better than—and before—we do” (2022: 113–114). Palantir’s predictive analytics exemplify this dynamic: knowledge precedes event, anticipation replaces judgment, and governance becomes algorithmic pre-emption. In this regime, “the balance of power is slowly tilted into the hands of a few powerful actors—whether in government or in corporations” (Coeckelbergh 2022: 113–114).

“Technology thus risks becoming—to pick up a phrase by Arendt—one of the origins of totalitarianism” (Coeckelbergh 2022: 114). Palantir embodies this risk: an algorithmic state within the state, where the governance of populations is delegated to machines that learn, anticipate, and decide without accountability. “AI is more than a tool,” Coeckelbergh reminds us; “it is a game changer. As it is applied to the political field, it also changes that field… contributing to the creation of a totalitarian dynamic” (2022: 113–114).

Through such technologies, the architecture of domination becomes both global and banal. Palantir does not merely serve state power—it constitutes its new technical form. Its infrastructures extend the frontier of data colonialism and the reach of what can only be described as infrastructural fascism: the seamless fusion of military, corporate, and algorithmic power that transforms visibility into domination and normalises atrocity as administration.

Coles’ partnership with Palantir marks the entry of military surveillance infrastructures into the everyday governance of Australian labour. The U.S. war and intelligence contractor—whose software underpins drone targeting and deportation regimes—now powers the surveillance of Australian retail workers. The same infrastructures used to identify targets for destruction or detention are repurposed to oversee every Coles grocery store in the country.

This continuity of military, corporate, and computational power demonstrates how domination adapts rather than relocates. Surveillance capitalism and security rationalities both operate within platform governance, organising both warfare and work through the same data infrastructures. Coles embodies this formation. A corporation that has long accumulated wealth through exploitation now extends its reach through algorithmic systems of control, embedding extraction and discipline into the daily operations of labour. The partnership exposes a unified regime of visibility and management—one that links the battlefield, the marketplace, and the workplace within a shared architecture of power.

These developments extend a long pattern of structural exploitation within Australia’s supermarket duopoly. The 2025 ACCC Supermarkets Inquiry found that Coles and Woolworths operate within a highly concentrated oligopoly, holding dominant control over suppliers, workers, and consumers alike (ACCC 2025: 1). Both corporations were identified as among the most profitable supermarket businesses globally, having increased product margins for branded household and packaged goods over the last five years (ACCC 2025: 1). This profitability is sustained through monopsony power over suppliers—particularly fresh produce growers—who “lack the information or certainty they need to make efficient investment decisions” and are often forced to absorb additional costs and risk (ACCC 2025: 1).

Such economic domination is inseparable from labour exploitation. Between 2017 and 2020, Coles underpaid 8,700 workers by between A$115 million and A$250 million (FWO 2021; Reuters 2025). Earlier, a 2014 workplace agreement left around 77,000 employees worse off by cutting penalty rates (Fair Work Commission 2016; The Age 2017). In 2021, 350 warehouse workers were locked out after striking over automation (SMH 2021; The Guardian 2021), while investigations continue to expose the exploitation of migrant labour across Coles’ supply chains (FWO 2018; ABC 2019).

The 2025 ACCC report also emphasises that Coles and Woolworths’ promotional and pricing practices obscure value and manipulate consumer perception: over half of all products are sold on promotion, while complex discounting and loyalty programs make it “difficult for consumers to assess value for money” (ACCC 2025: 2). In remote areas, their monopolistic control exacerbates inequality—prices are higher, choices fewer, and the absence of competition entrenches dependence (ACCC 2025: 2).

Taken together, these findings reveal Coles not merely as a retail entity but as an infrastructural power: an actor in the economic and affective governance of everyday life. It exploits workers, disciplines suppliers, extracts data from consumers, and dominates regional markets—all under the moral guise of convenience and efficiency. Within this apparatus, surveillance partnerships such as that with Palantir do not represent a new aberration but an intensification of existing corporate logics: the technical completion of exploitation that has long defined Australia’s duopolistic grocery regime.

Through Palantir’s Foundry platform, these practices acquire new scale and precision. The system integrates billions of data points each day, consolidating them into a real-time apparatus for behavioural monitoring and so-called “operational optimisation.” Combined with AI-enabled tracking, biometric cameras, and predictive analytics (ABC News 2024; 7 News 2024), the workplace becomes an AI panopticon—an infrastructure of visibility where discipline and profitability converge. As Mark Coeckelbergh observes, “AI can be understood as contributing to all kinds of less visible panopticons… Current governance by data, or ‘algorithmic governance,’ leads to what Foucault (1977) called a ‘disciplinary society’ that works analogously to Bentham’s design for the panopticon, but now pervading all aspects of social life: the effects of disciplinary power are ‘not exercised from a single vantage point, but are mobile, multivalent and internal to the very fabric of our everyday life’” (Coeckelbergh 2022: 108–109). Palantir’s systems operationalise this diffusion of surveillance power: control is no longer centralised but distributed through code and data integration. What was once the architectural gaze of the panopticon has become algorithmic infrastructure—power embedded in the everyday circulation of information, invisible yet absolute.

The Coles–Palantir partnership embodies this post-Foucauldian panopticism: an infrastructural regime extending through logistics systems, data collection, and predictive analytics. “The perfect disciplinary apparatus would make it possible for a single gaze to see everything constantly… a perfect eye that nothing would escape” (Foucault 1975: 173–174). Within Coles’ algorithmic workplace, this gaze is realised through automated supervision—“hierarchized, continuous and functional surveillance… both absolutely indiscreet, since it is everywhere and always alert, and absolutely discreet, for it functions permanently and largely in silence” (Foucault 1975: 176).

Here, the disciplinary gaze becomes non-human. What Foucault described as the internalisation of supervision now operates through algorithmic abstraction—what Coeckelbergh calls “invisible chains and ever-watching non-human eyes.” “This negative freedom is at stake when AI technology is used for surveillance to keep people in a state of enslavement or exploitation. The technology creates invisible chains and ever-watching non-human eyes… Knowing that you are being watched all the time, or could be watched all the time, is enough to discipline you” (Coeckelbergh 2022: 113). In Coles’ data-driven management, this condition finds its infrastructural form: a labour regime governed by the anticipation of its own visibility. The machine oversees, records, and calibrates; docility becomes an operational outcome.

This is the machinery of power rendered banal through corporate language—efficiency, optimisation, logistics. It is the architecture of surveillance embedded within everyday governance, producing continuous visibility and administrative control. The normalisation of AI-accelerated “background” surveillance represents what Foucault described as “innumerable petty mechanisms” of control: unremarkable, automated, and ceaseless. “This infinitely scrupulous concern with surveillance is expressed in the architecture by innumerable petty mechanisms… minor but flawless instrumentation in the progressive objectification and partitioning of individual behaviour” (Foucault 1975: 173). “The disciplinary institutions secreted a machinery of control that functioned like a microscope of conduct” (Foucault 1975: 173). In the Coles–Palantir apparatus, this microscopic gaze expands to an industrial scale—a seamless fusion of profit, discipline, and visibility.

Coles’ partnership with Palantir crystallises this microscopy of conduct: the transformation of retail labour into dataset. Surveillance becomes ambient, seemingly harmless, but it perfects the exploitation of labour through algorithmic quantification. What appears as neutral optimisation conceals an intensified microphysics of power—a silent, perpetual calibration of bodies, time, and behaviour in service of profit.

Where earlier regimes of control relied on inspectors and discrete audits, the digital regime enacts continuous and internalised supervision—power embedded within the production process itself. “It was different from the one practised in the regimes of the manufactories… what was now needed was an intense, continuous supervision; it ran right through the labour process” (Foucault 1975: 175). “Surveillance thus becomes a decisive economic operator both as an internal part of the production machinery and as a specific mechanism in the disciplinary power” (Foucault 1975: 176).

“Everyone will be fully analysed and accounted for. Their every action monitored, their every preference known, their entire life calculated and made predictable” (Bloom 2019, cited in Coeckelbergh 2022: 113). Within such architectures, life itself becomes a data object, reduced to the variables of efficiency and profit. In this sense, Palantir’s algorithmic gaze is not merely disciplinary—it is ontological. It does not simply watch behaviour; it defines it in advance, producing subjects whose actions are already anticipated by code—“in which AI knows us better than—and before—we do” (Coeckelbergh 2022: 113–114).

The consequence of this total visibility is affective as much as structural. Continuous surveillance generates a social atmosphere of uncertainty and self-policing—an emotional infrastructure that binds people to their own subjection. Fear becomes the affective residue of algorithmic governance: the feeling that one is always watched, always measurable, always potentially in violation.

Sara Ahmed’s The Cultural Politics of Emotion (2004) describes fear as an affect that circulates between bodies, institutions, and discourses—binding subjects together through shared anxieties and imagined threats. Fear is not merely felt; it is mobilised. It organises publics around danger, produces “others” to be contained, and legitimises regimes of surveillance and control. Within contemporary Australia, this affective economy of fear has become a structural technology of governance: a circulatory system through which emotion becomes policy, and policy becomes profit.

Nowhere is this more visible than in the symbiosis between media, police, politicians, and corporate interests. Each element of this assemblage amplifies and feeds off the other, forming what might be called an affective–industrial complex. The Victoria Police Union’s recent campaign for harsher sentencing “amid record crime” (ABC News 2025) exemplifies this dynamic. Sensationalist coverage by commercial outlets—headlines warning of “rising crime” in Melbourne or a “wave of lawlessness” sweeping retail spaces (9News 2025)—produces the emotional climate in which state and corporate surveillance appear not only reasonable but necessary. Fear legitimises both policing and profit.

Retailers have capitalised on this atmosphere. Industry groups have lobbied for “urgent action to combat retail crime” (RetailBiz 2025), positioning theft as an existential threat to business rather than a symptom of systemic inequality. Coles, for instance, has amplified this narrative, claiming a “crime surge” in key states (News.com.au 2025) while simultaneously rolling out new surveillance infrastructures under the guise of worker and customer safety. The corporation’s rhetoric aligns perfectly with the state’s securitisation discourse: fear of theft becomes a moral and affective justification for intensified monitoring.

Yet this “retail crime wave” is less a crisis of morality than a crisis of survival. The cost-of-living emergency—manufactured and sustained by the same duopoly that dominates Australia’s food economy—has driven many to desperation. In this environment, theft emerges as both an act of necessity and an index of systemic collapse. Rather than addressing the exploitative pricing structures that render food unaffordable, Coles reframes the outcome as criminality, using the very conditions it helped produce to legitimise expanded surveillance and securitisation.

The cycle is self-perpetuating: economic exploitation breeds scarcity; scarcity breeds theft; theft breeds surveillance; surveillance breeds fear; and fear reinforces the authority of those who profit from it. This affective feedback loop—what Ahmed describes as the “stickiness” of fear—binds the population to its own subjugation. The public is enlisted emotionally into the expansion of power: shoppers and workers alike internalise the rhetoric of danger, consenting to more cameras, more monitoring, more control.

Fear, in this sense, is not simply psychological—it is infrastructural. It lubricates the machinery of extraction, translating emotion into compliance and compliance into data. The affective politics of fear thus functions as capital’s emotional infrastructure: it prepares the ground for economic domination by securing consent to the technologies that enforce it.

For both Foucault and Marx, surveillance operates not only as a political mechanism but as an economic one. Control becomes a productive force, converting supervision into efficiency and “surplus power” into surplus production. “The panoptic mechanism… establishes a direct proportion between ‘surplus power’ and ‘surplus production’” (Foucault 1975: 206). “Only agents, directly dependent on the owner… would be able to see that not a sou is spent uselessly, that not a moment of the day is lost” (Foucault 1975: 177). “Surveillance thus becomes a decisive economic operator… the work of directing, superintending and adjusting becomes one of the functions of capital” (Foucault 1975: 177).

Palantir realises this economy of the gaze at planetary scale: surveillance as optimisation, visibility as profit. The Coles–Palantir system converts the emotional climate of fear into quantifiable data, transforming human conduct into operational information. Labour discipline and consumer behaviour become forms of value production—every gesture recorded, every action rendered exploitable. In this fusion of affect, economics, and code, fear is no longer merely an emotion but an investment strategy: the securitisation of everyday life as a frontier of accumulation.

Shoshana Zuboff’s concept of “Big Other” names this evolution of power from supervision to subsumption: “rather than centralised state control, we face a ubiquitous networked institutional regime that records, modifies, and commodifies everyday experience... establishing new pathways to monetisation and profit” (Zuboff 2015, cited in Coeckelbergh 2022: 113–114). What begins as the digitalisation of labour extends to the datafication of life itself. Each gesture, transaction, and utterance becomes a unit of behavioural surplus, absorbed into circuits of prediction and extraction. This totalising visibility erases the boundaries between management, governance, and surveillance, rendering control indistinguishable from participation.

Zuboff identifies the rise of instrumentarian power—a regime that governs behaviour through predictive data and automated feedback. “The rise of instrumentarian power is intended as a bloodless coup… We are expected to cede our authority, relax our concerns, quiet our voices” (Zuboff 2019: 381–382). The Coles–Palantir system enacts this coup within the sphere of labour: control is no longer enforced through direct coercion but through algorithmic calibration. Workers submit not under threat of violence but through the daily rituals of efficiency, compliance, and data-driven optimisation.

Coercion gives way to calibration. “It is a future of certainty accomplished without violence. The price we pay is not with our bodies but with our freedom” (Zuboff 2019: 395). Yet this control is not purely technical—it is economic and affective. As Gilles Deleuze writes, “man is no longer a man confined but a man in debt” (Deleuze 1992: 6). Debt becomes the moral and affective infrastructure of control: the worker living paycheck to paycheck, the consumer bound by the cost-of-living crisis, the citizen owing data for access. In Palantir’s surveillance regime, participation itself becomes a form of payment. This extends Foucault’s panopticism into biopolitical indebtedness, where survival—economic or informational—requires perpetual submission to systems of monitoring and extraction.

Lauren Berlant names the affective atmosphere of this condition “crisis ordinariness,” a life “truncated, more like desperate doggy paddling than like a magnificent swim out to the horizon” (Berlant 2011: 117). Under late techno-capitalism, exhaustion becomes structural. The cruelty of this arrangement lies not in spectacular punishment but in the normalisation of precarity: the feeling of being overworked, indebted, and watched all at once. Deleuze gives its structure, Berlant its texture: together they describe an infrastructure of exhaustion—survival as subjection, life as ongoing crisis. Palantir’s analytics realise this transformation, producing a form of instrumentarian pacification: governance through optimisation that neutralises resistance by appearing technical, rational, and inevitable.

Nick Couldry and Ulises Mejias provide the structural political economy of this transformation: surveillance as the colonial appropriation of life itself—the extraction of human experience as data. “Data colonialism combines the predatory extractive practices of historical colonialism with the abstract quantification methods of computing” (Couldry & Mejias 2018: 1). What was once the conquest of land and labour is now the annexation of thought, movement, and relation to capital’s digital infrastructures. “The colonial appropriation of life in general and its annexation to capital… through the digital platform” (Couldry & Mejias 2018: 4) reveals how the subject becomes the frontier of accumulation.

Palantir’s model of total data integration reproduces this colonial logic: life rendered as resource. What Zuboff describes as behavioural surplus becomes, in Couldry and Mejias’ terms, colonial extraction—the seizure of human action as capital’s raw material. Each movement, transaction, and decision is captured, classified, and commodified. “A continuously trackable life is a dispossessed life” (Couldry & Mejias 2018: 16).

This dispossession extends beyond the sphere of labour into the geopolitical domain. Palantir’s predictive systems govern both the deportation of migrants in the United States and the targeting of Palestinians under Israeli occupation. In both contexts, the same architectures of computation—pattern recognition, behavioural prediction, spatial mapping—translate life into data, data into risk, and risk into elimination. The algorithmic identification of “threats” within ICE’s Investigative Case Management system or the IDF’s “kill chain” represents the same operational logic: the reduction of subjectivity to information, the transformation of visibility into vulnerability. Through Coles, Palantir’s technologies of domination are normalised within the infrastructures of everyday life, embedding the colonial architectures of war and carcerality into the retail floor.

Coeckelbergh situates this process within a longer history of racial and colonial domination. “The present injustices done to, say, black people in the US are not only rooted in racism… but are also the continuation of these concrete and shockingly wrong historical forms of oppression and racist practices, albeit in forms that are not formally recognised as slavery and colonialism” (Coeckelbergh 2022: 78). The algorithmic regime, he argues, extends these structures into new, less visible forms: “People’s data, labour, and, in the end, social relations are appropriated by capitalism” (Coeckelbergh 2022: 79).

Against this background, data colonialism is not metaphor but continuity. The infrastructures that mine and monetise experience replicate the historical logics of empire, transposed into code. “The AI invasion of Africa echoes colonial-era exploitation,” as Abeba Birhane notes—“the neglect of local needs and the perpetuation of historical bias amounting to the algorithmic colonisation of the continent” (Birhane 2020, cited in Coeckelbergh 2022: 79). Palantir’s extractive systems participate in this global redistribution of power: data harvested in one place, by the exploited, serves the interests of the privileged elsewhere.

Coles operates on stolen land, profiting from an economy built on dispossession, surveillance, and extraction. Its techno-capitalist infrastructure extends the settler project into the digital age—transforming colonial occupation into data form. Its stores and supply chains mark the convergence of logistical capitalism and racial governance: a duopoly that monopolises food access while economically punishing remote First Nations communities, where “Indigenous Australians pay more than double capital-city prices for everyday groceries” (The Guardian 2024; News.com.au 2024).

This structural inequity was laid bare in the death of Kumanjayi White, a 24-year-old Warlpiri man who died after being restrained by two plainclothes police officers inside a Coles supermarket in Mparntwe/Alice Springs (ABC News 2025; SBS NITV 2025). Despite public outcry and repeated requests from his family, neither Coles nor the Northern Territory Police have released the CCTV footage of the incident. The refusal to disclose this evidence reveals the ethical bankruptcy of Australia’s surveillance regime: an extensive corporate–state apparatus that records everything yet withholds accountability when lives are taken. The same surveillance networks marketed as instruments of “safety” and “efficiency” fall silent when justice demands their use.

In Alice Springs—a town long subjected to racialised policing and economic extraction—police presence inside Coles functions as the protection of capital, not community. Their role is to defend property from the very populations structurally condemned to precarity: First Nations people, the poor, the unemployed. This is the colonial logic of security in its most distilled form—violence deployed to preserve profitability.

The inquest into Kumanjayi’s death, still pending at the time of writing, exposes the entanglement of corporate infrastructure, police power, and colonial governance. Coles is expected to appear before the Northern Territory Coroner alongside NT Police, Talice Security, and NT Health. Kumanjayi, who was living with disabilities, died on stolen land, inside a supermarket owned by a corporation that profits from the dispossession of First Nations peoples. His grandfather, Warlpiri Elder Ned Jampijinpa Hargraves, told media outside court, “We have the right to know—but you’re not showing that to us, you’re not telling us, and we cannot trust you with anything” (ABC News 2025).

Coles’ surveillance infrastructure, vast and invasive, failed to deliver even the most basic form of accountability.

Coles’ complicity cannot be separated from the longer history of surveillance and control over Aboriginal life—from the ration station to the welfare card to biometric identification. In partnership with Palantir, the company extends this continuum into algorithmic form: tracking, profiling, and managing life as data. What appears as optimisation is a digital re-enactment of settler control—a techno-colonial governance that extracts both labour and visibility from those already dispossessed.

Through this lens, the Coles–Palantir alliance exemplifies what Mark Coeckelbergh calls “neo-colonialism by computation”—the fusion of surveillance capitalism with the residues of empire. Infrastructural fascism names this synthesis: a regime in which surveillance, extraction, and governance converge within the same technical substrate. What appears as logistics or optimisation conceals an imperial order of computation—an architecture through which capital governs not only production but perception itself.

On stolen land, this is not metaphor but material fact. The Coles–Palantir alliance exposes the new age of empire: power miniaturised within the systems of retail, perfected in the ordinary, and rendered invisible by its own efficiency. Yet the stakes extend far beyond the supermarket. The same infrastructures that process billions of data points from Australian workers and consumers are already weaponised elsewhere for genocide, racial profiling, and war. They underpin the targeting of Palestinians, the coordination of military violence, and the algorithmic administration of ethnic cleansing—defining the worst atrocities of the twenty-first century.

The normalisation of infrastructural fascism is therefore not a neutral advance in logistics but the domestic absorption of a global machinery of violence. To feed this panoptical system with the data of everyday life is to participate in its reproduction. It must be resisted—not only for the protection of Coles’ workers against exploitation, but for the defence of life and freedom everywhere.


Notes

Mbembe, Achille. Necropolitics. Trans. Steven Corcoran. Duke University Press, 2019.

Ahmed, Sara. The Cultural Politics of Emotion. Routledge, 2004.

Australian Competition and Consumer Commission (ACCC). Supermarkets Inquiry Summary. ACCC, Feb 2025.

Berlant, Lauren. Cruel Optimism. Duke University Press, 2011.

Business & Human Rights Resource Centre. “Palantir Allegedly Enables Israel’s AI Targeting amid Israel’s War in Gaza, Raising Concerns over War Crimes.” 2024.

Coeckelbergh, Mark. The Political Philosophy of AI. Polity Press, 2022.

Couldry, Nick & Mejias, Ulises A. “Data Colonialism: Rethinking Big Data’s Relation to Colonialism.” Television & New Media 20 (4), 2018: 336–349.

Deleuze, Gilles. “Postscript on the Societies of Control.” October 59 (1992): 3–7.

Foucault, Michel. Discipline and Punish: The Birth of the Prison [1975]. Trans. Alan Sheridan. Vintage, 1995.

Iliadis, Andrew & Acker, Amelia. “The Seer and the Seen: Surveying Palantir’s Surveillance Platform.” The Information Society 38 (5), 2022.

Mijente. Who’s Behind ICE? The Tech and Data Companies Fueling Deportations. Mijente, 2019.

Zuboff, Shoshana. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Profile Books, 2019.