AI colonialism is a phrase that sounds futuristic, but it’s already here. It describes the way artificial intelligence carries forward the old patterns of empire, dressed in digital clothes. For those of us old enough to remember stories of empires and colonisation, the resemblance can be unsettling.

Observers such as Meredith Whittaker, President of the Signal Foundation and co-founder of the AI Now Institute, argue that AI is not neutral. It is built on surveillance business practices, designed to extract data, concentrate profit, and shape behaviour. In short, it is power, algorithmically enforced.

The comparisons to history are stark.

Where colonisers once claimed land and resources, AI now claims data. Where labour was extracted under harsh conditions, today it is the invisible gig workers in the Global South, tagging images or moderating content for a few cents while wealth pools in Silicon Valley.

And what of knowledge itself? AI systems carry Western values and assumptions as if they were universal truths. Local cultures, diverse ways of knowing, and alternative perspectives get pushed aside. It is a digital monoculture, disguised as progress. Even the environment pays the price. AI demands energy and resources at enormous scale, yet the environmental costs are rarely borne by the companies that profit. They fall instead on already vulnerable communities, the same pattern we’ve seen before, played out in a new register.

Whittaker has been blunt in her assessment of the industry: AI is rooted in the values of surveillance and profit, and unless challenged it will reproduce inequality at scale.

For those of us over 50, this isn’t the first time we’ve seen new systems sold as progress while quietly entrenching old hierarchies. Colonialism left scars that lasted generations. AI colonialism threatens to do the same through data, algorithms, and the very definition of what counts as human knowledge. The challenge, then, is to imagine something different.

To “decolonise AI” means dismantling barriers, listening to voices outside the usual centres of power, and demanding technology that serves communities instead of exploiting them.

History tells us what happens if we don’t. But history also reminds us of our capacity to adapt, to reform, to insist on better.

AI itself is not the enemy. Like every great tool, it reflects the values we choose to build into it. If those values are fairness, inclusion, and care, then AI can be part of a future that strengthens, rather than undermines, our humanity.

For those of us who have lived long enough to see the world remade many times over, the lesson is clear. We don’t need to fear the machine. We need to hold the people shaping it accountable and insist that progress honours everyone, not just a few.

The future of AI, like every chapter of history, is still being written. And this time, we all deserve a say.
