Image courtesy of Sydney Allen via Canva.
This post is part of Global Voices’ April 2026 Spotlight series, “Human perspectives on AI.” This series will offer insight into how AI is being used in global majority countries, how its use and implementation are affecting individual communities, what this AI experiment might mean for future generations, and more. You can support this coverage by donating here.
A famous aphorism attributed to science-fiction writer William Gibson observes that “the future is already here; it’s just not evenly distributed.”
This is what our Spotlights seek to track: the uneven distribution of technology, trends, attitudes, and attention around our world. Because it’s not just the future: so many things in our world depend on where you are and where you’re from — the latest technology, affordable health care, high-speed trains, international mobility, community. And it’s hard to see that unevenness past your own bubble, which depends on the language you get your news in and who you talk to, or follow, or scroll past.
As a science-fiction writer myself, I don’t equate so-called AI – the subject of our first Spotlight – with the future. As a concerned author who talks to a lot of authors, and an academic who interacts with a lot of academics, I know I have a particular perspective on the technology. I read a lot about the way it is being used in publishing, from attacks on copyright and intellectual ownership to the propagation of specific language styles. I was heartened, though not very surprised, to read the article about Australian artists and culture workers resisting AI.
But I know that I come at this from the standpoint of a native English speaker, for whom writing is relatively easy, and my own voice is an important part of my career. I also know there are other uses for the technology, which I’ve paid less attention to — many of them at least as damaging in their respective sectors.
From this Spotlight, I’ve learned more about the way AI is being used in dangerous and exclusionary ways at the US border, as well as an attempt to use it for inclusion and facilitation at the Italian border. I’ve read about the way AI is contributing to a culture of surveillance in India, and the initial hype and then cost-related disappointment around agentic AI in China. Through interviews and essays, I’ve learned more about the potential and risks of AI in language preservation efforts for minority and Indigenous languages — a question that we’ve done a lot of thinking about at Global Voices in connection with our digital inclusion work.
When technologies are unevenly distributed, it’s not just access that is uneven. The value chains and by-products of modern life are spread across the globe, with goods often manufactured in one part of the world, used and discarded in another, and those unwanted remnants shipped as waste to be interred in the landfills of a poorer country.
This Spotlight, with its global, multilingual burst of stories on the topic, reminded me that even a virtual technology like AI can have a similar pattern. Enormously valued companies and their profits are concentrated in rich countries, even if the excitement has spread globally. But those products, launched with so much fanfare from their wealthy CEOs, are supported by low-paid moderation, often in a multitude of languages. I’ve read a lot lately about how communities in the United States are fighting against the establishment of data centers; two collaborative articles in the Spotlight make clear how this dirty, loud, wasteful, environmentally harmful part of the process is also being exported to other countries — many of them low-income — and how people are resisting there too. In Latin America, some countries are racing to accelerate approvals, while communities are pushing back. In Asia, water use is a major concern, and some countries, like Singapore, have already put a moratorium on new data centers, while countries in Central Asia are ramping up — signaling the deeply uneven distribution within the region.
All of these developments are driven not only by massive capital flows, but by stories. They are driven by hype, by deception, by anthropomorphizing, by projections. The Undertones column from this Spotlight uses approaches developed through our Civic Media Observatory methodology to unpack the narratives used to promote the insertion of AI into the healthcare system in El Salvador, thereby illuminating narratives that are often used to promote AI in other contexts as well.
The broad range of uses of large language models (LLMs) and so-called AI technologies has led to important conversations about what jobs, tasks, and leisure activities can or should be mechanized, and about the importance of human intervention. Through a collaboration with the Association for Progressive Communications (APC) and GenderIT, we commissioned essays considering what our obsession with AI says about humanity, in the series “Don’t ask AI, ask a peer.” Topics range from whether AI can be ethical and feminist, to the intersection of AI and Big Agro in Brazil, to a human rights perspective, among many others.
I’ve only skimmed the surface of the articles in our April Spotlight — there are many more excellent ones. Which one speaks to you very much depends on where you’re coming from, but I’m confident there will be some angle in the series that you’re not already aware of. To pick just one relatively cheerful example, I had no idea that a Bosnian rock group had written a total bop satirizing dependence on AI — much less the historical context relevant to the music video — or that it would be exactly what I needed to sing along with to let off some AI-related frustration (try it!).
Access to all these stories, events, and issues I didn’t know about before has also highlighted another good that is unevenly distributed in our world: information. We all have access to enormous amounts of coverage about a small subset of people and topics in a few countries, and very little about anything outside that limited subset. Even with the amazing tool that is the internet, the disproportion in the way attention is allocated can make it very hard to find an audience for what you have to say, and conversely to find the stories you didn’t know you needed — about how AI is affecting the conflict in Colombia; about toolkits for civil society confronting AI; about the dismantling of the Russian internet, and more. With Global Voices, and particularly with these Spotlight series, we hope to tilt that distribution a little closer to even.