Still from the film ‘ìfé,’ used with permission.
By Pamela Adie
This post is part of Global Voices’ April 2026 Spotlight series, “Human perspectives on AI.” This series will offer insight into how AI is being used in global majority countries, how its use and implementation are affecting individual communities, what this AI experiment might mean for future generations, and more. You can support this coverage by donating here.
In 2020, not long after I released “ìfé,” a film about two women falling in love, the head of Nigeria’s National Film and Video Censors Board went on international news and said that my team and I could be arrested.
It wasn’t a vague warning. It was direct and public, delivered on CNN. For many queer filmmakers in Nigeria, it confirmed something we already knew: telling our stories can come with real consequences.
But there is another consequence that is less visible. Beyond censorship and risk, there is the question of what happens when these stories never enter the systems that increasingly shape how knowledge is produced.
As artificial intelligence (AI) becomes more embedded in how information is organised and retrieved, visibility is no longer just about the audience; it is about data. The stories that are publicly available and widely circulated are the ones most likely to be captured in the datasets that train AI systems. What remains hidden or fragmented is far less likely to be included.
In contexts like Nigeria, where queer storytelling already exists under constraint, this creates another layer of exclusion. These stories are not only difficult to tell; they are also less likely to be reflected in the systems that will shape how queer life is understood in the future.
AI systems do not distinguish between what is absent and what has been actively suppressed. They learn from available data and reproduce those patterns at scale. When queer stories are missing or underrepresented, that absence becomes part of the record.
So the question is no longer only about visibility in the present. It is about whether these stories will be legible to the technologies that are already shaping cultural memory.
Sometimes, the risks are immediate. In Lagos, police raided the workspace of Olutimileyin Kayode, the organiser of Pride in Lagos, after a report was filed. Not long after, he lost the space; the property owner decided that hosting the group was too risky. When spaces like these disappear, so do the records they might have produced — records that would otherwise contribute to how queer life is documented and, increasingly, how it is learned by machine systems.
Much of this work does not move through formal distribution channels. It circulates through private screenings, password-protected links, film festivals, and word of mouth. Some films are shown once and never released publicly. Others remain incomplete due to limited funding or are deliberately kept small to avoid attention.
These are necessary strategies for survival. But they also mean the work remains largely invisible to the data infrastructures that feed AI systems.
Distribution itself becomes a negotiation. Filmmaker Chinazaekpere Chukwu took a strategic route with her film “Ti E Nbo,” starting with international festivals. After screening at the African International Film Festival in 2023, it was rejected by other Nigerian festivals. It later reached audiences through a streaming partnership in Ghana and gained international recognition. Only then did Nigerian festivals begin to show interest.
Her experience reflects a broader pattern: local gatekeepers often respond to international validation before offering visibility at home. For filmmakers whose work does not travel globally, the path becomes even narrower, regardless of how important those stories are.
Sometimes, the most effective interventions are quiet. The film “ìfé” portrayed love between two Nigerian women in an ordinary, everyday way — without spectacle or explanation. It did not argue for acceptance; it simply existed. Audiences responded with a kind of recognition rarely seen in local media. The film gained traction online and in the press, offering a rare moment where queer love was presented as part of everyday life.
In my work with The Equality Hub, we tried to build on this through EhTv Network, a streaming platform for queer African stories outside state control. It was short-lived because of funding constraints, but it highlighted both the urgency and the difficulty of building independent platforms for these narratives. We are now reimagining it as an archive and discovery platform.
This kind of work matters because AI systems are trained on large volumes of publicly available, digitised content. In theory, this creates a broad representation of the world. In practice, it reflects existing inequalities in what gets seen and preserved.
For queer storytellers in Nigeria, many narratives never reach that level of visibility. They exist in private archives, limited screenings, or fragmented forms. What enters the public domain is often only a fraction of what exists.
When AI systems are trained on incomplete records, they do not just reflect those gaps — they reinforce them. A future user asking about queer life in Nigeria may encounter a version of reality that is partial or missing key perspectives, not because these stories do not exist, but because they were never widely captured. In this way, existing forms of marginalisation are carried forward into new systems. What is missing from the dataset becomes missing from the narrative.
Storytelling, then, takes on another layer of importance. In Nigeria, it has long been a way to resist social erasure. In the context of AI, it also becomes a way to ensure these stories are documented and accessible in ways that can shape future knowledge.
Seen this way, queer storytelling is not only creative work — it is also infrastructural. Each story contributes to what can be known, by whom, and increasingly, by what systems. The challenge is not only to tell these stories, but to ensure they are preserved and made visible in ways that resist both social and algorithmic erasure.
AI will not recognise the difference between silence and suppression. It will only learn from what is available.
And in a world where so much is filtered, what remains unseen today risks becoming what is remembered tomorrow. So the question is not whether these stories matter.
It is whether they will be visible enough to shape what the future remembers.