Gramsci’s nightmare ⊗ A small slice of human knowledge
No.383 — The future of rules-based international order ⊗ The Agentic AI Foundation ⊗ Protecting the Ecuadorian Amazon ⊗ Iceland’s flowing textures
Gramsci’s nightmare: AI, platform power and the automation of cultural hegemony
Fantastic piece by Ethan Zuckerman, in which he argues that large language models have automated cultural hegemony in ways Antonio Gramsci never anticipated. LLMs encode the biases and worldviews of their training data—primarily content from Wikipedia, blogs, and forums created by WEIRD (Western, Educated, Industrialized, Rich, Democratic) populations. That much we know, and it’s been mentioned here numerous times, but he goes beyond bias in the data: he explains how bias also emerges from the absence of other languages, perspectives, and cultures. It’s a written version of a talk he gave, and it also includes some very clear and useful slides for better understanding how LLMs work.
These systems don’t just reflect existing cultural dominance; they amplify it. Gender stereotypes, racial biases, and Western values become embedded in mathematical relationships that shape how millions of people access information. The problem compounds as AI-generated content trains the next generation of models, creating a feedback loop that calcifies hegemonic values into technical infrastructure. Content moderation systems that disproportionately filter out African American Vernacular English and queer voices, combined with the underrepresentation of non-Western languages and perspectives, mean these tools systematically erase alternative worldviews rather than simply censoring them.
Zuckerman sees potential escape routes. The Te Hiku Media project in Aotearoa New Zealand built a Maori language model using carefully collected audio from elders, creating tools that serve the community rather than extracting from it. He imagines a future where community-curated LLMs, built around Maori, Malagasy, Indonesian, and other marginalized knowledge systems, exist in dialogue with dominant models. This approach might produce less brittle, more creative AI than systems trained exclusively on WEIRD data. The concept mirrors Gramsci’s call for “organic intellectuals” who advance working-class values rather than unconsciously replicating elite perspectives. Valuing diverse cultural data and supporting communities in building their own AI systems offers a path away from a future where algorithmic systems permanently lock existing power structures into place.
More → See also Abundant Intelligences’ pods, one of which is in Aotearoa New Zealand and does work in line with Te Hiku Media’s.
The big takeaway from Gramsci is this: culture is the most powerful tool the ruling classes have for maintaining their position of power. Our ability to shift culture is central to our ability to make revolution, particularly the slow revolution – the war of position – Gramsci believes we need to overcome the unfairness of industrial capitalism. […]
Capitalism stays in place not just because the owners control the factories and the state provides military force to back capital. It stays in place because of cultural hegemony. Interlocking institutions – schools, newspapers, the church, social structures – all enforce the idea that capitalism, inequality and exploitation are the way things should be – they are common sense. […]
Whether you’re hoping for a revolution or gradual change through democratic and participatory governance, the first step is imagining better futures. Gramsci would argue that hegemony works to prevent that imagination, and that the calcification of hegemony into opaque technical systems threatens to make that imagining less possible. […]
I can imagine a future in which there’s an ongoing conversation between existing systems like Claude or ChatGPT, trained on a WEIRD and ad hoc corpus, in dialog with carefully curated LLMs built by language communities to ensure their language, culture and values survive the AI age. […]
Valuing this data is the first step to escaping Gramsci’s nightmare. The future in which AI reinforces its own biases and locks hegemonic systems into place is a likely future, but it’s only one possible future.
A small slice of human knowledge
This Aeon essay by Deepak Varuvel Dennison shares Zuckerman’s concern about LLMs encoding Western hegemony, but approaches the problem through concrete examples of vanishing expertise rather than systemic analysis. I don’t think I’ve ever shared two pieces with as much overlap, but past the intro of this one you really see the different approaches, and each complements the other very well.
While Zuckerman examines the technical feedback loops and training data composition that calcify hegemonic values, Dennison focuses on what’s being lost: artisans in India who know how to produce biopolymers from local plants, water managers in Bengaluru whose oral knowledge remains unwritten, and the last craftsperson who knew how to make a specific limestone-based brick. He emphasizes that 97 percent of the world’s languages are classified as “low-resource” in computing contexts, despite many having millions of speakers and centuries of linguistic heritage. This isn’t abstract cultural dominance—it’s the physical erasure of embodied knowledge that shapes how communities interact with their environments.
Dennison also draws on forest ecologist Peter Wohlleben’s insight that ecosystem health depends on the presence of all its parts, even those that seem inconsequential, applying this principle to human knowledge systems. Local knowledge isn’t simply underrepresented; its absence disrupts the larger web of understanding that sustains human and ecological wellbeing. The problem isn’t just that LLMs reflect existing power structures; they put the marginalization of Indigenous and local knowledge “on steroids,” hardening centuries-old patterns of dismissal into technical infrastructure that becomes harder to challenge with each training cycle.
Examining several GenAI deployments built for non-Western populations, and observing how these AI models often miss cultural contexts, overlook local knowledge, and frequently misalign with their target community, has brought home to me just how much they encode existing biases and exclude marginalised knowledge. […]
It should not come as a surprise that a growing body of studies shows how LLMs predominantly reflect Western cultural values and epistemologies. They overrepresent certain dominant groups in their outputs, reinforce and amplify the biases held by these groups, and are more factually accurate on topics associated with North America and Europe. […]
Can we move towards this technological future while authentically engaging with the knowledge systems we’ve dismissed, with genuine curiosity beyond tokenism? Or will we keep erasing forms of understanding through the hierarchies we’ve built?
Futures, Fictions & Fabulations
- The future of rules-based international order. “Why do the foundations of global order feel less certain now than they once did? Norms that once held states in check are contested, alternative forms of alliances are growing more assertive, rapid technological change continues to alter the balance of influence and global economic ties are realigning under new pressures. What might these shifts signal about the next configuration of world order?”
- Scenarios: Crafting and using stories of the future to change the present. (Book) “Leaders need well-developed foresight because all big decisions are influenced by their story of the future, whether they are aware of it or not. The “official story of the future” is a more or less coherent, more or less conscious, more or less shared narrative about what will happen in 3 months, 6 months, a year, or five years.”
- What is fiction’s role in imagining better social policies? “Speculative fiction can be a full-service ‘laboratory of the mind’—as useful for imagining alternate social, political, and community structures as it is new gadgets and their time warps. Social science gives us a lens to understand whether speculated futures are aspirational or ominous, and to determine the values and visions we want to prioritize.”
Algorithms, Automations & Augmentations
- OpenAI, Anthropic, and Block have cofounded a new open source organization, the Agentic AI Foundation. “Anthropic is donating the Model Context Protocol to the Linux Foundation’s new Agentic AI Foundation, where it will join goose by Block and AGENTS.md by OpenAI as founding projects.”
- AI slop is spurring record requests for imaginary journals. “AI models not only point some users to false sources but also cause problems for researchers and librarians, who end up wasting their time looking for requested nonexistent records.” And: Librarians are tired of being accused of hiding secret books that were made up by AI (no offence, but if you’re arguing with a librarian because an LLM said so, you’re doing knowledge/learning/facts wrong).
- AlphaFold’s protein database has been used by three million researchers in five years. “AlphaFold has been cited in more than 35,000 scientific articles. Over 200,000 articles have used elements of AlphaFold 2 in their methodology. It has contributed to understanding heart disease, conserving bee colonies, and developing more resilient crops.”
Built, Biosphere & Breakthroughs
- Indigenous Shuar women are protecting the Ecuadorian Amazon. “Faced with the advance of climate change and deforestation, members of the Shuar community of San Luis Ininkis in Morona Santiago province have decided to sow life: they are reforesting with native species, protecting water sources and passing on their ancestral wisdom to new generations.”
- Union Terrace Gardens in Aberdeen named Scotland’s best new building. “Named Union Terrace Gardens, the project was praised by the Royal Incorporation of Architects in Scotland (RIAS) for transforming the neglected Victorian garden in Aberdeen into a ‘safe, people-centred and accessible public realm’ for the city.”
- Coffee biochar makes for lower-carbon, stronger concrete. “Researchers in Australia have been finding ways to use coffee grounds to make concrete. And now, an in-depth analysis shows that concrete made with the organic waste is not only 30% stronger, it also reduces carbon dioxide emissions by 26%.”
Asides
- “Currents of Solitude” showcase Iceland’s flowing textures. “Photographer Jan Erik Waider returns with a hauntingly beautiful exploration of Iceland’s southern coast. His new project reveals sweeping aerial perspectives that capture the quiet poetry of glacial rivers as they wind across black volcanic sand. Delicate yet distinct ribbons of yellow and blue trace their paths through the landscape, creating an abstract and painterly effect.”
- China smashes drone display world record - nearly 16,000 drones take to the sky in incredible display. “The performance earned dual Guinness World Records for drone coordination achievements. China replaced fireworks’ chemical combustion with digitally programmed light choreography.”
- Photographing the Andromeda Galaxy for 10 Seconds vs 10 Hours. “This video by Ian Lauer is an excellent accessible explanation of the basics of astrophotography as he runs through the process of how he captures a long-exposure image of the Andromeda galaxy.”