Appropriate measures ⊗ Stop talking about AI ethics, time to talk about power. ⊗ Tech for superheroes and spacecraft

This week → Appropriate measures ⊗ Stop talking about AI ethics. It’s time to talk about power. ⊗ Meet the company designing tech for superheroes and spacecraft ⊗ What is going on here? ⊗ She wants your attention, she’s the voice of the city

A year ago → The most clicked link in issue No.124 was PROTOPIA, A virtual notebook of Future Dreams curated by monika bielskyte on Instagram.

Appropriate measures

At Real Life, Jackie Brown and Philippe Mesly explain that “changing the tech we use is not enough to mitigate the environmental and social harm of mass technology.” Starting from the historical perspective of “appropriate technology” in the 60s, and the works of E.F. Schumacher, Victor Papanek, and Ivan Illich, they show that even though the movement’s goals made a lot of sense, it was still centered on technology and maintained an implicit bias of Western superiority. The authors then draw a parallel with today’s degrowth movement which, for some, depends on tech while for others hinges on limiting technological development. Both end up still centered on it in some way. Brown and Mesly conclude that degrowth would do well to learn from Schumacher’s original intent and the lessons that followed: to concentrate instead on the “complex interplay of lifestyle changes, political will, and socioeconomic factors.”

In his best-known, 1973 book, Small is Beautiful: A Study of Economics as if People Mattered, he denounces the technology of mass production as “inherently violent, ecologically damaging, self-defeating in terms of non-renewable resources, and stultifying for the human person.” […]
Schumacher promoted “the technology of production by the masses” — a localist approach that tailored technologies to the needs of the communities they served, with an emphasis on long-term and harder-to-quantify goods like creative expression, skill development, and sustainability. […]
Schumacher makes no mention of why communities in the Global South found themselves in need of such solutions in the first place — namely as a result of the extractive and oppressive forces of colonialism. […]
[W]hen social change is framed primarily in terms of choice of technology, the debate necessarily centers the activity level of a productivist society, not a paradigm shift from growth to wellbeing. […]
Any technology we adopt should be both appropriate to the world as it exists and to the future we desire. In the Global North, the first part of the equation may appear more daunting than the second, given that we have become ever more reliant on technologies that are environmentally, socially, and economically unsustainable. And if previous efforts have been undermined by forces intent on maintaining the status quo, what are our prospects in light of ever greater concentrations of power?

Stop talking about AI ethics. It’s time to talk about power.

An article and interview by Karen Hao with Kate Crawford about her new book, Atlas of AI. Quite interesting because the story of the book and the work that preceded it is basically a sequence of deeper and deeper investigations. Starting from artificial intelligence and ethics, Crawford proceeds to data, to the labour that produced it, to the construction of the devices and infrastructure, the resources (and again, labour), and ultimately the power and its concentration into extraordinarily few hands.

I wanted to really open up this understanding of AI as neither artificial nor intelligent. It’s the opposite of artificial. It comes from the most material parts of the Earth’s crust and from human bodies laboring, and from all of the artifacts that we produce and say and photograph every day. Neither is it intelligent. I think there’s this great original sin in the field, where people assumed that computers are somehow like human brains and if we just train them like children, they will slowly grow into these supernatural beings. […]
We’ve spent far too much time focusing on narrow tech fixes for AI systems and always centering technical responses and technical answers. Now we have to contend with the environmental footprint of the systems. We have to contend with the very real forms of labor exploitation that have been happening in the construction of these systems. […]
In that sense, this book is trying to de-center tech and starting to ask bigger questions around: What sort of world do we want to live in?

Meet the company designing tech for superheroes and spacecraft

Fun interview at Maddie Stone’s reliably excellent The Science of Fiction, with Perception’s Jeremy Lasky. Fun for all the great visuals the firm has been making for Marvel movies and the videos showing their work and proofs of concept. Less fun for how blatantly obvious it makes the science → science fiction → fx → tech companies loop and the influence fiction has on which technologies are developed. If we knew that the founders and directors of these companies listened to the social messages and warnings of authors, it would be one thing; since we know they mostly ignore them and just go for the cool gadgets, it’s another. Still, an enjoyable look at Perception’s work!

We've worked with Apple; we've worked with SpaceX. We've been working with Ford for many years; we've worked with Mercedes, BMW, Maserati. We've gotten to work in a lot of different industries. And it's taking the inspiration from the science fiction of the films and asking ‘how can we use that to inspire real world technology companies’? […]
A thing that we're always considering [in our film work] is this idea of a 'technological climate'—what is the technological climate of our society now. What is feasible with technology? And knowing that, how do we go just a little bit beyond? If we go too far beyond it, you'll lose everybody, people won't buy it. So it's knowing how far it's okay to push before you start losing people. […]
We love to consult with scientists and engineers. We're always looking for any expert in a particular field that we're going into so we can bounce ideas. If it's not grounded in reality, then it's just a bunch of glowing blue stuff on a screen.

What is going on here?

Second book review in this issue. This one, by Hilary Cottam, about John Kay and Mervyn King’s Radical Uncertainty, is not as detailed as what I usually look for in such posts but still very much worth a share. First, for the useful overview of uncertainty vs risk, and for highlighting two prominent economists arguing for story making and story telling above over-reliance on data and models. Second, for Cottam’s important caveat: there are no women referenced! She easily outdoes the authors with some great recommendations of multiple women whose work predates the book.

Risk is likened by the authors to a puzzle. It can be solved by existing information ordered in the right way. Uncertainty is like a mystery – we are missing information and in particular we are beyond the limits of statistical reasoning. […]
We are operating in conditions of mystery where our knowledge is imperfect and variables are constantly changing. Climate, economic and social systems are not linear. […]
A reliance on data driven modelling leads large organisations in particular to make decisions ‘on the basis of what is easiest to justify rather than what is the right thing to do’.

She wants your attention, she’s the voice of the city

Another piece with a historical perspective and considerations of power dynamics. Naomi Credé looks at the voices of public address systems, service workers, especially feminised labour, gender roles, and of course voice assistants like Siri, Cortana, and Alexa.

This consistent entanglement of women, technology, voice and service labour both stems from and is entirely embedded within patriarchal and capitalist structures. […]
The overheard automated female-sounding voices do not match the vocals of those in positions of power and authority. Instead, their controlled, mechanised pitch becomes the ultimate capitalist worker, replayed over and over, it can perform its duties endlessly. […]
[R]egardless of the chosen pitch, removing or replacing them is not enough. It is the power structures and systems behind these voices which must be confronted.

Asides

Your Futures Thinking Observatory