AI changing lives. D&D is cool. Surveillance. A studio for ideas. Undersea cables. Right to repair. GoT. — No.75

This week: AI changing lives. D&D is cool. Surveillance. A studio for ideas. Undersea cables. Right to repair. GoT.

A year ago: Cities need a modern-day Hanseatic League to get bargaining power on self-driving cars.


How will AI change your life?

Really, really good interview with AI Now Institute founders Kate Crawford and Meredith Whittaker, covering many of the worries and problems around AI and ethics. Notice how they always circle back to data sources and whether we are asking the right questions. Nothing massively new, but it recaps the issues and conveys the full scope of the problem, as you can tell from the large number of quotes below (!!). You might also want to watch the Screening Surveillance videos further down right after reading this; they're a perfect match.

[I]n the end, you’re talking about cultures of data production and if that data is historical, then you are importing the historical biases of the past into the tools of the future. […]

“Okay, how do we think about data construction practices? How do we think about how we represent the world and the politics of AI?” Because these systems are political, they’re not neutral. They’re not objective. They are actually made by people in rooms. And that’s why it matters who’s in the room, who’s making the system, and what types of problems they’re trying to solve. […]

[H]ow are you measuring benefit? And that’s one of the key areas I think we need to look at more closely, right? So in increasing crop yield, that might be a huge benefit, but is that coming at the expense of soil health? Is that coming at the expense of broader ecological concerns? Is that displacing communities that used to live on that land? I’m sort of making up these examples as questions you’d want to ask before you sort of claim blanket benefits from these technologies. […]

These issues are serious and they’re being taken seriously. But what we don’t see is real accountability. What we don’t see are mechanisms of oversight that actually bring the people who are most at risk of harm into the room to help shape these decisions. […]

They’re having different conversations about privacy that realize that it’s not just about individual privacy, it’s about our collective privacy. It’s the fact that, if you make a decision in a social media network, that can affect how data from all of your contacts is being extracted, as well. I think there’s an increasing level of literacy, and that’s something that’s super important. […]

The US has many similar systems that are either in place or about to be in place in the next couple of years. I’m sure you read the news that, for example, in New York, insurers have been given full permission to look at your social media to decide how to modulate your insurance rates. That sounds very similar to the sorts of things that we’re concerned about in China. […]

[🔥] I’d say this field has worshiped at the altar of the technical for the better part of 60 years, and at the expense of understanding the social and the ethical. We’re seeing the fruits of that prioritization. […]

Meredith Whittaker: I think he’s [Elon Musk] wrong across the board. I think the premise is faulty, but it is a great distraction from the very real harms of faulty, broken, imperfect, profitable systems that are being mundanely and obscurely threaded through our social and economic systems.

Kate Crawford: We’ve called this the apex predator problem, which if you’re already an apex predator and you have all the money and all the power in the world, what’s the next thing to worry about? “Oh, I know! Super-intelligent machines, that’s the next threat to me.” But if you’re not an apex predator, if you’re one of us, we’ve got real problems with the systems that are already deployed, so maybe let’s focus on that.

Related: Hey Google, sorry you lost your ethics council, so we made one for you.

Why the Cool Kids Are Playing Dungeons & Dragons

The de-geeking and mainstreaming of D&D seems to be complete, and Annalee Newitz has an excellent take on why she's playing again, partially in response to the awfulness of social media. (Also, there's a bunch of "nerdy-ass voice actors" playing D&D on Twitch; some of their videos on YouTube have 10 million views, and they raised $12 million on Kickstarter for an animated series. 🤯)

“Yes, I’m going to get together with people face-to-face, without any hearting or retweeting, and we’re going to eat chips and fight those damn cultists who are trying to resurrect the evil, five-headed dragon queen Tiamat.” […]

But D&D isn’t only about inventing a more badass version of myself, with wings and magic powers instead of sneakers and a laptop. I was also drawn to the idea of building a social group whose baseline assumption was that we’d see one another regularly. There’s a sense of purpose to the gathering.

Screening Surveillance

Three short films raising important issues of surveillance, set a few months (days? minutes? arrived?) in the future(s). Since this is an academic project, I also found it quite interesting that they provide facilitator guides and media packs for use in class and to publicize events people might hold to discuss the stories and issues. Quite secondary to their message but I loved the way they represent screen interactions and would love for my voice assistant to be named Zola 🙃.

In all aspects of life, personal information is collected and analyzed by organizations that produce various outcomes—surveillance is not simply good or bad, helpful or harmful, but it is never neutral. These three short films were created to raise awareness about how large organizations use data and how these practices affect life chances and choices. We need to consider these implications, and critically examine the logics and practices within big data systems that underpin, enable, and accelerate surveillance.

Muse: designing a studio for ideas

I’m always writing about, or alluding to, the “crazy” or “red string” walls for thinking through things, so this concept for an app that would be a “studio for ideas” appeals to me a lot, and I quite like the proposed implementation. I haven’t spent much time considering alternatives, so I’d be happy to read more on this and to hear your comments. (I do wish they had a longer narrated demo in addition to all these short snippets.)

Creativity is about making connections. This seems to demand a freeform, fluid space where creative fodder can be mixed together and sorted to the user’s liking. So why are freeform environments so rare in digital workspaces? […]

The studio is a place to collect raw material as input to your thinking. This means everything together, with no media silos. If you get a critical mass of documents into the studio, connections will naturally form. These connections produce new ideas that can be captured in the studio—a virtuous cycle producing yet more fodder for generation of future ideas.

The future of undersea Internet cables: Are big tech companies forming a cartel?

The GAFA are spreading their influence ever further, building their moats by laying down their own cables.

In the last five years, the number of cables that are partly owned by Google, Facebook, Microsoft and Amazon has risen eight-fold, and there are more such cables in the pipeline. These content providers also consume over 50% of all international bandwidth, and TeleGeography projects that by 2027 they could consume over 80%. […]

Experts on antitrust argue that in the age of big tech, consumer welfare should not be the only factor taken into consideration while identifying antitrust behaviour. They suggest that any company stifling competition or making a ‘transaction motivated at the time by avoidance of competition is a good candidate for divestiture after the fact.’ Does the foray by these content providers into undersea cables classify as antitrust behaviour and should there be regulations to prevent this from happening?

Miscellany