Note — Sep 16, 2018

Yuval Noah Harari on Why Technology Favors Tyranny

Seen in → No.49

Source → theatlantic.com/magazine/archive/2018/10/yu...

Trigger warning: Bleak af abyss gazing.

This article is an adaptation from Yuval Noah Harari’s book, 21 Lessons for the 21st Century, and it is as full of insights as it is depression-inducing. I tend to put very little credence in killer-AI scenarios, but this much more plausible version is far more worrying. I do find it strange, though, that he mentions AlphaZero “learning by itself” while also putting emphasis on the importance of massive amounts of data. It’s still unclear to me how much data AIs a few generations from now will actually need, and it’s a bit disappointing that he says nothing along those lines.

Below are some choice quotes, and below those some of the main takeaways, which are also kind of spoilers — I encourage you to read the whole thing first.

Four hours. For centuries, chess was considered one of the crowning glories of human intelligence. AlphaZero went from utter ignorance to creative mastery in four hours, without the help of any human guide.

By 2050, a useless class might emerge, the result not only of a shortage of jobs or a lack of relevant education but also of insufficient mental stamina to continue learning new skills. […]

We should instead fear AI because it will probably always obey its human masters, and never rebel. AI is a tool and a weapon unlike any other that human beings have developed; it will almost certainly allow the already powerful to consolidate their power further. […]

The main handicap of authoritarian regimes in the 20th century—the desire to concentrate all information and power in one place—may become their decisive advantage in the 21st century. […]

For starters, we need to place a much higher priority on understanding how the human mind works—particularly how our own wisdom and compassion can be cultivated. […]

So we had better call upon our scientists, our philosophers, our lawyers, and even our poets to turn their attention to this big question: How do you regulate the ownership of data?

  • People who are vital to the economy but lack power vs. people who are still somewhat vital but fear losing their power.
  • The advantage of connectivity and updatability for software vs people.
  • AIs who never disobey their human masters and never rebel (might be worse than killer AIs).
  • The West Bank is already a primitive preview of a total surveillance regime.
  • Cambridge Analytica sentiment analysis to the power of ten for behaviour control.
  • Dictatorships’ information control might be a competitive advantage.
  • We are already training ourselves to hand over control to machines.
  • We must put a higher priority on understanding our minds — investing as much in that as we are investing in AI.