Google announced their Duplex service to a chorus of ‘wows’, but are there loads of ethical questions that the company isn’t considering?
Amidst all the talk about Artificial Intelligence, there is always the partially warranted fear about which jobs will be superseded by automation, and when.
Google Duplex is a recently announced service that can make basic phone calls for you. Using a number of different voices, it can call a hairdresser or restaurant and book an appointment or reservation. The voice, intonations and responses are uncanny, to the point that the interlocutors in the demo didn’t realize they were having a conversation with a computer. The launch of Duplex brought to light a couple of issues surrounding AI’s arrival in our lives.
First, something we’ve known for a while: ethics and privacy will need to play a big role in the development of advanced algorithms, something that has been shown time and again not to be a priority for Silicon Valley companies. Rene Ritchie, an “Apple watcher” who also thinks about technology in general, produced a fantastic short video about Google’s missed opportunities. Instead of going with a purely technical demo, they could have made a point of reflecting on the need for ethics research and influence in development, and could have addressed the responsibility they have — and hopefully take — in introducing such tools to the world. They didn’t, electing instead to use the announcement to endlessly repeat the term AI and flaunt their technical accomplishment.
The second issue relates closely to jobs being taken by bots. In the Duplex demo, as Chris Messina put it, you can see the human on the phone as an API. An Application Programming Interface (API) is a way for two computers or services to talk. Messina’s framing points out that there is a computer making the call on one side, a computer on the restaurant’s side taking the reservation, and a human — to put it bluntly — simply being used for voice recognition and data entry. That person hasn’t lost his or her job… yet, but is also being deceived and used purely as an interface.
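Messina’s framing can be made concrete with a toy sketch. If the restaurant exposed an actual reservations API, Duplex could call it directly; in the demo, the human on the phone performs exactly the role of the hypothetical `book_table` function below. Every name here (`Reservation`, `book_table`, the party-size rule) is illustrative, not a real service.

```python
from dataclasses import dataclass

@dataclass
class Reservation:
    name: str
    party_size: int
    time: str  # e.g. "19:00"

def book_table(request: Reservation) -> bool:
    """Stand-in for a restaurant's booking endpoint.

    In the Duplex demo no such endpoint exists; a human on the phone
    fills this role instead -- parsing the request (voice recognition)
    and writing it into the reservation book (data entry).
    """
    # Toy acceptance rule, purely for illustration.
    return request.party_size <= 8

confirmed = book_table(Reservation("Ada", 4, "19:00"))
print(confirmed)  # True
```

Seen this way, the person answering the phone isn’t a conversation partner so much as an interface between one computer and another.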
AI is coming and it’s disconcerting to see so few companies stepping up to their responsibilities and giving ethics and privacy the attention they deserve. So far, Apple is taking a stand, but who will follow? Who else will carefully consider the impact of what they build, not just on their bottom line but on society?
Originally written for CloudRaker Thoughts.