Google AI Duplex: What are the ethical concerns?

11.05.18

Google surprised us all this week with the introduction of its new artificial intelligence system, Google Duplex.

Unlike Siri, Alexa and other AI assistants, the system speaks like a convincing human. What’s more, it’s designed to make phone calls on your behalf, such as booking a table at a restaurant or an appointment at a hair salon.

“I think AI Duplex is a cool technology,” UNSW Canberra artificial intelligence expert and SEIT Professor Hussein Abbass says.

“I certainly do not want to waste my time making phone calls to book a restaurant or a hairdresser.”

A conversation with Duplex flows, “ums” and “mmm-hmms” are thrown in for good measure, and it’s probable the person on the receiving end has no idea they are speaking to a machine.

And for many, this poses an ethical concern. Should Duplex introduce itself as a robot?

Professor Abbass says it should.

“The key point here is that it should identify itself,” Professor Abbass says.

“In the video, AI Duplex said it is calling on behalf of a client. The human on the other side of the line never asked it to identify itself. If AI Duplex uses a human name to gain human trust, then clearly there is an element of deception and the AI would be acting unethically.

“If AI Duplex identifies itself as an automated robot or the like, then it will be up to the human to decide if he/she wishes to continue the conversation or not.”

Professor Abbass says AI Duplex collects human voice signatures that continue to train the robot, and even its own voice seems to be a synthesis of one or more human voices.

“These humans need to be more aware of how their data are used, and they need to offer explicit consent for the collection of the data and for its intended use,” he says.

If we can trust Duplex to make our hairdresser appointments, can we also trust it to make an emergency call? Professor Abbass says this could have its pros and cons.

“A car assistant could call the ambulance in the case of an emergency like an accident, and in this case, it is a life-saving technology,” he says.

“If it gets misused, then like any technology, it is an inappropriate act, which could even be an illegal one.”

Professor Abbass says the technology has a long way to go before it is able to sustain lengthy human discussions.

“Nevertheless, this technology is an excellent example of why the technology needs to be ethically aware. If a human asks the assistant to make an inappropriate call, AI Duplex should say ‘no’!”
