
Amazon Alexa unveils new technology that can mimic voices, including the dead


Propped atop a bedside table during this week’s Amazon tech summit, an Echo Dot was asked to complete a task: “Alexa, can Grandma finish reading me ‘The Wizard of Oz’?”

Alexa’s typically cheery voice boomed from the kids-themed smart speaker with a panda design: “Okay!” Then, as the device began narrating a scene of the Cowardly Lion begging for courage, Alexa’s robotic twang was replaced by a more human-sounding narrator.

“Instead of Alexa’s voice reading the book, it’s the kid’s grandma’s voice,” Rohit Prasad, senior vice president and head scientist of Alexa artificial intelligence, excitedly explained Wednesday during a keynote speech in Las Vegas. (Amazon founder Jeff Bezos owns The Washington Post.)

The demo was the first glimpse into Alexa’s newest feature, which — though still in development — would allow the voice assistant to replicate people’s voices from short audio clips. The goal, Prasad said, is to build greater trust with users by infusing artificial intelligence with the “human attributes of empathy and affect.”

The new feature could “make [loved ones’] memories last,” Prasad said. But while the prospect of hearing a dead relative’s voice may tug at heartstrings, it also raises myriad security and ethical concerns, experts said.

“I don’t feel our world is ready for user-friendly voice-cloning technology,” Rachel Tobac, chief executive of the San Francisco-based SocialProof Security, told The Washington Post. Such technology, she added, could be used to manipulate the public through fake audio or video clips.

“If a cybercriminal can easily and credibly replicate another person’s voice with a small voice sample, they can use that voice sample to impersonate other individuals,” added Tobac, a cybersecurity expert. “That bad actor can then trick others into believing they are the person they…