Amazon Alexa unveils new technology that can mimic voices, including the dead

Propped atop a table during this week’s Amazon tech summit, an Echo Dot was asked to complete a task: “Alexa, can Grandma finish reading ‘The Wizard of Oz’?”

Alexa’s typically cheery voice boomed from the kid-themed smart speaker with a panda design: “Okay!” Then, as the device began narrating a scene in which the Cowardly Lion pleads for courage, Alexa’s robotic tone was replaced by a more human-sounding narrator.

“Instead of the voice of Alexa reading the book, it’s the voice of the child’s grandmother,” Rohit Prasad, Alexa’s senior vice president and chief AI scientist, enthused during a keynote in Las Vegas. (Amazon founder Jeff Bezos owns The Washington Post.)

The demo was the first look at Alexa’s newest feature, which, while still in development, would allow the voice assistant to replicate people’s voices from short audio clips. The goal, Prasad said, is to build greater trust with users by infusing artificial intelligence with the “human attributes of empathy and caring.”

The new feature could “make [loved ones’] memories last,” Prasad said. But while the prospect of hearing a dead relative’s voice can be heartwarming, it also raises a host of ethical and safety concerns, experts said.

“I don’t feel like our world is ready for easy-to-use voice cloning technology,” Rachel Tobac, chief executive of San Francisco-based SocialProof Security, told The Washington Post. Such technology, she added, could be used to manipulate the public through fake audio or video clips.

“If a cybercriminal can easily and credibly replicate another person’s voice with a small voice sample, they can use that voice sample to impersonate other people,” added Tobac, a cybersecurity expert. “That bad actor can trick others into believing they are the person they are impersonating, which can lead to fraud, data loss, account takeover and more.”

Then there is the risk of blurring the lines between what is human and what is mechanical, said Tama Leaver, a professor of Internet studies at Curtin University in Australia.

“You’re not going to remember that you’re talking to the depths of Amazon…and its data collection services if you’re talking to the voice of your grandmother or grandfather or a lost loved one.”

“In some ways, it’s like an episode of ‘Black Mirror,’” Leaver said, referring to the sci-fi series that imagines technology-driven futures.

The new Alexa feature also raises questions about consent, Leaver added, particularly for people who never imagined a robotic personal assistant would speak in their voice after they died.

“There’s a real slippery slope there of using deceased people’s data in a way that’s both creepy on the one hand, but deeply unethical on the other because they’ve never considered those traces being used in that way,” Leaver said.

Having recently lost his grandfather, Leaver said he empathized with the “temptation” of wanting to hear a loved one’s voice. But the possibility opens a floodgate of implications that society may not be ready to take on, he said. For example: Who owns the rights to the little bits that people leave in the ether of the World Wide Web?

“If my grandfather had sent me 100 messages, should I have the right to enter that into the system? And if I do, who owns it? Does Amazon then own that recording?” he asked. “Have I given up the rights to my grandfather’s voice?”

Prasad did not address such details during Wednesday’s speech. However, he posited that the ability to imitate voices showed we are “certainly living in the golden age of AI, where our dreams and science fiction are becoming reality.”

If Amazon’s demo becomes an actual feature, Leaver said people might need to start thinking about how their voices and likenesses could be used when they die.

“Do I have to think about it in my will, that I need to say: ‘My voice and pictorial history on social media is owned by my children, and they can decide whether they want to revive that in chat with me or not’?” Leaver wondered.

“That’s a weird thing to say now. But it’s probably a question we should have an answer to before Alexa starts talking like me tomorrow,” he added.
