Amazon has revealed an experimental Alexa feature that allows the AI assistant to imitate the voices of users’ dead family members.
The company demoed the feature at its annual re:MARS conference, showing a video in which a child asks Alexa to read a bedtime story in her dead grandmother’s voice.
“As you saw in this experience, instead of Alexa’s voice reading the book, it’s the child’s grandmother’s voice,” said Rohit Prasad, Amazon’s chief scientist for Alexa AI. Prasad introduced the clip by saying that adding “human attributes” to AI systems was increasingly important “in these times of ongoing pandemic, when so many of us have lost someone we love.”
“While AI can’t take away the pain of loss, it can definitely make your memories live on,” Prasad said.
Amazon has given no indication of whether the feature will ever be released publicly, but says its systems can learn to imitate someone’s voice from as little as a minute of recorded audio. In an age of abundant video and voice notes, that puts cloning the voice of a loved one, or anyone else, within reach of the average consumer.
Although this particular application is already controversial, with social media users calling the feature “creepy” and a “monstrosity,” such AI voice imitation has become increasingly common in recent years. These imitations are often referred to as “audio deepfakes” and are already used regularly in industries like podcasting, film and television, and video games.
Many audio recording suites, for example, offer users the option to clone individual voices from their recordings. That way, if a podcast host flubs a line, a sound engineer can fix what was said simply by typing a new script. Generating long stretches of convincing speech this way still takes a lot of work, but very small edits can be made with just a few clicks.
The same technology has also been used in movies. Last year, it was revealed that a documentary about the life of chef Anthony Bourdain, who died in 2018, used AI to clone his voice to read quotes from emails he had sent. Many fans were upset by this application of the technology, calling it “macabre” and “misleading.” Others defended it as comparable to other reconstructions commonly used in documentaries.
Amazon’s Prasad said the feature could allow customers to form “long-lasting personal relationships” with the deceased, and many people around the world are indeed already using AI for this purpose. Some have created chatbots that mimic dead loved ones, for example, by training AI on archived conversations. Adding accurate voices, or even video avatars, to these systems is entirely possible with current AI technology and is likely to go mainstream.
However, whether or not customers want their dead loved ones to be turned into digital AI puppets is another matter entirely.