Hard pass on ghost Alexa.

Amazon’s Alexa will soon speak to you in your dead grandma’s voice, if that’s a thing you want

Amazon’s Alexa is set to become the world’s most upsetting impersonator. At the company’s re:MARS conference — named as such because all of its announcements will make you want to leave the planet — Amazon showed off a new, experimental feature that will allow its popular voice assistant to copy anyone’s voice after hearing less than a minute of audio.

There are a lot of reasons this is a terrible idea, but Amazon was apparently insistent on finding the creepiest possible use case for the technology. When Rohit Prasad, Amazon’s senior vice president and head scientist for Alexa, took the stage to show off the feature, he pitched it as a way to hear the voices of your dead relatives. In the demo, Amazon showed a young boy listening to Alexa read him a bedtime story in the voice of his deceased grandmother.

Amazon really hammered home this idea that having Alexa serve as a parrot for your grandma’s voice was super touching and sweet and in no way dystopian and disturbing. Per Reuters, Prasad said the technology was designed to help “make memories last,” a response to the fact that “so many of us have lost someone we love” during the pandemic.

The whole thing is pretty fucked up for a number of reasons, not the least of which is the fact that Amazon has basically taught Alexa how to deepfake anyone’s voice. The company claims all it needs is less than a minute of someone’s recorded speech to do a serviceable impression of their voice.
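And just so we’re clear on how low the bar is: Amazon hasn’t published the details of its model, but off-the-shelf open-source tools already pull off the same trick. Here’s a rough sketch (not Amazon’s system, and every file name here is made up) using the open-source Coqui TTS library, which can clone a speaker from a short reference clip:

```python
# A minimal sketch of zero-shot voice cloning, assuming the open-source
# Coqui TTS library (`pip install TTS`). This is NOT Amazon's Alexa model;
# it just shows that this class of technique is already commodity software.
from TTS.api import TTS

# XTTS v2 is a multilingual model that clones a speaker
# from a few seconds of reference audio.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

tts.tts_to_file(
    text="Once upon a time, there was a little boy...",  # the "bedtime story"
    speaker_wav="grandma_voicemail.wav",  # hypothetical ~1-minute reference clip
    language="en",
    file_path="cloned_bedtime_story.wav",
)
```

That’s the whole program. A saved voicemail is plenty of reference audio for a model like this.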

Then there’s the simple fact that while Amazon may present this as some attempt to preserve memories of a loved one, this isn’t actually a memory of them — it’s a fake version of them. Maybe this seems like a semantic thing, but it is literally not the voice of the person you loved. It is the sound of that person’s voice, converted into a bunch of 0s and 1s and played back to you, mapped onto a machine’s dialogue trees.

That’s weird! If you want to remember someone, you can listen to an audio recording of them. What Alexa is doing is something different. It’s turning them into something else entirely, and it raises lots of questions. Can you make your loved one say something they never would? And does it actually help anyone to hear those things?

There very well might be people who find comfort in Alexa doing an impression of a lost relative’s voice, even with the tin-can affect that all robotic voices have. But it’s more troubling that this technology exists at all, and that Amazon is basically offering to store and manipulate the voices of your loved ones as if that’s some sort of altruistic feature people have been clamoring for.