
Amazon sent customer 1,700 audio files from a stranger’s Alexa

An archived data request from Amazon goes awry


An Amazon customer in Germany was surprised to receive 1,700 audio files from a stranger’s Alexa. The man had requested a copy of his archived data from the internet giant, only to be dumbfounded when he instead received 1,700 recordings of someone else’s private conversations.

The customer had requested to review his data under the European Union’s new General Data Protection Regulation (GDPR). Amazon complied, or so it thought, sending the man a download link to what it believed to be his data. The Alexa user was surprised to open some 1,700 recordings from a stranger’s household.

Speaking to German trade magazine c’t, the customer revealed his shock. “I was very surprised about that because I don’t use Amazon Alexa, let alone have an Alexa-enabled device,” he said. “So I randomly listened to some of these audio files and could not recognise any of the voices.”

The contents of the recordings were understandably intimate: the innocuous sounds of domesticity, showering, weather enquiries and music requests, which together revealed a great deal about the user’s identity.


As for Amazon, it was quick to downplay the incident. Speaking to Reuters on Thursday, a company spokesperson said, “This unfortunate case was the result of a human error and an isolated single case.”

As for damage control, it was swift and comprehensive. “We resolved the issue with the two customers involved and took measures to further optimise our processes,” said the spokesperson, adding that regulators had also been notified: “As a precautionary measure we contacted the relevant authorities.”

This isn’t the first time that commands to an Alexa have gone awry, ending up where they shouldn’t have. Last year saw a six-year-old order a $170 dollhouse without her parents’ consent. When the story made local TV news in San Diego, the broadcast command was duly registered by Alexa devices within earshot, with Amazon Echos citywide attempting to order replicas.


