‘Unlikely’ string of events sees Amazon Alexa go rogue

An American woman says she feels “invaded” after an Amazon Alexa device recorded a private conversation and sent it to a random contact without being asked.

US news outlet KIRO 7 reported that a woman, identified only as Danielle from Portland, Oregon, had been unaware of what happened until she received a phone call from her husband’s employee.

The employee said that Alexa, Amazon’s popular voice assistant, had recorded the family’s conversation and sent it to him.

Luckily, the conversation was not too personal – it was about hardwood floors.

Nonetheless, Danielle said she felt “invaded”.

She added: “Immediately I said: ‘I’m never plugging that device in again, because I can’t trust it’.”

Image: An Amazon delivery driver in Los Angeles, California, on May 21, 2016. Amazon says it is ‘evaluating options to make this case even less likely’

Amazon confirmed the woman’s conversation had been inadvertently recorded and sent, blaming an “unlikely” string of events for the error.

Alexa starts recording after hearing its name or another “wake word” chosen by users, meaning that even having a TV switched on can result in the device being activated.

Amazon said this was what happened to Danielle, adding: “The subsequent conversation was heard as a ‘send message’ request.

“At which point, Alexa said out loud ‘To whom?’ At which point, the background conversation was interpreted as a name in the customer’s contact list.

“We are evaluating options to make this case even less likely.”



Video: March: Echo spooks users with creepy cackle. Unprompted laughter from Alexa had been freaking out Echo users; Amazon said it was aware of the bug and working to fix it.

Amazon wants Alexa to become a popular home accessory, used for everything from dimming the lights to ordering a pizza. To achieve this, it must be able to assure users of the device’s security.

Fears were raised after US researchers found in 2016 that sounds unintelligible to humans could trigger voice assistants.

According to The New York Times, the group showed that they could hide commands in white noise played over loudspeakers and through YouTube videos to get smart devices to turn on flight mode or open a website.

In May, some of those researchers went further, saying they could embed commands directly into recordings of music or spoken text.

This means that, while a human listener hears an orchestra, the voice assistant might hear an instruction to add something to a shopping list.
