Last week Amazon was hit by one of the worst privacy scandals in its 23-year history.
The scandal concerned a report out of Oregon about a Portland woman whose private conversations with her husband were recorded and then shared by Alexa, the virtual assistant housed within Amazon’s Echo smart speakers.
Speaking with local media late last week, Danielle explained that about two weeks ago she and her husband received a call from one of her husband’s employees telling them to immediately unplug their Alexa device.
When they asked the employee why, he claimed that he had received audio files of recordings from inside their home via their Echo unit.
“At first, my husband was, like, ‘No, you didn’t!'” Danielle said. “And the (recipient of the message) said, ‘You sat there talking about hardwood floors.’ And we said, ‘Oh gosh, you really did hear us.'”
He sure did.
Here’s how the Echo works: It contains a virtual assistant, Alexa, that’s capable of performing tasks for the device’s owners. It can record conversations, initiate certain types of calls, play music, purchase products, share the news and much more.
The beauty — and curse (to be explained shortly) — of this device is that it’s designed to be operated with voice commands. You simply tell Alexa what you want: “Alexa, tell me today’s weather forecast.” To wake the device, you likewise simply say “Alexa” — its default wake word.
It’s also a curse because Alexa sometimes misinterprets people’s voice commands, which is exactly what happened in Danielle’s case.
In a statement to The Verge, an Amazon spokesperson confirmed that Danielle’s “Echo woke up due to a word in background conversation sounding like ‘Alexa.'”
“Then, the subsequent conversation was heard as a ‘send message’ request. At which point, Alexa said out loud ‘To whom?’ At which point, the background conversation was interpreted as a name in the customers contact list. Alexa then asked out loud, ‘[contact name], right?’ Alexa then interpreted background conversation as ‘right’.”
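The four-step chain of misinterpretations Amazon describes can be sketched as a simple flow. This is a hypothetical illustration only — the function, its names, and its matching logic are invented for this sketch and are not Amazon's actual code:

```python
def simulate_misfire(background, contacts):
    """Walk the four steps from Amazon's account: wake word ->
    'send message' request -> contact match -> 'right' confirmation.
    Purely illustrative; real speech recognition is far more complex."""
    steps = iter(background)

    # Step 1: a background word is misheard as the wake word "Alexa".
    if "alexa" not in next(steps).lower():
        return "asleep"

    # Step 2: the next stretch of speech is misheard as a "send message" request.
    if "message" not in next(steps).lower():
        return "idle"

    # Step 3: Alexa asks "To whom?" and mishears a name from the contact list.
    heard = next(steps).lower()
    match = next((c for c in contacts if c.lower() in heard), None)
    if match is None:
        return "no contact matched"

    # Step 4: Alexa asks "<name>, right?" and mishears background speech as "right".
    if "right" in next(steps).lower():
        return f"message sent to {match}"
    return "cancelled"
```

Fed a background conversation that happens to contain all four trigger phrases — as in Danielle's case — every check passes and the recording goes out:

```python
overheard = [
    "a word sounding like Alexa",
    "something heard as send message",
    "chatter mentioning John and hardwood floors",
    "yeah that's right",
]
simulate_misfire(overheard, contacts=["John"])  # -> "message sent to John"
```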
And just like that, Danielle’s conversations with her husband were transmitted to a man who reportedly lives 176 miles away from them.
“I felt invaded,” she complained to local media. “A total privacy invasion. Immediately I said, ‘I’m never plugging that device in again, because I can’t trust it.'”
She’s not the only one who feels this way:
I will NEVER use Alexa etc b/c these r just a gateway for anyone to b easily spied upon. We have one and it has been unplugged and will never b plugged in again.
— #Resist #Medicare4All #BLUENOMATTERWHO (@GOPAreCorrupt) May 26, 2018
What??? Never going to use an Alexa again. What a violation of trust. @amazon https://t.co/uDuBTeXG1V
— Sarah Russo, JD, Esq. (@Solano_Law) May 25, 2018
Amazon reportedly offered to disable the communication functions of Danielle’s Echo device so she could continue using its other features, but she refused. Instead she hopes to receive a refund, though Amazon has reportedly been hesitant about providing her with one.
Apparently, neither quality customer service nor respect for privacy is a priority at Amazon. That’s good to know …