Amazon’s Alexa Device Is Listening to Your Most Intimate Moments — Their Explanation Is Absurd

A team of Amazon analysts is listening to 1,000 or more voice recordings made in private homes daily, and the company says there’s virtually nothing you can do about it.

The company, owned by multi-billionaire Jeff Bezos, who also owns the Washington Post, says that having workers listen to private voice recordings is simply meant to improve the Echo voice assistant’s ability to understand human speech.

“Full-time workers and contractors at the online retail giant reportedly sift through as many as 1,000 recordings per shift – and even share ‘amusing’ clips between themselves,” The Sun reports.


Seven former employees who worked in Amazon’s voice review programme came forward with the startling revelations, reported by Bloomberg.

They said the recordings – made through Amazon devices like the Echo speaker – are linked to the customer’s first name, their device’s serial number and even their personal account number.

Among the clips listened to by the former employees was a recording of a woman singing in the shower.

Another recorded the sound of a child screaming – while a third recorded a sexual assault taking place, the employees said.

The company has admitted to its customers that thousands of recordings are analysed by staff and transcribed before being fed back into the software.

As many as 1,000 clips per shift are reviewed by workers in buildings all over the world, many of which bear no obvious indication that they are run by Amazon.

Among the more sinister content the workers have heard were a child screaming for help and two instances in which they believed a sexual assault was taking place.

We got in touch with Amazon through its website chat, and here’s what one staffer sent us.

“We take the security and privacy of our customers’ personal information seriously. We only annotate an extremely small number of interactions from a random set of customers in order to improve the customer experience. For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone,” the worker wrote.

“We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow. While all information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption, and audits of our control environment to protect it, customers can delete their voice recordings associated with their account at any time.”

Deleting recordings means going into the settings, clicking on each recording individually and selecting delete. While that removes the recording from your app, it’s unclear whether another copy exists in the cloud or elsewhere.


And anyone who owns an Echo, which uses the voice assistant “Alexa,” and wants to return the product might be out of luck. If the unit is outside the “return window,” there’s no recourse, a staffer there said.
