Voice assistants, such as Google Assistant and Alexa, record what you say after the wake word and send it to the company's servers. Companies then keep your recordings until you delete them. Some companies let you disable this behavior: here's how.
Voice assistants record you after waking up
Voice assistants work in a simple way. They continually listen to everything you say, all day long. But the device in the room doesn't have much intelligence. The only thing it can understand is the wake word: Alexa, Hey Google, Hey Cortana, and so on.
Once the device detects the wake word, it starts recording everything that follows (plus roughly a second of audio from before it thought it heard the wake word). The device sends the recording to the company's servers (Amazon, Google, etc.) to interpret what you said and carry out your request.
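The flow described above — listen constantly, keep only a tiny rolling buffer, and capture audio for upload only once a wake word is heard — can be sketched as a toy simulation. This is an illustrative model using word tokens in place of audio; `WAKE_WORDS`, `PRE_ROLL`, and `listen` are made-up names, not any vendor's actual code.

```python
from collections import deque

WAKE_WORDS = {"alexa", "hey google", "hey cortana"}
PRE_ROLL = 1  # tokens kept from just before the wake word (stands in for ~1 s of audio)

def listen(stream):
    """Simulate the always-on loop: nothing leaves the device until a
    wake word is heard; then the pre-roll, the wake word, and everything
    after it are captured for upload to the company's servers."""
    buffer = deque(maxlen=PRE_ROLL)   # tiny rolling buffer, constantly overwritten
    for i, token in enumerate(stream):
        if token.lower() in WAKE_WORDS:
            # capture pre-roll + wake word + the rest of the utterance
            return list(buffer) + [token] + stream[i + 1:]
        buffer.append(token)          # everything else is forgotten locally
    return []  # no wake word: nothing is recorded or uploaded
```

For example, `listen(["so", "anyway", "alexa", "play", "music"])` returns `["anyway", "alexa", "play", "music"]` — the captured clip includes a sliver of speech from before the wake word, which is why real recordings sometimes begin mid-sentence.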
But after carrying out your command, companies don't necessarily delete your recording. Instead, your words are kept indefinitely to improve the voice assistant's results and to develop new features.
Some companies let you disable this behavior; others don't. In some cases, disabling recordings completely breaks the voice assistant, but that isn't always true. We've gathered what you can do and what the results are.
Google gives you the most choice
Google is the only company that lets you use its voice assistant without storing your voice forever. And in a real step forward, that is now the default behavior for new users who set up Google Assistant.
Existing users are still on the old system of keeping voice recordings, but you can turn it off. Disabling voice storage is as easy as going to Google's Activity Controls, turning off "Voice & Audio Activity," and then clicking Pause.
Best of all, turning off voice storage doesn't break Google Assistant or Google Home devices. So there's no reason not to disable this feature if you don't like big companies keeping copies of your voice.
Alexa doesn't give you much choice
Amazon offers no equivalent to Google's option for preventing the storage of your voice recordings. If you use Alexa, whether from an Echo device or the Alexa app, your voice is processed and sent to Amazon's servers. Amazon keeps your recordings to improve Alexa.
Your only options are to listen to your recordings and delete them, or to give up Alexa-powered devices altogether. You can mute Echo devices, but that isn't necessarily a permanent solution. If someone else notices the device is muted and turns it back on, you're back to square one. And in any case, muting completely prevents you from using Alexa, defeating the point of owning the devices.
Amazon provides a privacy dashboard where you can tell the company not to use your voice recordings to develop new features or to improve transcriptions. Just click the "Manage how your data improves Alexa" option, then disable both toggles. But note that this only tells Amazon not to use your data for those two purposes; it doesn't prevent the company from storing your recordings or using them for other purposes.
Update: Amazon now allows you to delete some recordings with your voice, as well.
Hopefully, Amazon will follow Google's example and come up with better options.
Cortana's only option is an off button
Like Amazon, Microsoft doesn't offer any option to prevent the storage of your voice recordings. You can only view and delete existing recordings in Microsoft's privacy dashboard.
Worse than Amazon, you can't even limit how Microsoft uses your recordings. The only real option is to turn off Hey Cortana completely. In the Start menu's search bar, type "Talk to Cortana," press Enter, and then turn off Hey Cortana.
If you use a Cortana-powered speaker, you'll have to mute it. Of course, that means giving up Cortana entirely. So if you want to use the voice assistant, you currently have to accept that Microsoft saves your voice recordings for these purposes.
Siri at least removes your recordings when you turn it off
Apple provides the simplest way to delete your recordings, paired with the least useful options for preventing their storage in the first place.
Just like with Microsoft and Amazon, the only way to prevent Apple from storing your recordings is not to use Siri at all. Using Siri essentially means agreeing to let Apple use your voice recordings for its purposes.
The good news is that, rather than having to hunt through a privacy dashboard, simply turning off Siri deletes your recordings from Apple's servers, provided you also disable Dictation.
To turn off Siri, go to Settings > Siri and turn off Hey Siri and Siri. Tap "Disable" in the prompt. Note that it mentions recordings are still stored as long as Dictation is enabled.
To deactivate Dictation, go to Settings > General > Keyboard and turn off Dictation. Tap "Disable" in the prompt. This time, it will confirm that the recordings will be deleted. (If you do this in the reverse order, the warnings adjust accordingly.)
Unfortunately, not all voice assistants are equal. Siri gets credit for the easiest deletion of your recordings, but Google deserves praise for letting you prevent storage while still using Google Assistant. Hopefully, the companies will learn from each other (or, better yet, steal from each other) and provide more granular controls over your data.