Voice assistants and data protection – hidden bugs?


A networked smart home is not absolutely necessary to benefit from voice assistants. On an Android smartphone, Google Assistant finds the shortest route to an address. With a Fire TV Stick on the TV, Alexa finds the right series via voice command. Voice assistants can also be built into smart speakers, such as Siri in the Apple HomePod, Alexa in the Amazon Echo or Google Assistant in Google Home. But do voice assistants always listen in and serve as modern bugs?

What means more convenience in everyday life for some raises questions about data protection and privacy for others, because the microphones of smart speakers are usually always on. As with data protection in the smart home in general, manufacturers often state that the information collected serves the continuous improvement of their services. How does a voice assistant work, and when is data sent to the server?

This is how voice assistants work in conjunction with smart speakers:

  1. The microphones must be on all the time, since users can speak the activation word at any moment and the voice assistant should then react promptly.
  2. While the microphones merely listen for the activation word, there is usually no connection to the cloud and no data is sent to the server.
  3. Only after an activation word such as "Alexa", "Hey Siri" or "Ok Google" is a connection to the cloud established and the subsequent command or question sent to the server.

The problem: Voice assistants can mistake similar-sounding words, for example from the television, for the activation word and then establish a connection to the cloud. This can cause discomfort during conversations or when sensitive data is being discussed.

In the following, we show how manufacturers address users' data security and privacy and give tips on what users can do themselves.

Voice assistants, data security and privacy

Amazon and other manufacturers make no secret of the fact that data is evaluated and used, for example, for marketing purposes and for the continuous improvement of their services. More critical, however, are the data protection scandals in which conversations were unwittingly saved and forwarded as voice recordings. This can happen when a voice assistant mistakenly detects the activation word. In some cases, sensitive data then ends up not only on the manufacturer's own servers but also on those of linked third-party providers, without users knowing about it.

According to the General Data Protection Regulation (GDPR), companies must make transparent what content they store. In addition, users have the right to have data deleted rather than saved permanently. Those who do not comply with the regulation face fines of up to four percent of total annual turnover. Manufacturers such as Amazon therefore ask users for permission before using voice recordings to improve their services.

Voice assistants and data protection – tips for your own data security & privacy

It is a good sign that the topic of data security and voice assistants is gaining importance, because privacy is a valuable asset. Manufacturers have recognized users' privacy concerns and responded accordingly. Amazon and other manufacturers, for example, have added a dedicated button to their speakers that lets users switch off the microphones. The Echo Show smart displays are equipped with a manual slider so the camera can be covered if required. Google and others let users delete recent voice requests via a history in the app. If you want to learn more about Amazon and data protection, click here for our Alexa data protection interview.

Many manufacturers offer a special button that allows users to manually turn off the microphones


(Amazon)

The most important tips for your own data security and privacy with voice assistants:

  • Before using a voice assistant, read the respective data protection declaration and provisions.
  • Switch off smart speakers or their microphones if you have concerns or want to discuss sensitive matters. The more users rely on voice assistants, the more likely it is that sensitive data will be stored.
  • Disable the use of your data for improving the assistants in the settings.
  • Regularly delete voice requests in the app's history. Caution: Anyone who forbids data use or deletes the history should bear in mind that this limits personalized features of the voice assistants, since personalization is based on collected data.
  • Secure purchase orders with a code so that children do not accidentally order products while playing.
  • Inform visitors that, for example, a smart speaker is switched on and switch it off if they wish. Visitors with doubts then do not have to worry about their data security.
  • While smart speakers are switched on, do not say sensitive information such as bank details out loud.
  • Prevent misuse of voice assistants by hackers or unauthorized persons. For example, secure the opening of doors or garage gates by voice command with an additional PIN entry. For more security, users should also set up a secure Wi-Fi network.
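The PIN safeguard from the last tip can be sketched as follows. This is a minimal illustration under assumed names (the command list, the `execute` function and the hardcoded PIN are all hypothetical); a real system would never compare plaintext PINs like this:

```python
# Illustrative sketch: sensitive voice commands (doors, gates, orders)
# are refused unless the caller also supplies the correct PIN.

SENSITIVE_COMMANDS = {"open garage", "open front door", "place order"}

def execute(command, pin=None, correct_pin="4321"):
    """Run a voice command; sensitive ones require the correct PIN."""
    cmd = command.lower().strip()
    if cmd in SENSITIVE_COMMANDS and pin != correct_pin:
        return "PIN required"          # refuse without the right PIN
    return f"executing: {cmd}"         # harmless or authorized command
```

For example, `execute("open garage")` is refused, while `execute("open garage", pin="4321")` goes through; a harmless command like `execute("play music")` needs no PIN at all.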

Conclusion on voice assistants and data protection

Voice assistants and data protection are in tension because the microphones of smart speakers and the cameras of smart displays are usually on all the time. It becomes particularly problematic when sensitive information is stored without the user's knowledge on servers that are not further identified, and when it is unclear when and where listening takes place. Manufacturers affirm that data is only sent to the cloud after the activation word.

Overall, users should be aware that voice assistants rely on artificial intelligence whose functionality and added value increase with the amount of data collected. The ideal of maximum comfort requires that the digital assistants be closely tailored to the individual needs of their users. However, users should not have to pay for this convenience by being denied information about how their personal data is processed. On this point, the GDPR provides security by requiring companies to make data storage transparent. The rest depends on how users handle voice assistants and what precautions they take to ensure that no sensitive data ends up in the cloud and on servers.
