// Privacy NEWS // Siri: Apple Also Listens To Our Conversations

in #news · 5 years ago

Contract workers hired by the company are tasked with listening to certain recordings captured by Siri. These include doctor-patient consultations, drug deals and sexual encounters.


Apple, too, is eavesdropping. Like Amazon and Google before it, the Cupertino company is accused of having humans listen to conversations recorded by Siri.

These people are not Apple employees, but third-party contractors tasked with analysing the voice assistant's malfunctions.

According to a whistle-blower who works at one of these companies and wishes to remain anonymous, the snippets of conversation these people listen to were recorded after an unsolicited Siri activation.

Anyone who owns an iPhone, Apple Watch or HomePod has likely experienced this phenomenon: the assistant can trigger itself when you say a word or phrase that sounds close to "Hey Siri".

Disturbing conversations listened to

An unsolicited trigger means recording a conversation that was supposed to remain private. That is where the situation gets awkward.

The Guardian explains that the subjects heard on these recordings should never have been listened to by a third party. The contract workers assigned to review them may have heard a consultation between a patient and their doctor, a business negotiation, drug deals and even... sexual intercourse.

According to the whistle-blower, the conversations were recorded on every Siri-equipped device: iPhone, iPad, Mac, HomePod and Apple Watch.

The Apple Watch, he says, is particularly prone to unintentionally triggering the voice assistant: the watch can start listening for a request as soon as its screen lights up, without the user having to say "Hey Siri".

No identification possible according to Apple

Apple has tried to reassure its customers, telling the British daily that the recordings reviewed by humans represent less than 1% of Siri's daily activations.

Moreover, no Apple ID is associated with these recordings, which Apple says makes it impossible to identify the person speaking.

However, Apple does not state anywhere in its terms of use that humans may access these recordings.

To prevent this, the best solution is to disable the "Listen for 'Hey Siri'" option in the Settings of your devices.

On the Apple Watch, you can also disable the "Raise to Speak" option to prevent Siri from listening as soon as the watch screen turns on.

On the HomePod, it is also possible to restrict Siri, even though it is the device's main method of interaction.

In recent months, both Google and Amazon have been singled out for doing the same with their Assistant and Alexa.

These practices also raise questions about artificial intelligence, which does not yet seem able to function entirely on its own, without human intervention.

Humans remain the only ones able to analyse certain elements precisely, such as, in these cases, understanding why the assistant was triggered when it had not been requested.

Source: The Guardian

Stay Informed, Stay Safe

