
Apple’s Siri overhears your drug deals and sexual activity, whistleblower says

Quality control staff frequently come across recordings that should never have existed in the first place.
Written by Charlie Osborne, Contributing Writer

UPDATE 8/2/2019: Apple, Google: We've stopped listening to your private Siri, Assistant chat, for now 

Apple's Siri regularly records private and confidential conversations and activities, including talk of medical conditions, drug deals, and sex acts.

The Guardian reports that an unnamed whistleblower has brought the situation to light: contractors working for the iPad and iPhone maker regularly listen in on Siri interactions as part of their job grading the voice assistant.

Staff members tasked with grading how Siri responds to commands, and whether the correct wake phrase "Hey Siri" was used before a recording occurred, often hear explicit recordings, which are accidentally saved when the assistant mistakes a sound for the wake phrase.

See also: Apple facial recognition tech prompts student to sue for $1 billion after false arrest

The publication's source notes, for example, that the sound of a zipper can be misinterpreted as the wake phrase. In what the whistleblower says are "countless instances," conversations between doctors and patients, business deals, and both criminal and sexual activity have been captured by the smart assistant.

The Apple Watch, in particular, has come under fire. While many recordings captured by Siri may be only a few seconds long, The Guardian says that the watch -- with Siri enabled -- may record up to 30 seconds of audio.

Apple says that less than one percent of activations are sent elsewhere for grading.

CNET: Satellites are starting to watch your every move

"A small portion of Siri requests are analyzed to improve Siri and dictation," Apple told the Guardian. "User requests are not associated with the user's Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements." 

In contrast, the source says that recordings "are accompanied by user data showing location, contact details, and app data." However, identifiers appear to be masked, meaning that connecting a specific recording to a particular user would be a challenge.

It is understandable for any company developing a voice-based service to want to capture some data to improve the quality of interactions and to identify and rectify mistakes in its voice recognition technology. However, as Macworld notes, consumers would appreciate some way to opt out of their data being used in this manner.

TechRepublic: 60% of companies experienced insider attacks in the last year

The reports bring to mind past privacy issues with rival product Amazon Alexa. There have been cases in which Alexa has also listened to private conversations, and earlier this year it was revealed that human operators also monitor interactions for quality control purposes.

In July, Amazon confirmed that voice recordings are retained indefinitely unless customers manually delete them.

Have a tip? Get in touch securely via WhatsApp | Signal at +447713 025 499, or over at Keybase: charlie0

