Apple Contractors Regularly Hear Confidential Medical Information, Drug Deals and More While Grading Siri

A report from The Guardian claims that private contractors working on Siri frequently hear confidential medical information, drug deals, and recordings of couples having sex. The contractors hear these recordings as part of quality control, also known as "grading," in which they listen to collected voice data to help improve the Siri voice experience.

The revelation was made by a whistleblower working for one such private contractor. He expressed concerns over Apple's lack of disclosure and over how frequently Siri picks up confidential and sensitive data. Siri can be triggered by saying "Hey Siri," but it can also be triggered in other ways. These include raising an Apple Watch Series 4, which automatically activates Siri if it detects speech. Siri also often mistakes the sound of a zip for its wake word, leading to a false trigger.

The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”

The accompanying data could be used to link a recorded voice snippet back to the original user, which would be a breach of privacy; in the wrong hands, that data could be misused in many ways. For its part, Apple says that all Siri recordings are assigned a random identifier and that any location or contact data is removed from them.

The contractor also mentioned that the HomePod and the Apple Watch were the primary sources of mistaken recordings, with accidental triggers on the watch being "incredibly high." Apple encourages contractors to report false activations as a "technical problem," but the company provides no guidelines on how to handle sensitive recordings.

The contractor argued that Apple should reveal to users that this human oversight exists and, specifically, stop publishing some of its jokier responses to Siri queries. Ask the personal assistant "are you always listening?", for instance, and it will respond with: "I only listen when you're talking to me."

Apple's terms and conditions for Siri do not mention that it sometimes passes recordings to third-party contract workers to help improve the voice assistant; they only say that recorded voice snippets are used for analysis purposes to help improve Siri. In response to the Guardian's report, Apple said that less than 1% of daily Siri activations are used for grading, and that the recordings are usually only a few seconds long.

Apple told the Guardian: “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

Our Take

Apple is not the only company that employs third-party contractors to improve its voice assistant; Google, Microsoft, and Amazon also send voice recordings to contractors for grading. Unlike those companies, though, Apple says that it strips all user data from the voice recordings. However, I am not sure that alone is enough in this scenario.

If you are uncomfortable with Siri sending voice recordings to contractors, you can disable the "Hey Siri" wake word and related triggers on your iPhone, Apple Watch, and HomePod; on iPhone, for example, the toggle lives under Settings > Siri & Search.

[Via The Guardian]