It's no secret that we're living in an age in which tech companies are constantly collecting troves of data about us, from the sites we visit and the shows we like to our political leanings. It's what makes plenty of people uneasy about buying always-listening devices like Amazon's Echo or Google Home. It's not like any real people on the other end are actually listening to what we say and do, though, right?
Hate to break the bad news, but real live humans do sometimes hear what you do, including highly private moments like when you're having sex, according to a new report by The Guardian. People working for Apple reportedly stumble upon confidential moments picked up by the company's virtual assistant, Siri.
Contract employees working for Apple to help make Siri more sophisticated regularly hear highly sensitive recordings collected by the built-in assistant around the world, per the report, which is based on a whistleblower source. The workers are tasked with grading Siri's responses to queries by reviewing a random assortment of Siri activations, which makes them privy to a wide range of recorded user interactions. The report alleges that the voice files often contain private material, from confidential medical information to drug deals to audio of people having sex.
If you have "Hey, Siri" enabled on your iPhone, you may have noticed the assistant accidentally activates when you utter something that sounds even remotely like "Hey, Siri," its "wake word." That can be frustrating, for sure, and those false activations are part of the reason these contractors exist and analyze recordings. It also means you may have inadvertently had one of those interactions recorded and screened, according to the report.
To be clear, not everyone's Siri interactions are being screened by humans. In fact, very few are. In response to The Guardian's report, Apple clarified that less than 1% of Siri activations are analyzed by this particular team and the recordings are typically only a few seconds long. Apple also claims that the identities of users whose recordings are screened are untraceable.
“A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements," Apple told the paper.
Still, the anonymous whistleblower, who worked on this screening team, said sensitive recordings are nonetheless linked to some user data.
“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data," the person said, per the report. “Apple is subcontracting out, there’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”
There's no proof that anyone has ever used this sort of information with nefarious intentions, but if the slight chance that it's possible makes you uncomfortable, you can stop Apple from hearing any of your Siri recordings going forward, according to 9to5Mac. The only catch is that you have to download a bit of software, courtesy of privacy advocate Jan Kaiser. If you're comfortable with that, check out the instructions below.
How to stop Apple from listening to your Siri recordings
Step 1: Open this configuration profile on your iPad or iPhone via GitHub.
Step 2: Scroll until you see the "direct link" and download it.
Step 3: Review and complete the installation in Settings by tapping Install.
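For the curious: an iOS configuration profile like the one linked above is just an XML property list saved as a `.mobileconfig` file. The skeleton below is an illustrative sketch of that format only, not the actual contents of Jan Kaiser's profile; the outer `Payload*` wrapper keys are Apple's standard ones, but the inner payload shown here is a hypothetical placeholder.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <!-- Standard wrapper keys every .mobileconfig carries -->
  <key>PayloadDisplayName</key>
  <string>Disable Siri Server Logging (illustrative)</string>
  <key>PayloadIdentifier</key>
  <string>example.disable-siri-logging</string>
  <key>PayloadType</key>
  <string>Configuration</string>
  <key>PayloadUUID</key>
  <string>00000000-0000-0000-0000-000000000000</string>
  <key>PayloadVersion</key>
  <integer>1</integer>
  <key>PayloadContent</key>
  <array>
    <dict>
      <!-- Hypothetical inner payload: the real setting that opts
           Siri recordings out of review lives in Kaiser's profile -->
      <key>PayloadType</key>
      <string>com.example.siri.settings</string>
      <key>PayloadIdentifier</key>
      <string>example.disable-siri-logging.payload</string>
      <key>PayloadUUID</key>
      <string>00000000-0000-0000-0000-000000000001</string>
      <key>PayloadVersion</key>
      <integer>1</integer>
    </dict>
  </array>
</dict>
</plist>
```

Once a profile like this is downloaded, iOS surfaces it under Settings, where tapping Install applies it, and removing it later reverses the change.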
h/t The Guardian, 9to5Mac