(Pocket-lint) - Apple has issued a formal apology for employing human contractors to listen to audio recordings of its users talking to Siri.

The company said these contractors were meant to improve users' experiences with its digital assistant. However, after media reports revealed the practice and sparked privacy concerns among users around the world, Apple admitted it didn't live up to its "high ideals". Actions speak louder than words, though, right?

So, in that case, Apple is making several changes to Siri's privacy policy.

Why is Apple sorry?

Apple was recently caught using human contractors to review recordings from Siri - something it never made explicitly clear to customers. The Guardian said those contractors had access to voice clips that were often recorded due to accidental Siri triggers. Workers reportedly listened to up to 1,000 recordings a day, and many of the clips were long enough to hear private information.

Apple called this process "human grading of Siri requests". Keep in mind Google, Amazon, Facebook, and Microsoft have also all recently been caught using human contractors to review recordings from their assistants, apps, and services.

How is Siri's privacy policy changing?

Apple said it now plans to change Siri's privacy policy. Here's how:

  1. By default, Apple will no longer retain audio recordings of Siri interactions. It will continue to use computer-generated transcripts, however.
  2. Users will be able to opt in to a feature that will allow Siri to "learn from" the audio samples of their requests. Apple hopes people will opt in, knowing that it "respects their data and has strong privacy controls in place". Users can also opt out at any time.
  3. Only Apple employees will be allowed to listen to audio samples of Siri interactions. Apple is therefore no longer using human contractors, and said its team will "work to delete any recording which is determined to be an inadvertent trigger of Siri".

Previously, Apple didn't make clear who or what was listening to Siri recordings. It also kept recordings for up to six months, then stripped them of identifying information and retained them for two years or more. Now, Apple will no longer keep audio recordings from Siri unless a user opts in, and in those instances, only Apple employees will have access (rather than contractors).

The company will also try to delete recordings of accidental triggers.

When will this change happen?

Apple has halted Siri grading. It plans to resume this autumn - after its new privacy policy and a software update are rolled out to users.

Writing by Maggie Tillman. Originally published on 28 August 2019.