(Pocket-lint) - Microsoft contractors have been reviewing voice recordings from Skype’s auto-translation feature and the Cortana voice assistant.

Audio files shared with Motherboard captured users discussing sensitive, private topics ranging from relationship issues to weight loss. “The fact that I can even share some of this with you shows how lax things are in terms of protecting user data,” said the unnamed Microsoft contractor, who told Motherboard about the voice recordings. He even described hearing "phone sex" while reviewing clips.

If you use Skype or Cortana, you should check out whether Microsoft has kept your voice data, and you should probably delete it. Here's how.

Why do contractors review voice recordings?

Contractors are hired to review voice recordings for the purpose of improving Microsoft's voice services. In the company's Skype Translator FAQ and Cortana FAQ, Microsoft discloses that it analyses voice data "to build more performant services", though, as noted in the report from Motherboard, the company does not explicitly disclose that a human may listen to users' voice recordings.

While Microsoft contractors don't have access to user-identifiable information, the contractor speaking to Motherboard said he has heard people using the assistant to find porn and giving Cortana their home addresses. These private conversations end up being laughed about by "random people sitting at home in their pajamas", the contractor added. Motherboard spotted job listings confirming that Microsoft's contractors do work from home.

In the case of Skype, contractors review recordings from the service's machine-translation feature: they first manually translate each recording, then they are shown several machine-generated translations and pick the most accurate one. Microsoft told Motherboard that its vendors agree to confidentiality agreements, that it has audit rights over them, and that it only collects and uses voice data on an opt-in basis.

How to delete your Microsoft voice data

Microsoft offers a privacy dashboard, so you can delete any voice data the company may have on you. In a statement to the media, Microsoft said it wants “to be transparent" about its collection and use of voice data to "ensure customers can make informed choices".

  1. Go to the voice section of Microsoft's Privacy dashboard.
  2. Sign in to your account.
  3. From the My Activity tab on the Privacy page, scroll down to Voice activity.
  4. Click View and Clear Voice activity.
  5. Filter your data types on the right, and then review your data activity.
  6. To delete voice data, hit the Clear button.
  7. Next, go to Cortana's Notebook tab on the Privacy page.
  8. Review your activity and hit the Clear button to delete your Cortana data.

Writing by Maggie Tillman. Originally published on 8 August 2019.