Personal data to train AI: 9 privacy experts can't tell if Microsoft is using your personal data to train its AI models



Is Microsoft using personal data to train AI? Mozilla definitely thinks so. The organization believes the Redmond-based tech giant could be using your personal data to train its AI models. If that is true, then from 30 September, when the new Microsoft Services Agreement goes into effect, Microsoft could use your personal data to train its AI models, including Bing Chat, Windows Copilot, and any other AI tool it develops in the future.

I said ‘could’ because Mozilla doesn’t know exactly how Microsoft will do it, only that it most probably will. In its latest blog post, where it encourages people to sign a petition asking Microsoft to be transparent about how it uses personal data, Mozilla says it brought together nine privacy experts, including four lawyers, to examine Microsoft’s new Services Agreement, and none of them could tell what Microsoft is going to do – a sign that the new agreement is hard to understand even for experts. Is Microsoft up to something?

We had four lawyers, three privacy experts, and two campaigners look at Microsoft’s new Service Agreement, which will go into effect on 30 September, and none of our experts could tell if Microsoft plans on using your personal data – including audio, video, chat, and attachments from 130 products, including Office, Skype, Teams, and Xbox – to train its AI models.

The petition asks Microsoft for transparency, as well as a ban on using personal data to train AI. According to an email sent to users who signed the petition, Microsoft could use every audio and video call you make on Teams and Skype to train its AI, thus compromising your privacy.

Imagine a world where every private conversation you have on a Microsoft service could become fodder to train AI. The audio and video from every Teams video chat. Every Skype call. Every attachment in your Microsoft email products, every message sent over Xbox chat, and a whole lot more. It’s a world where our private conversations are used in a way that we’ve never even considered – let alone consented to.

Personal data to train AI: here’s how Microsoft sees it

Microsoft does indeed collect and use your personal data to train AI, and the Redmond-based tech giant says it does so to improve your overall experience with its AI apps. However, it’s hard to know what to make of that, as Microsoft’s wording is quite vague.

Microsoft uses the data we collect to provide you with rich, interactive experiences. In particular, we use data to:

  • Provide our products, which includes updating, securing, and troubleshooting, as well as providing support. It also includes sharing data, when it is required to provide the service or carry out the transactions you request.
  • Improve and develop our products.
  • Personalize our products and make recommendations.
  • Advertise and market to you, which includes sending promotional communications, targeting advertising, and presenting you with relevant offers.

We also use the data to operate our business, which includes analyzing our performance, meeting our legal obligations, developing our workforce, and doing research.

Our processing of personal data for these purposes includes both automated and manual (human) methods of processing. Our automated methods often are related to and supported by our manual methods. For example, to build, train, and improve the accuracy of our automated methods of processing (including artificial intelligence or AI), we manually review some of the predictions and inferences produced by the automated methods against the underlying data from which the predictions and inferences were made. For instance, with your permission and for the purpose of improving our speech recognition technologies, we manually review short snippets of voice data that we have taken steps to de-identify. This manual review may be conducted by Microsoft employees or vendors who are working on Microsoft’s behalf.

So, is Microsoft using personal data to train AI? Voice data is indeed used to train its speech recognition technologies, but Microsoft promises it only reviews short snippets that have been de-identified, and only with your permission.

If you’re still not sure, you can go through the whole Microsoft Services Agreement yourself, but be warned: it’s a long read, and, as Mozilla pointed out, some parts are hard to fully understand.

What’s your opinion on this? Should we let Microsoft use our personal data to improve its apps? Should Microsoft be at least more transparent when it comes to it? Let us know your thoughts.
