Microsoft and Harvard announce an open-source differential privacy platform
Microsoft has collaborated with Harvard University’s Institute for Quantitative Social Science to create a first-of-its-kind open-source platform for differential privacy. The technology will allow researchers to preserve user privacy while performing analysis on datasets.
Microsoft and Harvard began developing the differential privacy platform last year and have now announced it under the OpenDP Initiative, which is led by Harvard.
Differential privacy, the heart of today’s landmark milestone, was invented at Microsoft Research a mere 15 years ago. In the life cycle of transformative research, the field is still young. I am excited to see what this platform will make possible.
– Cynthia Dwork, Gordon McKay Professor of Computer Science at Harvard and Distinguished Scientist at Microsoft
Microsoft is also making its own differential privacy patents available to the world under a royalty-free license. Differential privacy adds noise to a dataset to hide the identity of individuals and protect their privacy while keeping the results of analysis accurate. It works as follows (a small code sketch follows the list):
- A small amount of statistical “noise” is added to each result to mask the contribution of individual data points. This noise works to protect the privacy of an individual while not significantly impacting the accuracy of the answers extracted by analysts and researchers.
- The amount of information revealed from each query is calculated and deducted from an overall privacy budget to halt additional queries when personal privacy may be compromised.
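Here is a minimal illustrative sketch of those two ideas in Python. It is not the platform's actual API: the `PrivateCounter` class, the `noisy_count` method and the parameter names are hypothetical, and the example simply pairs Laplace noise calibrated to a count query's sensitivity with a running privacy budget that refuses queries once it is spent.

```python
import numpy as np

class PrivateCounter:
    """Toy differentially private counter: noisy answers plus a privacy budget."""

    def __init__(self, data, total_epsilon=1.0):
        self.data = data
        self.remaining_epsilon = total_epsilon  # overall privacy budget

    def noisy_count(self, predicate, epsilon=0.1):
        """Return a noisy count and deduct epsilon from the remaining budget."""
        if epsilon > self.remaining_epsilon:
            raise RuntimeError("Privacy budget exhausted; query refused.")
        true_count = sum(1 for row in self.data if predicate(row))
        # A count query has sensitivity 1: any one person changes it by at most 1,
        # so Laplace noise with scale 1/epsilon masks each individual's contribution.
        noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
        self.remaining_epsilon -= epsilon
        return true_count + noise

# Example: count records above a threshold without exposing any single record.
ages = [23, 35, 41, 29, 52, 47, 38, 61]
counter = PrivateCounter(ages, total_epsilon=0.5)
print(counter.noisy_count(lambda age: age >= 40, epsilon=0.1))
```

The noisy answer stays close to the true count of 4 for reasonable epsilon values, while repeated querying is capped by the budget rather than allowed to slowly reveal individuals.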
Microsoft’s differential privacy platform and its algorithms are now available on GitHub for developers, researchers, academics and companies worldwide to use, test, build on and support.