Microsoft has been cosying up to governments around the world, seeking out the steady and often overpriced military contracts that are essential to its transformation into a services company.
While wrapping itself in the patriotic flag has largely made Microsoft immune from criticism in the USA, the same cannot be said of its engagement in China, a repressive regime well known for being increasingly enabled by technology.
Microsoft is currently on the receiving end of allegations that it is assisting the Chinese military with research into facial recognition and AI, work expected to find an early application in the surveillance of 11 million Uighurs on China’s western frontier, more than 1 million of whom are already held in detention camps.
The allegations centre on Microsoft Research Asia working with the National University of Defence Technology, which is run by the Chinese military, to co-author three papers between March and November last year, including work on facial recognition.
Sen. Marco Rubio described Microsoft’s partnership with the Chinese military as “deeply disturbing” and “an act that makes them complicit” in China’s human rights abuses.
Sen. Ted Cruz said “American companies must recognise this threat and rethink their role in aiding China.”
Microsoft has defended the work, saying:
“Microsoft’s researchers, who are often academics, conduct fundamental research with leading scholars and experts from around the world to advance our understanding of technology. In each case, the research is guided by our principles, fully complies with US and local laws, and the research is published to ensure transparency so that everyone can benefit from our work.”
The work brings to mind IBM’s role in information processing during the Holocaust in Nazi Germany, with many arguing that the killing of millions would not have been possible without the efficiency brought by computer assistance.
It is also a reminder that there is a difference between what is ethical and what is legal, a distinction that is often clearer when it involves enemies rather than friends.