Microsoft has long advocated for the ethical use of AI while selling that same technology to governments for dubious uses, arguing that regulation should stem from the democratic process rather than the ethical concerns of employees and shareholders.
Every once in a while the hypocrisy of this position is exposed. This time it comes with accusations that AnyVision, a Microsoft-funded start-up, is using AI facial recognition technology to track Palestinians working in Israel around the West Bank.
According to an NBC report, AnyVision uses a network of thousands of cameras around the West Bank to track the movement of Palestinians through a system called “Google Ayosh”, where “Ayosh” refers to the occupied Palestinian territories and “Google” denotes the technology’s ability to search for people as if they were items in a Google search.
AnyVision won a top defence prize in Israel in 2018 and was lauded by Israel’s defence minister for preventing “hundreds of terror attacks” using “large amounts of data.”
AnyVision admits only to using its biometric technology for border crossing control, and has claimed to be the “most ethical company known to man.”
A document on the Israeli government’s website, however, lists the features of AnyVision’s Better Tomorrow as:
- Face recognition in the wild: enrollment of individuals is not required. All individuals passing through the field of view are detected automatically without having to be stopped or interrupted. The system requires only 45×45 pixels to detect and recognize faces.
- One-to-many: one sensor (fixed CCTV, PTZ, mobile device and others) extracts and recognizes all the faces in the field of view (versus common technology, which allows one individual per sensor).
- Utilizing existing CCTV: uses existing (on-premise) camera infrastructure, requiring no special cameras or infrastructure upgrades.
- Mass scale: designed for real-time mass crowd detection and recognition, with a detection time of 0.2 seconds against a database of up to 300 million individuals.
- Ethnic invariance: neural nets are trained to detect and recognize faces of various ethnicities.
- Face obstructions: invariant to disguises, sunglasses, facial hair, hats, makeup, etc.
- Accuracy and reliability: accuracy of up to a 99.78% true positive rate and fewer than one false positive alarm per 25,000.
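To make the “one-to-many” claim in the list above concrete, the idea is that every face detected in a single camera frame is compared against an entire enrolled database at once, rather than matching one person per sensor. The sketch below is a hypothetical illustration only, using cosine similarity over NumPy embedding arrays; the function name, threshold, and embedding dimension are assumptions, not AnyVision’s actual implementation.

```python
import numpy as np

def match_faces(frame_embeddings, database, threshold=0.6):
    """One-to-many matching: compare every face embedding found in a
    frame against an entire enrolled database via cosine similarity.

    frame_embeddings: (faces_in_frame, dim) array
    database:         (enrolled_people, dim) array
    Returns a list of (frame_face_index, database_index, score) for
    matches whose similarity clears the threshold.
    """
    # Normalize rows so plain dot products equal cosine similarities.
    f = frame_embeddings / np.linalg.norm(frame_embeddings, axis=1, keepdims=True)
    d = database / np.linalg.norm(database, axis=1, keepdims=True)
    sims = f @ d.T                            # (faces_in_frame, enrolled_people)
    best = sims.argmax(axis=1)                # best database candidate per face
    scores = sims[np.arange(len(f)), best]
    # Only report candidates above the similarity threshold.
    return [(i, int(best[i]), float(scores[i]))
            for i in range(len(f)) if scores[i] >= threshold]
```

At mass scale this brute-force comparison would be replaced by an approximate nearest-neighbour index, but the principle, one frame matched against millions of enrolled identities in one pass, is the same.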
Microsoft has responded to the report by hiring former U.S. Attorney General Eric Holder to investigate whether the use of facial recognition technology such as AnyVision Better Tomorrow complies with its ethical principles.
Microsoft told NBC “Microsoft takes these mass surveillance allegations seriously because they would violate our facial recognition principles.”
According to these principles, Microsoft would “advocate for safeguards for people’s democratic freedoms in law enforcement surveillance scenarios and will not deploy facial recognition technology in scenarios that we believe will put these freedoms at risk.”
“If we discover any violation of our principles, we will end our relationship.”
On the wider front, Microsoft backed a U.S. Senate bill that would require a court order before federal law enforcement could use the technology for targeted, ongoing surveillance. Neema Singh Guliani, senior legislative counsel for the American Civil Liberties Union, however, said the bill “falls woefully short of protecting people’s privacy rights.”
Microsoft itself currently sells technology similar to AnyVision’s, but directed at workplace surveillance, showing the double-edged nature of the technology.
There have been recent reports of facial recognition technology being used in China for what has been called automated racism: discriminatory practices and restrictions applied to the Uyghur population there using facial recognition technology, facilitated by the work of US AI academics, some of them working for Microsoft.
Do our readers think Microsoft, which topped Forbes’s list of most ethical companies, is acting according to its stated principles? Let us know below.