Microsoft employees sue company over forcing them to watch murder and child porn

Two Microsoft employees are suing the company for requiring them to view photos and videos “designed to entertain some of the most twisted and sick minded people in the world.” More precisely, the suit seeks damages for the PTSD the two developed over years of screening Microsoft users’ communications for child pornography and evidence of other crimes.

The two employees, Henry Soto and Greg Blauert, allege that as members of Microsoft’s online safety team, their job was to determine which customer content should be taken down and which should be reported to police. The two men and the rest of the team were given access to all Microsoft users’ online accounts, yet Redmond reportedly provided the employees no real psychological support, often telling them simply to go for a walk, take a smoke break, or play video games to clear their heads (what Microsoft called its Wellness Program).

One of the employees, Henry Soto, said that he was “involuntarily transferred” to the team in 2008 and was not permitted to change departments again for at least 18 months. He stated that he “was not informed prior to the transfer as to the full nature” of the job, which required him to view photos and videos showing “horrible brutality, murder, indescribable sexual assaults, videos of humans dying and, in general, videos and photographs designed to entertain the most twisted and sick-minded people in the world.”

Greg Blauert has been suffering from “acute and debilitating PTSD” since a breakdown in 2013.

Both men submitted feedback on how to improve the program but say they received no response from Microsoft. After doctors recommended medical leave for both of them, each applied for workers’ compensation and was denied. Because the illnesses were sustained in the course of the job, however, the denials may well be overturned, or the claims paid out through the lawsuit.

When asked about the suit, Microsoft responded:

“Microsoft applies industry-leading, cutting-edge technology to help detect and classify illegal images of child abuse and exploitation that are shared by users on Microsoft Services,” a Microsoft spokesperson wrote in an email. “Once verified by a specially trained employee, the company removes the image, reports it to the National Center for Missing & Exploited Children, and bans the users who shared the images from our services. We have put in place robust wellness programs to ensure the employees who handle this material have the resources and support they need.”
