Facebook has come under fire for its opaque moderation procedures this past year, with the New York Times publishing a piece about its disorganised content strategy this week. The firm responded in an unattributed blog post, claiming that its moderation policies were not only clear, but clearly communicated.
In its blog post, the firm claimed:
We make changes to our policies based on new trends that our reviewers see, feedback from inside and outside the company, as well as unexpected, and sometimes dramatic, changes on the ground. And we publish the changes we make every month.
This stands in contrast to the New York Times report, which describes Facebook moderators as stressed, tasked with applying frequently contradictory rules from a guidebook spanning 200 pages in just seconds.
“Moderators express frustration at rules they say don’t always make sense and sometimes require them to leave up posts they fear could lead to violence. “You feel like you killed someone by not acting,” one said, speaking on the condition of anonymity because he had signed a nondisclosure agreement,” The New York Times reported on Thursday.
Facebook also claimed:
We hire reviewers for their language expertise and cultural context — we review content in over 50 languages — and we encourage them to take the time they need to review reports. They work in more than 20 sites around the world, which resemble Facebook’s own offices, and they provide 24/7 support. As the Times notes, some reviewers are based in Morocco and the Philippines, while others are based in the United States, Germany, Latvia, Spain and other locations around the world.
But the New York Times reports that the bulk of Facebook's moderators are drawn from third-party companies. Facebook cannot possibly know about their “expertise” first-hand and would be relying on those companies to vet them properly.
“We play an important role in how people communicate, and with that comes an expectation that we’ll constantly identify ways we can do better. That’s how it should be,” Facebook’s unattributed post goes on to conclude, “And it’s why we constantly work with experts around the world to listen to their ideas and criticism and make changes where they’re warranted. Throughout 2018, we’ve introduced more transparency into our policies and provided data on how we enforce them. We’ve got more in store in 2019, and we look forward to people’s feedback.”
Facebook’s defence of itself appears to make a lot of sense. Of course, it would be more believable if the firm were confident enough to put a name to the words.