Facebook is reportedly assigning a trust score to users who report news on the site. The aim is to weed out users who flag credible news as fake simply because they disagree with it for ideological reasons, or who are just trolling.
The story was originally reported earlier in the day as Facebook maintaining a centralised reputation score, before the firm pushed back on that characterisation.
“The idea that we have a centralized ‘reputation’ score for people that use Facebook is just plain wrong and the headline in the Washington Post is misleading. What we’re actually doing: We developed a process to protect against people indiscriminately flagging news as fake and attempting to game the system,” a Facebook spokesperson told Gizmodo via email. “The reason we do this is to make sure that our fight against misinformation is as effective as possible.”
Facebook explained that if someone flagged an article as false and credible fact checkers later confirmed it was indeed false, the company would give that person’s future feedback more weight than feedback from someone who indiscriminately hits the fake news button on any reporting they don’t like.
Facebook has come under fire before for enabling the spread of fake news, so measures like this to prevent abuse of the platform are only to be expected.