First Xbox content moderation transparency report highlights improving proactive enforcements


In line with its commitment to “foster spaces that are safe, positive, inclusive, and inviting for all players,” Microsoft has released its first Xbox Transparency Report, which details 4.33 million proactive enforcements against inauthentic accounts in the Xbox community. Enforcements in other categories included 814,000 accounts actioned for adult sexual content, 759,000 for harassment or bullying, and 1 million for profanity.

“Publishing this report is part of our long-standing commitment to online safety, addressing the learnings and doing more to help people understand how to play a positive role in the Xbox community,” Dave McCarthy, CVP of Xbox Player Services, explains in an Xbox Wire post. “This report complements the continued review of our Community Standards, making it easier for all players to understand what is and isn’t acceptable conduct; continued investment in moderation tools; and ongoing partnership with industry associations, regulators, and the community.”

The actions taken against the more than 4 million inauthentic accounts between January and June constituted 57% of the company’s total enforcements. In an interview with Bloomberg, McCarthy said the enforcements addressed not only automated or bot-created accounts but also “regular activity by nation-state actors and other funded groups attempting to distribute content that has no place on our services.”

These proactive enforcements, the CVP explained, were carried out by artificial intelligence or human moderators. While the size of the moderation team was not specified, McCarthy stressed that the agents are “on-staff 24 hours a day, 7 days a week, 365 days a year” and that Xbox’s proactive moderation increased as much as 9x compared to the same period last year. McCarthy also credited the community’s players, who submitted 33 million reports during the period.

Xbox’s boost in proactive moderation can also be attributed to Microsoft’s 2021 acquisition of content moderation platform Two Hat, whose AI-powered moderation process classifies and filters messages, usernames, images, and videos. Even with this technology, Xbox still has areas left to tap, notably voice-filtering tech to identify community violations on its platform. McCarthy told Bloomberg that the company might focus on that area in the future while giving due weight to players’ privacy.

Xbox promises to release a moderation transparency report every six months. And given that this is its first, and that the company is one of only a few gaming platforms exploring the practice, McCarthy said Xbox is “learning our way into what a good transparency report looks like for us.”

