Chris Hansen has done his part in keeping children safe from predators; now it’s Microsoft’s turn.
Project Artemis is an automated system, developed by Microsoft, that can sniff out sexual predators who lurk in online chat rooms, including those of video games. Once the tool detects communication patterns typical of predators, it assigns a “risk score” to the conversation so it can be flagged to a human content reviewer, who can then relay it to law enforcement.
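Microsoft has not published Artemis’s internals, but the flow described above (match patterns, accumulate a risk score, flag above a threshold for human review) can be sketched naively. Every pattern, weight, and threshold below is invented purely for illustration:

```python
import re

# Hypothetical phrase patterns and weights -- invented for illustration;
# the real Artemis patterns and scoring model are not public.
PATTERNS = {
    r"\bhow old are you\b": 2.0,
    r"\bare you alone\b": 3.0,
    r"\bdon'?t tell (your )?(mom|dad|parents)\b": 5.0,
}
FLAG_THRESHOLD = 5.0  # assumed cutoff for escalating to a human reviewer

def risk_score(messages):
    """Sum the weights of every pattern found across a conversation."""
    score = 0.0
    for msg in messages:
        for pattern, weight in PATTERNS.items():
            if re.search(pattern, msg.lower()):
                score += weight
    return score

def should_flag(messages):
    """Flag the conversation for human review once the score crosses the threshold."""
    return risk_score(messages) >= FLAG_THRESHOLD
```

In practice a system like Artemis would rely on far richer signals than literal keyword matches, but the escalation logic, score first, human reviewer second, is the part the article describes.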
Microsoft first collaborated with children’s game Roblox, messaging app Kik, and The Meet Group, creators of the dating and friendship apps Skout, MeetMe, and Lovoo, at a Microsoft hackathon for child safety back in November 2018.
This isn’t Microsoft’s first automated system of this sort; the company first introduced the idea back in 2015 to combat grooming on Xbox Live. Project Artemis builds on that work by looking for patterns of keywords and phrases commonly associated with grooming, including sexual interactions and even manipulation tactics.
The tool will detect not only those who pose a threat to children, but also those who are actively exploiting them. In the event of such an imminent threat, the National Center for Missing and Exploited Children will be contacted.
While Courtney Gregoire, Microsoft’s chief digital safety officer, says that Artemis is a “significant step forward”, it’s still “by no means a panacea.”
“Child sexual exploitation and abuse online and the detection of online child grooming are weighty problems,” she said, “but we are not deterred by the complexity and intricacy of such issues.”
Microsoft has already started testing Project Artemis on Xbox Live and in Skype’s chat feature, and from January 10th it will be licensed for free to other companies through the nonprofit Thorn, which builds tools to prevent the sexual exploitation of children.