Google and YouTube are under fire once more this week, as a BBC investigation revealed that videos of children across YouTube were plagued with lewd comments from predatory strangers.
According to the BBC: "The comments are shocking. Some of them are extremely sexually explicit. Others include the phone numbers of adults or requests for videos to fulfil sexual fetishes. They were left on YouTube videos posted by young children and they are exactly the kind of material that should be immediately removed under YouTube's own rules – and in many cases reported to the authorities." YouTube was slow to respond to reports. In one cited case, only 5 of the flagged comments were removed, with the remaining 23 acted upon only after the BBC contacted the company weeks later.
Another investigation, by the New York Times, found that several channels uploaded content that appeared on the surface to be child-friendly but turned out to be violent and psychologically harmful. Content creators would post videos featuring familiar children's characters like Dora the Explorer or Peppa Pig, then show them performing harmful or disturbing actions. Google's algorithm would then surface the videos in the "child-friendly" YouTube Kids app, springing them on unsuspecting children.
The original New York Times article leads with this little anecdote.
The 10-minute clip, “PAW Patrol Babies Pretend to Die Suicide by Annabelle Hypnotized,” was a nightmarish imitation of an animated series in which a boy and a pack of rescue dogs protect their community from troubles like runaway kittens and rock slides. In the video Isaac watched, some characters died and one walked off a roof after being hypnotized by a likeness of a doll possessed by a demon.
YouTube has since responded, with the Google-owned company taking a tougher stance in a blog post this week. Google has removed a number of channels that posted material contributing to child endangerment, pulled ads from videos that target families and children with inappropriate content, and blocked inappropriate comments on videos featuring minors.
"Across the board, we have scaled up resources to ensure that thousands of people are working around the clock to monitor, review and make the right decisions across our ads and content policies. These latest enforcement changes will take shape over the weeks and months ahead as we work to tackle this evolving challenge," YouTube's Johanna Wright explained in the blog post this past Wednesday. "We're wholly committed to addressing these issues and will continue to invest the engineering and human resources needed to get it right. As a parent and as a leader in this organization, I'm determined that we do."
Despite that, it hasn't been enough to convince advertisers and concerned viewers. The Guardian reports that companies ranging from Mars to Adidas have pulled their ads from the service, not just from offending videos but from all content, hitting the search giant where it hurts.
A Mars spokesperson said in a comment to the news company: “We are shocked and appalled to see that our adverts have appeared alongside such exploitative and inappropriate content. We have taken the decision to immediately suspend all our online advertising on YouTube and Google globally. Until we have confidence that appropriate safeguards are in place, we will not advertise on YouTube and Google.”
Another statement, from the supermarket Lidl, read: "It is completely unacceptable that this content is available to view and it is, therefore, clear that the strict policies which Google has assured us were in place to tackle offensive content are ineffective. We have suspended all of our YouTube advertising with immediate effect."
Google, as one of the primary portals of internet access, often finds itself at the centre of these debates around freedom of expression in relation to the protection of children and other vulnerable people. The firm has often shirked responsibility here, in a manner similar to Facebook and Twitter, claiming that it cannot police the content posted on its platform. With its public image and bottom line under fire, the firm no longer has the luxury of painlessly presenting itself as a faux-neutral party in this debate.