Google is finally trying to actually protect kids with YouTube Kids after scandal





Last year, we reported that advertisers were fleeing YouTube after a deluge of poorly moderated content trickled into the child-friendly version of YouTube, to the horror of busy parents everywhere.

We noted then that:

Another investigation, by the New York Times, found that several channels uploaded content that appeared on the surface to be child-friendly material but turned out to be violent, psychologically harmful material. Content creators would post videos featuring child-friendly characters like Dora the Explorer or Peppa Pig, then show them performing harmful or disturbing actions. Google’s algorithm would then surface the videos in Google’s “child-friendly” YouTube Kids app, springing them on unsuspecting children.

Google now seems to be moving towards a common-sense approach and will involve human curators in selecting which videos to show.

Google will reportedly get rid of the algorithm, and channels will be approved to upload content to the app on a case-by-case basis. That is to say, Google will work from a whitelist rather than try to fight the internet flood. Presumably, any unseemly content will get a channel kicked off the whitelist or otherwise censured for impropriety.

This is so parents (and advertisers) can feel comfortable, knowing YouTube Kids will be safe for children (and for ads targeted at children, by coincidence we’re sure).

While the firm didn’t deny the report, YouTube told BuzzFeed News in a comment that “We are always working to update and improve YouTube Kids, however, we don’t comment on rumor or speculation.”
