Google Search to punish websites that publish explicit AI-generated deepfakes

Google will demote sites that publish such content in its search results


Key notes

  • Google is updating Search to better hide explicit deepfakes and remove similar content from search results.
  • Sites with frequent deepfake removals will have lower search rankings.
  • The changes also aim to distinguish between real and fake explicit images.

Google is improving systems to help remove non-consensual explicit AI-generated content from Search.

The Mountain View tech giant recently announced that it's making the removal process for such content easier. Google will also start penalizing websites that publish it by updating Search's ranking systems to lower the visibility of content like the Taylor Swift deepfakes that circulated earlier.

“For queries that are specifically seeking this content and include people’s names, we’ll aim to surface high-quality, non-explicit content — like relevant news articles — when it’s available,” Google explains.

“These protections have already proven to be successful in addressing other types of non-consensual imagery, and we’ve now built the same capabilities for fake explicit images as well,” Google adds in the announcement, saying that it’ll filter out similar explicit results and remove duplicates, too.

The aim is to reduce the visibility of such deepfakes by over 70% this year, Google says, and the company is also making sure that real explicit content isn't unfairly affected, such as consensual nude scenes performed by real actors in films.

It's been a challenging few years for AI tech makers when it comes to addressing such issues. The Internet Watch Foundation (IWF), a UK-based organization that fights child sexual abuse imagery, reported a surge in deepfake porn in recent months. The problem has grown serious enough that Microsoft urged Congress to enact a deepfake fraud statute and establish a specific legal framework.
