Apple criticized for lack of effort in reporting child sexual abuse on its platforms

Another recent report also reveals a surge in X-rated deepfakes


Key notes

  • The NSPCC criticized Apple for underreporting child sexual abuse material (CSAM) on its platforms.
  • Data shows Apple reported fewer CSAM cases to the NCMEC in 2023 compared to other tech giants like Google and Meta.
  • Despite plans for a CSAM detection tool, Apple abandoned it in 2022 due to privacy concerns.

The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) has criticized Apple for inadequately reporting child sexual abuse material (CSAM) on its platforms.

An exclusive report by the Guardian, based on NSPCC data, reveals that Apple’s iCloud, iMessage, and FaceTime were implicated in more CSAM cases in England and Wales alone than the company reported globally to the National Center for Missing & Exploited Children (NCMEC) in 2023.

Despite being required to report CSAM, Apple has filed significantly fewer reports than other tech giants like Google and Meta.

“Apple is clearly behind many of their peers in tackling child sexual abuse when all tech firms should be investing in safety and preparing for the rollout of the Online Safety Act in the UK,” said the organization’s spokesperson.

A July 2024 report by the Internet Watch Foundation, a UK-based organization, reveals a surge in AI-generated child sexual abuse material (CSAM), with over 3,500 new images and the emergence of realistic AI-generated videos, including deepfakes. It also notes a rise in such imagery on the clear web and the use of the likenesses of known victims and famous children.

Apple had previously planned to implement a photo-scanning tool to detect CSAM on iCloud Photos but abandoned it back in 2022 due to privacy concerns.

“Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all,” the iPhone maker said in a statement to Wired at the time.

The company’s latest Apple Intelligence offering, despite launching this year on its mobile devices, has not yet come to Europe. The decision stems from the Digital Markets Act’s interoperability requirements, which Apple says could compromise user privacy and security. The company plans to work with the EU to resolve these issues.