Ex-Facebook employee says child safety team in disarray due to End-to-End Encryption plans




We reported recently that Facebook/Meta has postponed its plans for End-to-End Encryption (E2EE) on Messenger and Instagram until 2023, as the company works to resolve child safety issues.

Many critics have noted that WhatsApp already uses E2EE and felt the delays were unjustified.

Today David Theil, a former Facebook employee on its Child Safety team and current Chief Technology Officer of the Stanford Internet Observatory, posted a thread on Twitter criticizing both Facebook and its critics over the push toward E2EE on the service.

He noted that Facebook’s motives for pushing E2EE were far from pure, and had more to do with “preempting anti-trust action, less interaction with LE (law enforcement), significantly scaled back safety teams, and good marketing.”

In short, by hiding communication on Facebook from Facebook itself, the company would be less responsible for the content it transmits.

Theil said the plan was announced without any roadmap for implementing it, and with “an absurdly accelerated timeline,” before Facebook had even decided how to integrate its three messaging platforms (WhatsApp, Messenger, and Instagram).

Worse, tests showed that when Facebook could not inspect the content of messages (as would be the case under E2EE), it detected “child grooming, sextortion and CSAM distribution” at only 10% of the rate achieved when it did look at message content, meaning the majority of harm would escape detection.

With no clear plan to address this, and a “gotta break a few eggs” attitude from management, Theil says many prominent members of the child safety teams resigned.

Theil said the only way to maintain E2EE and still safeguard children would be client-side inspection, as Apple recently implemented, but, as Apple discovered, this approach was anathema to most security researchers and the public at large.

Theil notes that WhatsApp did, of course, already use E2EE, but due to the 1:1 nature of the network, it did not give predators ready access to children, unlike Facebook, which was designed to introduce people to new friends and expand their social networks.

Theil explained:

WhatsApp doesn’t recommend people to befriend and interact with. It doesn’t host secret groups of unlimited size. It doesn’t provide global search of every user. It doesn’t group people by location or institutions like high schools. Whereas Facebook tries to take existing social networks, merge them and build new ones. This has led to wildly inappropriate situations (including literally recommending victims to abusers) particularly when combined with contact sync and offsite pixel tracking.

Even then, a lot of abuse was still missed on the platform.

Theil concluded that until social networks themselves could be made inherently safe from child predators, E2EE should not be added to them.

Read his full thread here.

