Meta, Facebook's parent company, has said that messaging
encryption on its apps will now come in 2023.
The process means only the sender and receiver can read
messages - neither law enforcement nor Meta itself can access them.
However, child protection groups and politicians have warned
that it could hamper police investigations into child abuse.
The National Society for the Prevention of Cruelty to
Children (NSPCC) has claimed that private messaging "is the front line of
child sexual abuse".
UK Home Secretary Priti Patel has also criticised the
technology, saying earlier this year that it could "severely hamper"
law enforcement in pursuing criminal activity, including online child abuse.
Privacy vs Protection
End-to-end encryption works by "scrambling" or
encrypting the data while it travels between phones and other devices.
Usually the only way for anyone else to read a message is to
gain physical access to an unlocked device that sent or received it.
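To illustrate the principle only - this is a minimal sketch, not Meta's actual protocol - the example below uses the PyNaCl library (Python bindings for libsodium). Each device holds a private key that never leaves it, so anything relayed between them is just scrambled bytes; the key names and message are illustrative assumptions.

```python
# Minimal end-to-end encryption sketch using PyNaCl (pip install pynacl).
# Illustrative only - not Meta's actual implementation.
from nacl.public import PrivateKey, Box

# Each device generates a key pair; private keys never leave the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts using her private key and Bob's *public* key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"hello, only Bob can read this")

# In transit (and on any relaying server) the message is unreadable.
# Only Bob, holding his private key, can decrypt it.
receiving_box = Box(bob_private, alice_private.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"hello, only Bob can read this"
```

Because the server only ever relays ciphertext, neither the platform operator nor anyone intercepting the traffic can recover the message without one of the devices' private keys - which is precisely what concerns investigators.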
The technology is the default for popular messaging service
WhatsApp, also owned by Meta - but not the company's other apps.
The NSPCC sent Freedom of Information requests to 46 police
forces across England, Wales, and Scotland asking them for a breakdown of the
platforms used to commit sexual offences against children last year.
The responses revealed:
- more than 9,470 instances of child sex abuse images and online child sex offences were reported to police
- 52% of these took place on Facebook-owned apps
- over a third of the cases took place on Instagram, and 13% on Facebook and Messenger, with very few occurring via WhatsApp
That has led to fears that Meta's plans to expand encryption
to widely-used Facebook Messenger and Instagram direct messages could shield
the majority of abusers from detection.
The NSPCC said that encrypting messages by default could
make it easier to spread child abuse imagery or to groom children online.
But advocates say that encryption protects users' privacy,
and prevents prying by both governments and unscrupulous hackers. Meta chief executive
Mark Zuckerberg made those arguments himself when he announced Facebook's
encryption plans in 2019.
'Getting it right'
Antigone Davis, Meta's global head of safety, said that the
delay in rolling out encryption until 2023 was because the company was taking its
time "to get this right".
The company had previously said the change would happen in
2022 at the earliest.
Ms Davis said: "As a company that connects billions of
people around the world and has built industry-leading technology, we're
determined to protect people's private communications and keep people safe
online."
She also outlined a number of additional preventative
measures the company had already put in place, including:
- "proactive detection technology" that scans for suspicious patterns of activity such as a user who repeatedly sets up new profiles, or messages a large number of people they do not know
- placing under-18 users into private or "friends only" accounts by default, and restricting adults from messaging them if they aren't already connected
- educating young people with in-app tips on how to avoid unwanted interactions
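To make the "suspicious patterns" idea concrete, here is a hypothetical sketch of such a check. The thresholds, event format, and rule are assumptions for illustration, not Meta's actual system; notably, a check like this inspects metadata (who messages whom), not message content, so it could still run after content is encrypted.

```python
# Hypothetical metadata-based detection sketch - thresholds, data shapes,
# and the rule itself are illustrative assumptions, not Meta's system.
from collections import defaultdict

NEW_CONTACT_THRESHOLD = 20  # assumed limit on unconnected recipients per day

def flag_suspicious_senders(message_events, friendships):
    """message_events: iterable of (sender_id, recipient_id) pairs for one day.
    friendships: set of frozenset({a, b}) pairs representing connections."""
    unknown_recipients = defaultdict(set)
    for sender, recipient in message_events:
        # Count only messages to people the sender is not connected to.
        if frozenset({sender, recipient}) not in friendships:
            unknown_recipients[sender].add(recipient)
    # Flag accounts that messaged many people they do not know.
    return {user for user, targets in unknown_recipients.items()
            if len(targets) >= NEW_CONTACT_THRESHOLD}

# Example: an account messaging 25 strangers in a day gets flagged.
events = [("spammer", f"user{i}") for i in range(25)]
print(flag_suspicious_senders(events, friendships=set()))  # {'spammer'}
```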
Andy Burrows, head of child safety online policy at the
NSPCC, welcomed the delay by Meta.
He said: "They should only go ahead with these measures
when they can demonstrate they have the technology in place that will ensure
children will be at no greater risk of abuse.
"More than 18 months after an NSPCC-led a global
coalition of 130 child protection organisations raised the alarm over the
danger of end-to-end encryption, Facebook must now show they are serious about
the child safety risks and not just playing for time while they weather
difficult headlines."