Encrypted messaging could increase child abuse cases, report warns

Millions of children in England are using messaging platforms that they are not old enough to be accessing and the introduction of end-to-end encryption (E2EE) could increase their risk of exploitation, the Children’s Commissioner for England has warned in a report.

The Commissioner’s report follows announcements by Facebook – and indications by other social platforms, such as Snap – that they plan to apply E2EE to all their messaging services.

The Commissioner said that E2EE makes it impossible for the platform itself to read the contents of messages and risks preventing police and prosecutors from gathering the evidence they need to prosecute perpetrators of child sexual exploitation and abuse.

In January 2020, the NSPCC said that the number of child abuse cases had risen to 90 per day and called for regulators to tackle the issue.

The report includes a survey revealing the extent of children’s use of messaging services, including by children much younger than the minimum age requirement.

Nine out of ten children aged 8 to 17 were found to be using messaging services, with 60 per cent of 8-year-olds and 90 per cent of 12-year-olds using a messaging app that carries a minimum age requirement of 13 or older. Almost one in ten children reported using a messaging service to talk to people they did not already know.

The report warns that the privacy of direct messaging platforms can conceal some of the most serious crimes against children, including grooming, exploitation and the sharing of child sexual abuse material.

An NSPCC investigation found that Facebook, Instagram and WhatsApp were involved in child abuse image and online child sexual offence cases an average of 11 times a day in 2019.

It also found that the rate of grooming offences committed in the UK appears to have further accelerated over the course of lockdown, with 1,220 offences recorded in just the first three months of national lockdown.

Anne Longfield, the Children’s Commissioner for England, called on the government to introduce online harms legislation to Parliament in 2021. The legislation should set a strong expectation on platforms to age-verify their users and allow for strong sanctions against companies that breach their duty of care, she said.

Longfield further recommended the inclusion of GDPR-style fines and a requirement to issue notifications to users when tech firms are found to be in breach of their duty of care.

“This report reveals the extent to which online messaging is a part of the daily lives of the vast majority of children from the age of 8. It shows how vigilant parents need to be, but also how the tech giants are failing to regulate themselves and so are failing to keep children safe,” Longfield said.

“The widespread use of end-to-end encryption could put more children at risk of grooming and exploitation and hamper the efforts of those who want to keep children safe.

“It has now been 18 months since the Government published its Online Harms White Paper and yet little has happened since, while the threat to children’s safety increases.

“It’s time for the Government to show it hasn’t lost its nerve and that it is prepared to stand up to the powerful internet giants, who are such a big part in our children’s lives. Ministers can show they mean business by promising to introduce legislation in 2021 and getting on with the job of protecting children from online harms.”