A CONCERNING 94% of the 69 million child sex abuse images picked up by US tech firms last year were found on Facebook.
This figure has been highlighted again as seven countries, including the UK and the US, released a statement warning of the negative impacts of end-to-end encryption.
End-to-end encryption means no one apart from the sender and the recipient can read or alter a message, unless one of those two shares it with someone else.
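Real messaging apps use vetted protocols (such as the Signal protocol that underpins WhatsApp), but the basic idea can be sketched with a toy one-time pad in Python. Everything here, including the `xor_bytes` helper and the variable names, is illustrative only, not how any actual app works:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the matching key byte (toy one-time pad)."""
    return bytes(d ^ k for d, k in zip(data, key))

# The sender and recipient share a secret key; the messaging service never sees it.
message = b"Meet at noon"
shared_key = secrets.token_bytes(len(message))  # random key as long as the message

ciphertext = xor_bytes(message, shared_key)     # all the server ever handles
decrypted = xor_bytes(ciphertext, shared_key)   # only the key holders can recover it

assert decrypted == message
```

Because the server only ever sees `ciphertext`, neither the platform nor anyone scanning its systems can inspect the content, which is exactly the property law enforcers say would cut off their leads.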
This is already used on WhatsApp, which is owned by Facebook, but the firm has plans to bring it to Instagram Direct messages and the FB Messenger app.
The aim is to protect user privacy but law enforcers are worried it could hamper their attempts to protect children and target paedophiles.
The 69 million images of children being sexually abused were reported to the US National Center for Missing and Exploited Children (NCMEC) in 2019.
Officials said Facebook accounted for both the largest number of images and the most severe category of images.
There are concerns that the number of illegal images reported could drop to zero if end-to-end encryption is put in place, according to the National Crime Agency (NCA).
According to Sky News, Robert Jones, the NCA director responsible for tackling child sexual abuse, said: “The lights go out, the door gets slammed, and we lose all of that insight. It is as simple as that.
“And nothing, you know we’re relying on the best technical expertise… in the UK, the same people that keep the UK safe against terrorists, hostile states, cyber attacks, are telling us there is no viable alternative. I believe them. And I am deeply concerned.”
The NCA says there are at least 300,000 people in the UK who pose a sexual threat to children.
Industry reports of online images in the past year led to thousands of arrests and are thought to have safeguarded 6,000 children.
Mr Jones added: “The end-to-end encryption model that’s being proposed takes out of the game one of the most successful ways for us to identify leads, and that layers on more complexity to our investigations, our digital media, our digital forensics, our profiling of individuals and our live intelligence leads, which allow us to identify victims and safeguard them.
“What we risk losing with these changes is the content, which gives us the intelligence leads to pursue those offenders and rescue those children.”
The UK government has asked Facebook to reconsider its end-to-end encryption plans, along with officials from the US, Australia, New Zealand, Canada, India and Japan.
They’re calling for more public safety measures and for law enforcement and governments to be able to access content.
A Facebook company spokesperson said: “We’ve long argued that end-to-end encryption is necessary to protect people’s most private information.
“In all of these countries, people prefer end-to-end encrypted messaging on various apps because it keeps their messages safe from hackers, criminals, and foreign interference.
“Facebook has led the industry in developing new ways to prevent, detect, and respond to abuse while maintaining high security and we will continue to do so.”
Facebook’s biggest cyber-security mistakes
Here are some of the major times Facebook let us down…
- In 2007, Facebook’s first targeted advertising product, Beacon, caused outrage because there was initially no opt-in option about the kinds of information users wanted to share
- In 2009, a Federal Trade Commission investigation was triggered because Facebook users complained that the new privacy tools were too confusing and pushed users to make more of their personal information public
- In 2010, it was revealed that advertisers were using a privacy loophole to retrieve revealing personal information about Facebook users and the company had to change its software
- In 2011, the FTC charged Facebook with telling users their information could be kept private while making it public anyway
- In 2018, Facebook faced its biggest privacy scandal to date, with reports that Cambridge Analytica misused user data and Facebook having to admit that it had failed to protect its users