10 August 2021

Apple counters privacy concerns over child safety features


US – Apple has defended its development of two new child safety tools, including one designed to identify child sexual abuse material on customers’ phones, following criticism of the tools’ privacy implications.


In a question-and-answer document published this week, Apple set out how the two tools would work and sought to counter suggestions the tools could act as a ‘backdoor’ to spy on people, particularly in authoritarian nations.

Communication safety in Messages is designed to analyse images on-device in the Messages app. Any image deemed sexually explicit is blurred, and the child is shown a warning and supporting information, along with reassurance that it is okay if they do not want to view or send the photo.
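
As a rough sketch of the kind of purely on-device decision the feature describes, the snippet below blurs an image and shows a warning only when a hypothetical on-device classifier score crosses a threshold; the type names, score and threshold are illustrative assumptions, not Apple’s actual API.

```swift
import Foundation

// Illustrative sketch only: `explicitScore` stands in for the output of a
// hypothetical on-device image classifier; none of these names are Apple's.
struct IncomingImageCheck {
    let explicitScore: Double       // 0.0 ... 1.0, computed entirely on-device
    let recipientIsChildAccount: Bool
}

enum DisplayDecision {
    case showNormally
    case blurWithWarning(message: String)
}

func decideDisplay(for check: IncomingImageCheck,
                   threshold: Double = 0.9) -> DisplayDecision {
    // Only opted-in child accounts get the intervention, and the decision
    // never leaves the device.
    guard check.recipientIsChildAccount, check.explicitScore >= threshold else {
        return .showNormally
    }
    return .blurWithWarning(
        message: "This photo may be sensitive. It's okay not to view it."
    )
}

let decision = decideDisplay(
    for: IncomingImageCheck(explicitScore: 0.95, recipientIsChildAccount: true)
)
// decision == .blurWithWarning(...): the image would be blurred and the
// warning shown, with nothing sent to Apple.
```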

Apple said that the tool does not change the privacy assurances of Messages, and that the company does not gain access to communications as a result of the feature.

The second tool, CSAM detection in iCloud Photos, provides information to Apple on any images in iCloud Photos that match known child sexual abuse material.

Apple said that the tool does not apply to any other on-device data, nor to photos stored outside iCloud Photos.
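
As a rough illustration of the matching concept described here (and only that concept; Apple’s actual design uses perceptual hashes of known abuse imagery and on-device cryptographic matching rather than anything this direct), a simplified sketch might look like the following, with hypothetical names and SHA-256 standing in for the real hash.

```swift
import Foundation
import CryptoKit

// Stand-in for a vetted, pre-installed list of hashes of known images.
// Empty here; in the real design the list comes from child-safety bodies.
func loadKnownHashDatabase() -> Set<String> {
    return []
}

let knownImageHashes = loadKnownHashDatabase()

/// Returns true only if this photo is part of an iCloud Photos upload and
/// its hash matches an entry in the known database; photos kept only on
/// the device are never checked.
func matchesKnownImage(photoData: Data, isBeingUploadedToiCloudPhotos: Bool) -> Bool {
    guard isBeingUploadedToiCloudPhotos else {
        return false
    }
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownImageHashes.contains(hex)
}
```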

The document from Apple also says that the company would refuse any requests by governments to broaden the scope of the tools beyond child sexual abuse material, and that the system would have a less than one in one trillion chance per year of incorrectly flagging a given account.
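
That figure rests on the system only flagging an account after a threshold number of matches, rather than on any single hit, so that isolated matches reveal nothing on their own. A minimal sketch of that thresholding step follows; the threshold value used is purely illustrative and not taken from this article.

```swift
// Sketch of threshold-based flagging: each matched image increments a
// per-account counter, and nothing is surfaced for review until the
// counter crosses a threshold.
struct AccountMatchState {
    private(set) var matchCount = 0
    let reviewThreshold = 30   // illustrative value only

    mutating func recordMatch() {
        matchCount += 1
    }

    // Only once the threshold is crossed does anything become visible for
    // review; a handful of matches on their own trigger nothing.
    var exceedsThreshold: Bool {
        matchCount >= reviewThreshold
    }
}

var state = AccountMatchState()
for _ in 0..<5 { state.recordMatch() }
print(state.exceedsThreshold)   // false – five matches alone flag nothing
```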

“We want to protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material,” the document says.

“Since we announced these features, many stakeholders including privacy organisations and child safety organisations have expressed their support of this new solution, and some have reached out with questions.

“This document serves to address these questions and provide more clarity and transparency in the process.”

A blog post by India McKinney and Erica Portnoy on the website of digital privacy charity Electronic Frontier Foundation said that the plans still amounted to a “backdoor” that could be used by governments to search devices for other purposes.

“Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor,” the blog said.

“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts.

“That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.”

Will Cathcart, head of WhatsApp at Facebook, also criticised Apple’s proposals on Twitter.

“I read the information Apple put out yesterday and I'm concerned,” Cathcart wrote.

“I think this is the wrong approach and a setback for people’s privacy all over the world. People have asked if we'll adopt this system for WhatsApp. The answer is no.”
