Apple software chief admits child protection measures have been 'widely misunderstood'
What you need to know
- Apple announced a series of controversial child protection measures last week.
- The company's head of software, Craig Federighi, has admitted the measures have been "widely misunderstood".
- He told The Wall Street Journal Apple wished the measures had come out "a little more clearly."
Apple's head of software, Craig Federighi, has told The Wall Street Journal's Joanna Stern that the company's new Child Safety measures have been "widely misunderstood."
In an interview, Federighi said Apple wished the measures had come out "a little more clearly," following a wave of controversy and adverse reaction. Federighi told Stern that "in hindsight," announcing its new CSAM detection system and a new Communication Safety feature for detecting sexually explicit photos at the same time was "a recipe for this kind of confusion."
Federighi said that "it's really clear a lot of messages got jumbled pretty badly" in the wake of the announcement.
On the idea that Apple was scanning people's phones for images, Federighi said "this is not what is happening." He said, "to be clear we're not actually looking for child pornography on iPhones... what we're doing is finding illegal images of child pornography stored in iCloud". Noting how other cloud providers scan photos in the cloud to detect such images, Federighi said that Apple wanted to be able to detect this without looking at people's photos, doing it in a way that is much more private than anything that has been done before.
Federighi stated that "a multi-part algorithm" that performs a degree of analysis on-device so that a degree of analysis can be done in the cloud relating to detecting child pornography. Federighi did in fact state the threshold of images is "something on the order of 30 known child pornographic images," and that only when this threshold is crossed does Apple know anything about your account and those images and not any other images. He also reiterated Apple isn't looking for photos of your child in the bath, or pornography of any other sort.
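Taken at face value, the flow Federighi outlines can be sketched in a few lines. The sketch below is illustrative only: the names (`PerceptualHash`, `SafetyVoucher`, `voucher(for:)`) are hypothetical stand-ins, and Apple's real protocol uses NeuralHash, a blinded database, and threshold secret sharing to hide the match bit cryptographically rather than storing it as a plain boolean.

```swift
import Foundation

/// Hypothetical stand-in for a NeuralHash-style perceptual hash of one photo.
typealias PerceptualHash = String

/// Hashes of known CSAM images. In the real system this database is blinded,
/// so the device itself cannot read its contents. Values here are placeholders.
let knownHashes: Set<PerceptualHash> = ["hashA", "hashB"]

/// On-device step: each photo uploaded to the cloud library is matched against
/// the database, producing an opaque "voucher" that travels with the upload.
struct SafetyVoucher {
    let matched: Bool // in the real protocol this bit is cryptographically hidden
}

func voucher(for photoHash: PerceptualHash) -> SafetyVoucher {
    SafetyVoucher(matched: knownHashes.contains(photoHash))
}

/// Server-side step: nothing about an account is learned until the number of
/// matching vouchers crosses the threshold Federighi cites (~30).
let matchThreshold = 30

func accountFlagged(vouchers: [SafetyVoucher]) -> Bool {
    vouchers.filter(\.matched).count >= matchThreshold
}
```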
Pressed about the on-device nature of the feature, Federighi said it was a "profound misunderstanding," and that CSAM scanning was only applied as part of the process of storing something in the cloud, not as processing running over images stored on your phone.
On the question of why now, Federighi said that Apple had finally "figured it out" and had wanted to deploy a solution to the issue for some time, rather than caving to pressure from elsewhere, as some have suggested. Federighi also said CSAM scanning was in "no way a back door" and that he didn't understand that characterization. He noted that if someone were scanning for images in the cloud, no one could know what they were looking for, whereas here the on-device database ships to all devices in all countries regardless of location. And for people not confident that Apple would say no to a government, he assured them that there were multiple levels of auditability such that Apple couldn't get away with trying to scan for something else.
Federighi also explained Communication Safety in Messages, which uses machine learning to detect sexually explicit images in photos received by children, and clarified that the tech is "100% different" from CSAM scanning. He said Apple was confident in the tech but acknowledged it could make mistakes, although he noted Apple had a hard time coming up with images that would fool the system.
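To make that distinction concrete, here is a rough sketch of how such a local classifier gate might work. Everything here is assumed for illustration: `explicitScore(for:)` and the 0.9 cutoff are invented, since Apple has not published the model or its thresholds. The point is simply that, unlike CSAM detection, nothing is matched against a database and nothing leaves the device.

```swift
import Foundation

struct IncomingPhoto {
    let data: Data
    var isBlurred = false
}

/// Hypothetical stand-in for Apple's on-device ML model, which returns a
/// confidence that the image is sexually explicit. A real implementation
/// would run model inference here; this placeholder always returns 0.
func explicitScore(for photo: IncomingPhoto) -> Double {
    0.0
}

/// Local-only decision: if the classifier's score crosses the (assumed)
/// threshold, blur the photo before showing it to the child.
func screen(_ photo: inout IncomingPhoto) {
    if explicitScore(for: photo) > 0.9 {
        photo.isBlurred = true
    }
}
```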
Apple's new measures have drawn the ire of some privacy advocates and security experts.