Apple accused of underreporting child sexual abuse material on its platforms


Apple has been accused by a leading child protection charity of underreporting instances of child sexual abuse material found on its platforms.

The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) says data gathered through freedom of information requests implicates Apple in hundreds of child sexual abuse material (CSAM) incidents in England and Wales, more than the company officially reported globally in a year.

That’s according to a new report from The Guardian, which states: “The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) accuses Apple of vastly undercounting how often child sexual abuse material (CSAM) appears in its products.” The report claims that in just one year “child predators used Apple’s iCloud, iMessage, and Facetime to store and exchange CSAM in a higher number of cases in England and Wales alone than the company reported across all other countries combined, according to police data obtained by the NSPCC.”

Apple blasted over CSAM failures 

Specifically, the reported data purportedly reveals that “Apple was implicated in 337 recorded offenses of child abuse images between April 2022 and March 2023 in England and Wales.” That’s a number higher than Apple’s 267 reported instances of suspected CSAM on its platforms worldwide, as lodged with the National Center for Missing & Exploited Children (NCMEC). By way of contrast, Google reported 1.4 million instances and Meta 30.6 million in 2023. 

The NSPCC’s head of child safety online policy Richard Collard said there was “a concerning discrepancy between the number of UK child abuse image crimes taking place on Apple’s services and the almost negligible number of global reports of abuse content they make to authorities.” He stated that the Cupertino company was “clearly behind” many of its peers in tackling the proliferation of CSAM content. 

It’s not the first time Apple has been accused of failing to stop child sexual abuse material; a similar accusation was leveled by authorities in Australia in 2022. Apple tried to introduce controversial new protections against CSAM in 2021, using technology that would check the hashes of photos uploaded to iCloud against a database of known CSAM content. However, the move was met with staunch pushback from privacy advocates, and Apple quietly scrapped the plan before the end of the year.
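For context on how that kind of matching works at a basic level, here is a minimal, illustrative Swift sketch. It is not Apple's implementation: the 2021 proposal relied on a perceptual "NeuralHash" and on-device private set intersection, whereas this example substitutes a plain SHA-256 digest and a hypothetical in-memory set of known hashes (knownHashes, digest(of:), and shouldFlag are invented names for illustration).

import Foundation
import CryptoKit

// Minimal, illustrative sketch only. Apple's 2021 proposal used a perceptual
// hash ("NeuralHash") plus private set intersection; this example substitutes
// a plain SHA-256 digest and a hypothetical in-memory set of known hashes.

/// Hypothetical database of hex-encoded digests of known abusive images.
/// In Apple's proposal, this data would be supplied by child-safety organizations.
let knownHashes: Set<String> = []

/// Returns a hex-encoded SHA-256 digest of the photo's raw bytes.
func digest(of photoData: Data) -> String {
    SHA256.hash(data: photoData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// True if the photo's digest appears in the known-hash database,
/// i.e. the upload would be flagged for review.
func shouldFlag(_ photoData: Data) -> Bool {
    knownHashes.contains(digest(of: photoData))
}

The simplification matters: a cryptographic hash like SHA-256 only matches byte-identical files, which is one reason Apple's design used a perceptual hash that tolerates resizing and re-encoding.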

Preventing the spread of CSAM remains a pressing issue, and the EU is currently preparing to adopt legislation that would see “all your digital messages” scanned on iPhones and beyond, even to the detriment of end-to-end encryption.


Stephen Warwick
News Editor

Stephen Warwick has written about Apple for five years at iMore and previously elsewhere. He covers all of iMore's latest breaking news regarding all of Apple's products and services, both hardware and software. Stephen has interviewed industry experts in a range of fields including finance, litigation, security, and more. He also specializes in curating and reviewing audio hardware and has experience beyond journalism in sound engineering, production, and design. Before becoming a writer, Stephen studied Ancient History at university and also worked at Apple for more than two years. Stephen is also a host on the iMore show, a weekly podcast recorded live that discusses the latest in breaking Apple news and features fun trivia about all things Apple. Follow him on Twitter @stephenwarwick9.

  • Just_Me_D
    To me, this is akin to blaming a vehicle rental company for renting a van to drug dealers who transported drugs to distribute in various neighborhoods, where a typically good kid tried some and ended up overdosing.

    Someone is always going to find a way to use a good solution to do bad things.

    Blaming Apple or any other entity for ‘underreporting’ will NOT in any way, shape or form, make the problem of CSAM go away. People always blame or attempt to put pressure on everyone except the perpetrators.
  • FFR
    Just_Me_D said:
    To me, this is akin to blaming a vehicle rental company for renting a van to drug dealers who transported drugs to distribute in various neighborhoods, where a typically good kid tried some and ended up overdosing.

    Someone is always going to find a way to use a good solution to do bad things.

    Blaming Apple or any other entity for ‘underreporting’ will NOT in any way, shape or form, make the problem of CSAM go away. People always blame or attempt to put pressure on everyone except the perpetrators.

    Just like gimping AirTags for everyone, just because some people would use them for stalking purposes.

    How about not gimping them and putting those stalkers in jail instead?