Apple Scans Photos Uploaded to iCloud to Check Possible Cases of Child Abuse

Apple’s chief privacy officer, Jane Horvath, revealed during the CES 2020 event that the Cupertino-based tech giant scans images backed up on iCloud from its devices such as the iPhone and the iPad to check for possible cases of child abuse.

Without going too deep into the image scanning methods, Horvath said, “We are utilizing some technologies to help screen for child sexual abuse material.” Most other tech companies, including Google, Facebook, and Twitter, use a system called PhotoDNA, which screens uploads by comparing each image’s signature against a database of pictures already identified as containing child sexual abuse material. Apple may be using the same PhotoDNA system to recognize such pictures among users’ images.
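The matching flow described above can be sketched as follows. This is only an illustration of the general idea: PhotoDNA itself is a proprietary Microsoft technology that uses robust perceptual hashes designed to survive resizing and recompression, not the plain cryptographic hash used here, and the signature database below is entirely made up.

```python
import hashlib

# Hypothetical database of signatures of already-identified images.
# Real systems like PhotoDNA use perceptual hashes; SHA-256 is a
# stand-in purely to show the lookup flow.
KNOWN_SIGNATURES: set[str] = set()

def signature(image_bytes: bytes) -> str:
    """Compute a signature for an uploaded image's raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(image_bytes: bytes) -> bool:
    """Return True if the image's signature matches a known entry."""
    return signature(image_bytes) in KNOWN_SIGNATURES

# Populate the mock database and screen two uploads.
KNOWN_SIGNATURES.add(signature(b"previously identified image"))
print(is_flagged(b"previously identified image"))  # True
print(is_flagged(b"unrelated holiday photo"))      # False
```

The key property, which the quote from Apple’s disclaimer below also describes, is that the service never needs to interpret the image’s content at scan time; it only checks a signature against a list, much like an email spam filter.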

According to Horvath, Apple is not scanning the images by removing encryption, reassuring the brand’s customers that the privacy of their data on Apple’s servers remains intact. She added, “End to end encryption is critically important to the services we come to rely on…. health data, payment data. Phones are relatively small they get lost and stolen. We need to make sure that if you misplace that device you’re not (exposing that data).” Although Apple didn’t reveal how long it has been scanning users’ images, it certainly isn’t doing so in secrecy, as a disclaimer on Apple’s website states:

“Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. As part of this commitment, Apple uses image-matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation. Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled.”

Our Take

With each passing day, government agencies put more and more pressure on tech companies to let them scan users’ data for potential threats. While brands like Apple have so far succeeded in refusing to reveal users’ data, it is hard to say how long they can keep doing so.

If you are really concerned about keeping your data private, the only way to ensure that is to store it on offline media such as a pen drive or a Blu-ray disc.

Do you approve of Apple’s decision to scan users’ images for child sexual abuse material? Do let us know in the comments below.