Apple’s New CSAM Detection Policy Analysis

Several times a year, there seem to be current events or topics that strike a chord both inside and beyond the digital forensic community. We’ve discussed these in previous articles with regard to the Carpenter v. US decision and Apple’s previous spat with the FBI in the wake of the San Bernardino, CA terrorist attack.

As no stranger to these current-event discussions (i.e., controversy) on matters of privacy and cooperation with law enforcement, Apple dropped another “bombshell” last week: in a new US-based update, it will subject users’ on-device photos to hash analysis in an attempt to detect known child sexual abuse material (CSAM) among images uploaded to iCloud, forwarding matches for follow-up to the National Center for Missing & Exploited Children (NCMEC) or another law enforcement investigative entity. Here, we’ll discuss how this works and explore both sides of the issue.
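At its core, the approach compares a fingerprint of each photo against a database of fingerprints of known CSAM. The sketch below is a deliberately simplified illustration of that idea only: Apple's actual system uses a perceptual hash (NeuralHash) plus cryptographic techniques such as threshold secret sharing, not a plain cryptographic digest, and the sample data and function names here are hypothetical.

```python
import hashlib

# Hypothetical placeholder standing in for one entry in a database of
# known-material fingerprints (illustration only, not real data).
_FLAGGED_SAMPLE = b"example flagged image bytes"
KNOWN_HASHES = {hashlib.sha256(_FLAGGED_SAMPLE).hexdigest()}


def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest standing in for a perceptual image hash.

    A real perceptual hash would match visually similar images even after
    resizing or re-encoding; SHA-256 only matches byte-identical files.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def matches_known_database(image_bytes: bytes) -> bool:
    """True if the photo's fingerprint appears in the known-material set."""
    return fingerprint(image_bytes) in KNOWN_HASHES
```

In the deployed design, this comparison happens on the device before upload, and only photos destined for iCloud are checked.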

Copyright © 2020 Investigators Toolbox. All rights reserved
