Apple recently announced a new set of features aimed at combating Child Sexual Abuse Material (CSAM), including the ability to scan a user’s phone and iMessages. Since the announcement, the company has reiterated the numerous safeguards it developed, but privacy advocates have decried the potential for abuse and “mission creep.”

The proliferation of CSAM has reached epidemic proportions. The National Center for Missing and Exploited Children (NCMEC), the U.S. organization responsible for tracking CSAM, has reported an exponential increase in the number of images and videos it receives, growing from 600,000 a decade ago to over 70 million in 2019.

Facebook has been one of the few high-profile electronic service providers (ESPs) to scan every image and video for CSAM, but as its former Chief Security Officer Alex Stamos explained in a recent Stanford Internet Observatory (SIO) video, the typical paradigm for CSAM detection is to scan content when it is shared (whether through a messaging app or as a shared album). Apple’s proposed system instead works locally on your phone if the feature is enabled, which raises the question of who actually owns your computing device if a tech company can start snooping on images and videos that are ostensibly private. Apple has stated that the feature will roll out in the U.S. first and will be deployed in other countries only after further examination. The company has also unequivocally stated that the technology will be used only for CSAM, and not to satisfy another country’s demands (e.g. identifying “terrorists” or political dissidents).

For photographers, the potential breach of privacy is concerning. Photographers of every ilk, from photojournalists to landscape photographers, have legitimate reasons for ensuring that content isn’t seen by anyone else (human or machine) until they choose to disseminate or publish their images. The notion of Canon, Sony, or Nikon running content scans on your camera is horrifying, and it’s not an unfair analogy.

Stanford Internet Observatory Research Scholar Riana Pfefferkorn made the point in a recent roundtable discussion that technology can’t fix the underlying sociological, historical, and poverty-fueled issues that lead to the conditions (particularly in Southeast Asia) where child rape and abuse can take place and be recorded. That said, most experts agree that without a coordinated and concerted effort on the part of ESPs, the proliferation of CSAM will continue unabated. Apple’s solution might catalyze a more coherent industry response, or it might prove to be a slippery slope: an ethical conundrum with a tragic human toll. We mention the following photographers, articles, and websites in this episode of Vision Slightly Blurred.

Apple will scan iPhones for child pornography (via Washington Post)
A Criminal Underworld of Child Abuse, Part 1 (via The Daily)
A Criminal Underworld of Child Abuse, Part 2 (via The Daily)
How Facial Recognition Is Fighting Child Sex Trafficking (via Wired)
Will Cathcart, head of WhatsApp, tweeted his strong disagreement with Apple
Apple Addresses CSAM Detection Concerns, Will Consider Expanding System on Per-Country Basis (via MacRumors)
Apple’s Plan to “Think Different” About Encryption Opens a Backdoor to Your Private Life (via EFF)
Security Researchers Express Alarm Over Apple’s Plans to Scan iCloud Images, But Practice Already Widespread (via MacRumors)

About the Author

Allen Murabayashi is a graduate of Yale University, the Chairman and co-founder of PhotoShelter, and a co-host of the “Vision Slightly Blurred” podcast on iTunes. For more of his work, check out his website and follow him on Twitter. This article was also published here and shared with permission.