A hot potato: During WWDC 2021 in June, Apple unveiled its upcoming device operating systems. It made a big deal about the expanded privacy features of iOS, iPadOS, and macOS Monterey. What it didn't elaborate on is its expanded protections for children, and for good reason. At face value, Apple's protections for children run contrary to its tough stance on user privacy.

In the most recent iOS 15 preview, Apple rolled out some features that have many privacy advocates, including the Electronic Frontier Foundation (EFF), crying "backdoor." The features are part of Apple's effort to crack down on Child Sexual Abuse Material (CSAM).

The first feature uses machine learning to look for potentially sensitive images within the Messages app on accounts of children under 12. If inappropriate material is received, the picture is blurred, and a notification tells the child it is okay not to view the photo and provides links to "helpful resources." The child is also informed that if they do open the image, their parents will be notified. The feature also works in the other direction: if the child attempts to send an explicit photo, they receive a warning that their parents will be notified if they send it.

Apple says that all AI processing is done on the device to protect users' privacy, and nothing is ever uploaded to Apple servers. The feature will work across all of Apple's device operating systems.
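Purely for illustration, here is a rough Swift sketch of what that on-device decision flow might look like. The classifier, the threshold, and every name below are assumptions for the sake of the example, not Apple's actual implementation.

```swift
import Foundation

// A minimal, hypothetical sketch of the decision flow described above.
// `sensitivityScore(for:)` is a stand-in for Apple's unnamed on-device
// classifier, and the 0.9 threshold is an assumption for illustration only.

struct IncomingImage {
    let data: Data
}

enum MessageImageAction {
    case showNormally
    case blurWithWarning(notifyParentsIfViewed: Bool)
}

// Placeholder: a real implementation would run a Core ML model entirely
// on the device, never sending the photo to Apple's servers.
func sensitivityScore(for image: IncomingImage) -> Double {
    return 0.0 // stub value
}

func action(for image: IncomingImage,
            isChildAccount: Bool,
            threshold: Double = 0.9) -> MessageImageAction {
    guard isChildAccount, sensitivityScore(for: image) >= threshold else {
        return .showNormally
    }
    // Blur the photo, link to "helpful resources," and warn that opening
    // the image will notify the child's parents.
    return .blurWithWarning(notifyParentsIfViewed: true)
}
```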

The second is called CSAM detection. CSAM refers to content that depicts sexually explicit activities involving a child. A database of known images from the National Center for Missing and Exploited Children (NCMEC) is downloaded and stored on the device as hash values. Before a user uploads a photo to iCloud, the AI compares hash values. If there are enough matches, the content is manually validated and then sent to NCMEC, which handles any legal action.
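Again as an illustration only, a Swift sketch of threshold-based hash matching might look like the following. Apple's real system uses a perceptual "NeuralHash" and cryptographic private set intersection rather than the simple exact-match digest shown here, and the type and method names below are hypothetical.

```swift
import Foundation
import CryptoKit

// A simplified sketch of threshold-based matching, using an ordinary SHA-256
// digest and an exact-match set purely for illustration.

struct CSAMMatcher {
    let knownHashes: Set<String>   // hashes derived from the NCMEC database
    let reviewThreshold: Int       // matches required before human review

    func hash(of photo: Data) -> String {
        SHA256.hash(data: photo).map { String(format: "%02x", $0) }.joined()
    }

    // Returns true when the photos queued for iCloud upload cross the
    // threshold and should be escalated for manual validation before any
    // report is made to NCMEC.
    func shouldEscalate(photosPendingUpload: [Data]) -> Bool {
        let matchCount = photosPendingUpload
            .filter { knownHashes.contains(hash(of: $0)) }
            .count
        return matchCount >= reviewThreshold
    }
}
```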

While nobody would argue against keeping children safer, it is Apple's approach that is raising hackles. The EFF feels that the new features open the door for governments to pressure Apple into scanning for other content they have deemed illegal.

"If these features work as described and only as described, there's almost no cause for concern. But the 'if' is the rub."---John Gruber

"That's not a slippery slope; that's a fully built system just waiting for external pressure to make the slightest change," the EFF said. "At the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor."

Others take a contrary view, arguing that the move could ultimately be better for security. Tech blogger John Gruber, the inventor of the Markdown markup language, wrote in Daring Fireball:

"In short, if these features work as described and only as described, there's almost no cause for concern. But the 'if' in 'if these features work as described and only as described' is the rub. That 'if' is the whole ballgame. If you discard alarmism from critics of this initiative who clearly do not understand how the features work, you're still left with completely legitimate concerns from trustworthy experts about how the features could be abused or misused in the future."

Gruber speculates that Apple may see this as an initial step in implementing end-to-end encryption in iCloud.

It also raises Fourth Amendment issues. Does scanning a device, regardless of how securely it is done, violate the Fourth Amendment's protections against warrantless search and seizure? The technology, and the way it is being implemented, would seem to amount to a backdoor by proxy for law enforcement to search a phone without probable cause.

Apple critics are sure to lambaste the company and its platform for this move, while many fans will welcome it, reasoning that they have no CSAM on their devices anyway and would like to see their children protected.

Regardless of how you look at it, this is undoubtedly a controversial issue that the community will hotly debate in the weeks leading up to the fall release of Apple's operating systems. Before jumping to one side of the fence or the other, you should read Apple's explanation as well as the several relevant FAQs and technical documents posted to its website.