Apple’s Encryption Sent to the Crypt

By: Quan Nguyen

October 15, 2021


Those who cherish their private lives online won’t like what Apple is scheming. The Big Tech company plans to upgrade security for its users by scanning material sent via text or uploaded to iCloud. As expected, security comes at the expense of privacy.

Apple announced its new technology on August 5, 2021; the software allows the company to scan messages and photos on users’ devices. They intend to increase protections for children by searching for Child Sexual Abuse Material (CSAM) and penalizing offenders who harbor it. Instead of praising Apple’s drive for a safer internet, users have criticized the decision.

Many value Apple products for their end-to-end encryption, meaning only the sender and the recipient can view a text message. In this regard, Apple has made way for private and trustworthy communication, aside from the iOS 12.1 FaceTime bug that let callers hear audio from a device before the call was picked up. Apple’s upcoming policy similarly disregards user privacy by effectively putting an end to end-to-end encryption.

The update to Apple’s privacy and child safety policies allows for client-side scanning of user devices, which includes searching for CSAM in texts to and from minors and in photos uploaded to iCloud. iCloud backups are not end-to-end encrypted, so Apple can access and hand over those files. Upon scanning, images are hashed, that is, assigned a string of numbers representing the image, for comparison against known CSAM.
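
To make the comparison concrete, here is a minimal sketch of image hashing in Python. It assumes a plain cryptographic digest (SHA-256) and a hypothetical file name; Apple’s actual system uses NeuralHash, a perceptual hash designed to match visually similar images rather than byte-identical files.

```python
import hashlib

def image_fingerprint(path: str) -> str:
    """Return a hex digest standing in for an image hash.

    Illustrative only: Apple's system uses NeuralHash, a perceptual
    hash that matches visually similar images, not a byte-level
    cryptographic digest like SHA-256.
    """
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Hypothetical file name: two byte-identical files produce the same
# digest, so the fingerprint can be compared against a list of known hashes.
print(image_fingerprint("photo.jpg"))
```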

According to their announcement, Apple claims the purpose of the update is to protect children from predators. In carrying out that plan, they will scan photos as they are uploaded to iCloud, as well as ones sent and received via text, and compare them to a database of known CSAM run by the National Center for Missing & Exploited Children (NCMEC).

More specifically, children under 13 who send sexually explicit images will receive a warning, and if they continue, the parent account will be notified. Even more concerning, these photos are saved on the parent’s phone and cannot be deleted from the child’s device. Users between the ages of 13 and 17 receive a similar warning, but their parents are not notified if they proceed. Sexually explicit images sent to a child’s account are blurred, and parents will be notified should the child choose to view the image. These functions of client-side scanning require anyone who communicates with minors to exercise elevated caution to retain privacy.

The Center for Democracy & Technology also points out that notifying parents of children’s activity can threaten the wellbeing of minors. Specifically, they note that children in the LGBTQ+ community with strict parents are at risk. Many LGBTQ+ youth fear their parents’ reactions when disclosing their sexual orientation, and Apple will not make it easy on them.

Client-side scanning will happen on-device, using machine learning software to compare hashes of potential CSAM to hashes in the NCMEC database; this means the database itself will be distributed to all devices. Apple claims they will never see any photos unless they are flagged as CSAM, in which case the photos undergo manual review and violating accounts are disabled. Apple estimates that, with manual review in place, fewer than one in one trillion accounts would be falsely flagged each year due to system error or cyberattack. Still, many users worry that scanning leaves a backdoor into messaging and photo systems.
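
At its simplest, the on-device check amounts to set membership plus a threshold. The sketch below is a rough approximation using invented names and values (the hash strings, the threshold, and the function names are all hypothetical); Apple’s published design relies on blinded hash databases and cryptographic “safety vouchers” rather than a plain lookup the device can read.

```python
# Hypothetical stand-ins: real deployments use NeuralHash values and a
# blinded database that the device itself cannot inspect.
KNOWN_CSAM_HASHES = {"hash_value_1", "hash_value_2"}  # stand-in for the NCMEC-derived list
REVIEW_THRESHOLD = 30  # hypothetical number of matches before human review

def count_matches(uploaded_photo_hashes: list[str]) -> int:
    """Count how many uploaded-photo hashes appear in the known list."""
    return sum(1 for h in uploaded_photo_hashes if h in KNOWN_CSAM_HASHES)

def should_escalate(uploaded_photo_hashes: list[str]) -> bool:
    """Escalate an account to manual review only past the threshold."""
    return count_matches(uploaded_photo_hashes) >= REVIEW_THRESHOLD
```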

Backdoors in technology allow developers to access computer systems and encrypted data at a later point in time, sometimes defeating the purpose of end-to-end encryption. Each party to a conversation holds an end-to-end encryption key, which keeps developers from acting as a middleman in data sharing. These keys work like physical keys that only the sender and the intended recipients hold: only those parties can “unlock” the encrypted content. However, iCloud backups store a copy of that key, and because the backups themselves are not end-to-end encrypted, Apple can retrieve the copy. This alone creates a backdoor that allows Apple to access data, making iCloud the newest culprit in diminishing privacy.
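
A toy example makes the key argument clear. The sketch below uses the third-party Python `cryptography` package’s Fernet cipher as a stand-in for the symmetric keys real messaging protocols negotiate; the algorithms differ, but the point is the same: whoever holds a copy of the key can read the message, and whoever doesn’t, can’t.

```python
from cryptography.fernet import Fernet, InvalidToken

# Symmetric key shared only by sender and recipient (a stand-in for the
# keys real end-to-end encrypted messaging negotiates).
shared_key = Fernet.generate_key()
ciphertext = Fernet(shared_key).encrypt(b"see you at 7")

# The recipient, holding the key, can read the message.
print(Fernet(shared_key).decrypt(ciphertext))  # b'see you at 7'

# Anyone without the key cannot.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("no key, no plaintext")

# But if a backup service keeps a copy of shared_key, as iCloud backups
# effectively do, whoever holds the backup can decrypt just as easily.
print(Fernet(shared_key).decrypt(ciphertext))
```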

On June 23, 2020, a bill was introduced in Congress that would have required a backdoor into iPhones and prohibited strong encryption for tech giants. The bill never received a vote, but that doesn’t mean it could not have passed.

Even further back, the United States Federal Bureau of Investigation requested a backdoor into iPhones following the 2015 act of terrorism in San Bernardino, California. At the time, Apple said such a tool was “too dangerous to create”; implementing it would invite malicious use and the leaking of sensitive data. On paper, Apple is now keeping their stance by saying they would refuse government demands to append non-CSAM images to the hash list; what they potentially could do is a different story.

Since Apple’s initial announcement, they have received massive backlash from the community. The Electronic Frontier Foundation, a digital rights group, gathered over 27,000 signatures on a petition against the surveillance update, and a coalition of over 90 organizations signed an open letter to Apple opposing it. Apple decided to postpone the update and revise the plan over the coming months instead of releasing it alongside iOS 15 earlier this month. Evidently, the community’s voice has derailed Apple’s plan, and privacy can be protected so long as it’s fought for.

The privacy concerns raised here are obvious, although Apple denies any accusation that it is introducing a backdoor. “It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children,” said Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation. If the technology to spy on users exists, it will be exploited.

Despite Apple’s stated refusal to scan for non-CSAM, all that is needed to widen the backdoor is to adjust and expand the software’s parameters so that scans are no longer limited to sexually explicit content or to content from minors. Apple themselves admit that “it is certainly possible to create an entirely new operating system to undermine our security features as the government wants.”

If this all doesn’t sound invasive enough, the update will also involve Siri and Search in detecting CSAM: these systems will respond to CSAM-related queries if users initiate one. All of these additions impose technology that seems to actively monitor younger users and wears away at trust in the brand.

At the end of the day, Apple walks a fine line between privacy and security. They underestimated the backlash they would receive and need to arrive at a sensible solution that satisfies users. Of course, the community will continue to express skepticism as people’s digital lives grow in value. Unlike Apple, people cannot pretend these features will not be misused.


Quan Nguyen is a senior studying professional and public writing. He also currently works as a technical writer and aims to continue with that career path. When not writing, he plays guitar, plays video games, longboards or messes around with tech stuff.