Apple’s Plan to Scan for Child Porn Leaves You Defenseless

  • Mark Nestmann
  • September 7, 2021

We’re proponents of encryption. When you properly encrypt your data, only you and those you intend to share it with can listen to your voice calls or read your electronic messages. You can even encrypt your entire data stream to protect all records of your online life from prying eyes.

But encryption has many enemies. We’ve repeatedly warned about Uncle Sam’s efforts to force companies that manufacture communications devices or build the networks they run on to insert “back doors” into those products. (A back door is a method of bypassing normal security measures to access a computer system, network, or software application.)
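To make the concept concrete, here’s a deliberately simplified sketch of what a back door looks like in code: a login check that quietly accepts a second, hidden credential. Every name and value in it is invented for illustration.

```python
import hmac

# A hypothetical login check. The hard-coded value below is what a
# "back door" means in code: a hidden credential that bypasses the
# normal security check entirely.
MASTER_KEY = "0xDEADBEEF"  # known only to whoever planted it

def verify_login(user_password: str, stored_password: str) -> bool:
    if hmac.compare_digest(user_password, MASTER_KEY):
        return True  # the back door: skips the real check
    return hmac.compare_digest(user_password, stored_password)

# The legitimate path works as expected...
assert verify_login("hunter2", "hunter2")
# ...but so does the hidden one, against any account:
assert verify_login("0xDEADBEEF", "correct horse battery staple")
```

Anyone who learns the hidden value, hackers included, gets the same access the back door’s creators have. That’s the core problem.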

Back doors are a terrible idea. Strong encryption is the only certain way to protect sensitive data, and a back door defeats it by design. There’s also the very real prospect that hackers will discover the back door, as has happened on numerous occasions in the past.

For instance, when encryption tools first became accessible to individuals in the 1990s, police and intelligence agencies worried about “going dark” – not being able to monitor the encrypted communications of criminals and terrorists. The Clinton administration responded with a proposal for an electronic circuit called the “Clipper Chip.” With Clipper, however, the government would hold a key that could be used to unlock encrypted conversations – the back door.
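Mechanically, key escrow looks something like the sketch below. It uses the modern Python `cryptography` package purely for readability; the real Clipper chip used the Skipjack cipher and a per-device escrowed key, so every name here is illustrative, not Clipper’s actual protocol.

```python
from cryptography.fernet import Fernet  # pip install cryptography

escrow_key = Fernet.generate_key()    # held by the government
session_key = Fernet.generate_key()   # protects one conversation

# Normal encryption between the two parties:
ciphertext = Fernet(session_key).encrypt(b"a private phone call")

# Clipper's twist: a copy of the session key, encrypted to the escrow
# key, travels alongside the traffic (the "Law Enforcement Access Field"):
leaf = Fernet(escrow_key).encrypt(session_key)

# Later, with neither party's consent, the escrow key holder can read it:
recovered = Fernet(escrow_key).decrypt(leaf)
print(Fernet(recovered).decrypt(ciphertext))  # b'a private phone call'
```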

Essentially, the government wanted tech companies to break their own encryption methods and share knowledge of those intentional flaws with law enforcement. Fortunately, Congress refused to go along with the scheme after researcher Matt Blaze discovered a serious flaw in the Clipper design: the field carrying the escrowed key could be forged, letting users defeat the government’s back door entirely.

But law enforcement agencies have kept trying to convince Congress to mandate back doors. These requests tend to coincide with events in which criminals are said to be using “unbreakable” encryption.

One example came in the wake of the 2015 attacks in Paris, where a squad of Islamic militants massacred 130 people in restaurants, theaters, and other locations. The attackers supposedly used encryption to communicate. Former acting CIA director Michael Morell was one of many who floated this theory, although it was later found to be baseless. And right on cue, then-FBI Director James Comey proposed forcing US phone companies, internet service providers, and social media companies to insert back doors in their products.

At the same time, federal agencies have used the courts to try to make companies break their own encryption. For instance, in 2020, the FBI went to court to force Apple to help it unlock two iPhones purportedly used by a Saudi air force officer who murdered three people in a 2019 attack at Naval Air Station Pensacola, Florida. To its credit, Apple fought back in court.

But Apple is tired of fighting. Last month, the company announced that forthcoming software updates for US customers will include tools that automatically scan your iPhone and iPad for child porn. Before images on your device are uploaded to iCloud, they will be compared against a database of “known images of child pornography.” Positive matches will be reviewed by humans, and confirmed results forwarded to the National Center for Missing and Exploited Children (NCMEC). In turn, the NCMEC will presumably turn the offending photos over to the FBI.
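Under the hood, matching “known images” means comparing digital fingerprints (“hashes”), not pixels. Apple’s system uses a proprietary perceptual hash called NeuralHash, plus cryptographic blinding we won’t attempt here; the toy sketch below substitutes a classic “average hash” simply to show what fingerprint matching looks like, and the database entries are invented.

```python
def average_hash(pixels: list[list[int]]) -> int:
    """64-bit fingerprint of an 8x8 grid of grayscale values (0-255)."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (p >= mean)  # 1 if brighter than average
    return bits

def hamming(a: int, b: int) -> int:
    """How many bits differ between two fingerprints."""
    return bin(a ^ b).count("1")

# Hypothetical on-device database: fingerprints of known images,
# never the images themselves.
KNOWN_HASHES = {0x8F3A6C21D4B0957E, 0x13579BDF02468ACE}

def flagged(pixels: list[list[int]], max_distance: int = 5) -> bool:
    h = average_hash(pixels)
    return any(hamming(h, k) <= max_distance for k in KNOWN_HASHES)
```

The fuzzy matching is deliberate: a cropped or recompressed copy of a known image still lands within a few bits of the original fingerprint. That same fuzziness is exactly where false positives come from.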

And if Apple mistakenly flags an image as constituting child pornography? Well, don’t worry, the company says. It claims there’s “less than a one in one trillion chance per year of incorrectly flagging a given account.” And besides, if a photo of your naked baby taking a bath is wrongly pegged as child porn, there’ll be an appeals process in place to contest it. Assuming, of course, you’re not already in prison.
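Keep in mind what that figure actually is: an account-level number that depends entirely on the per-image false-match rate and on how many matches trigger a report, neither of which outsiders can verify. Here’s a back-of-envelope model showing how sensitive the claim is to those assumptions; every number in it is ours, not Apple’s.

```python
from math import exp, lgamma, log

def poisson_tail(lam: float, threshold: int) -> float:
    """P(X >= threshold) for X ~ Poisson(lam), summed term by term."""
    total = 0.0
    for k in range(threshold, threshold + 1000):
        term = exp(k * log(lam) - lam - lgamma(k + 1))
        total += term
        if term < total * 1e-15:  # remaining terms are negligible
            break
    return total

def p_account_flagged(n_photos: int, p_false: float, threshold: int) -> float:
    # Poisson approximation to the binomial; accurate when p_false is tiny.
    return poisson_tail(n_photos * p_false, threshold)

# Assume 20,000 photos a year and a 10-match reporting threshold:
print(p_account_flagged(20_000, 1e-6, 10))  # ~3e-24: Apple-league odds
print(p_account_flagged(20_000, 1e-4, 10))  # ~5e-5: about one account in 20,000
```

Shift the assumed per-image error rate by two orders of magnitude and the account-level odds move by roughly twenty. Apple’s trillion-to-one claim is only as good as error rates nobody outside Apple can check.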

We’d estimate the probability of wrongful tagging is a lot higher than one in a trillion. Machine scanning for sexual content is almost laughably inaccurate. When Facebook tried to stop its members from posting nude images, pictures of famous statues like Neptune and the Little Mermaid were removed. Blogging platform Tumblr managed to remove “offensive” images of puppies and fully clothed people when it tried to restrict sexual content. Perhaps Apple has built a better mousetrap, but we suspect plenty of false positives will be reported to the NCMEC.

What else could go wrong? Well, first, these tools are themselves a back door. Your photos will be scanned on your device before they’re ever encrypted. Apple can keep calling its services “end-to-end encrypted,” but once your device scans content before encrypting it, that label no longer means what it says.
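In rough pseudocode, the objection looks like this: the scan runs on the plaintext before any encryption happens, so whoever controls the scanner and its match list sits inside the supposedly end-to-end boundary. All the names below are invented stand-ins, not Apple’s actual code.

```python
# Every function here is an invented stand-in; this shows the shape of
# the pipeline, not Apple's implementation.
def matches_known_image(photo: bytes) -> bool:
    return False  # stand-in for the on-device fingerprint match

def report_for_human_review(photo: bytes) -> None:
    print("plaintext photo leaves the device for review")

def encrypt(photo: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(photo, key * len(photo)))  # toy XOR, not real crypto

def upload(blob: bytes) -> None:
    print(f"uploading {len(blob)} encrypted bytes")

def send_photo(photo: bytes, recipient_key: bytes) -> None:
    # The scan runs on the plaintext, before any encryption happens.
    # Whoever controls the match list sits inside the "secure" boundary.
    if matches_known_image(photo):
        report_for_human_review(photo)
    upload(encrypt(photo, recipient_key))

send_photo(b"family beach day", b"not-a-real-key")
```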

Oh, and get this … to make it possible for your phone to scan images for child porn at all, a version of the NCMEC Child Sexual Abuse Material (CSAM) database will be installed on your iPhone or iPad. To be clear, what’s installed is a database of digital fingerprints (“hashes”) of known abuse images, not the images themselves. You won’t be able to see it, but courtesy of Apple, your device will henceforth carry the fingerprints of millions of child pornography images.

To be sure, you can exempt yourself from these scans: simply turn off iCloud Photos in the Settings menu. But what happens a few years down the road, when a politician points out that, shockingly, child pornographers are turning that switch off? Congress could then pass a law forcing Apple to disable the opt-out and analyze photos that are never backed up to iCloud at all.

Then there’s the inconvenient fact that once Apple has the technology to review content for child porn, the screening can be widened to other types of content. Indeed, other companies already use similar tools to keep “terrorist” content off social networking platforms. Any photo or message that matches pre-set criteria could be targeted next. As NSA whistleblower Edward Snowden warned on his blog, “this is not a slippery slope. It’s a cliff.”

The offline equivalent of this type of search would be for the police to conduct constant warrantless searches of your home in the expectation they might find something illegal. And to keep copies of whatever they find—forever.

Unfortunately, we can’t think of a positive twist to this announcement. If you don’t want Apple to screen your photos, you can of course disable photo uploads to iCloud, but that very act could mark you as a suspect in a child porn investigation. And the government could one day force Apple to push an update that silently overrides the switch.

In response to vociferous criticism from civil liberties activists, Apple announced last Friday that it would delay the rollout of these tools. The company promised to “make improvements” in its scanning system “before releasing these critically important child safety features.”

But no matter what improvements Apple makes, once its scanning system is deployed, it will be an open invitation for authoritarian governments to build mass surveillance systems to spy on their citizens.

Our suggestions are simple. Ditch your iPhone and iPad. While you’re at it, unsubscribe from all social networks, or at least the ones you signed up for with your real name. Don’t upload photos of children (including your own) to the internet unless they’re fully encrypted. Finally, use strong encryption tools to protect the rest of your data.

A good time to begin securing your electronic life would be today. Neither Apple nor the US government is going to do it for you.

