Making Sense Of Apple Protections For Children

Here is something I sent out to my mailing list, and I wanted to share my thoughts here as well :slight_smile:

Apple announced that it will start scanning photos uploaded to iCloud in an effort to identify child pornography. This has set off a firestorm of objections and no small amount of misinformation. The situation is still unfolding, but here is my take based on a close examination of all the publicly available information:

** First, here is Apple’s announcement:

Personally, I applaud the fact that Apple has taken a strong and sincere stand on the problem of child pornography. But the devil of course lies in the details, and like any complex issue there are pros and cons to Apple’s proposal, which I detail below.

** How the Apple system will work:

Apple is adding CSAM (Child Sexual Abuse Material) detection into the next version of iOS, the iPhone’s operating system, in order to fight the distribution of child pornography. The CSAM detection software will only examine photos stored on your phone before they are uploaded to iCloud.

** A user can effectively opt out of the hash scanning by turning off use of iCloud.

The software will not actually “look” at the images themselves. Instead, it will compute a mathematical code, a “hash”, from the content of each photo and compare it with the hashes stored in a database of known CSAM images. Getting a bit more technical, both the pictures in the database and the pictures headed for iCloud are reduced to a “hash”: a long string of code that identifies an image without revealing its content. Apple’s hash is designed to still match an image after minor changes such as resizing or re-compression.

Apple’s CSAM software will compare the hashes of pictures on your iPhone with the hashes of known CSAM images. More specifically, when you take or import new images, the software will create hashes and compare them to the database of known CSAM images. If there is no match between the hash of a new image and the CSAM database, nothing happens. If a match is detected, an alert will be sent and the relevant image will be manually reviewed by someone at Apple. The account will likely be disabled and reported to the NCMEC (National Center for Missing and Exploited Children), which in turn may report it to law enforcement.
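To make the mechanism concrete, here is a minimal sketch of hash-based matching in Python. Note the simplifications: the database entry is a made-up placeholder, and I use an ordinary cryptographic hash (SHA-256) instead of Apple’s perceptual hash, which is designed to tolerate resizing and re-encoding.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known images (placeholder value only).
# Apple's real system compares perceptual hashes, not SHA-256 digests.
known_hashes = {
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",
}

def hash_image(path: Path) -> str:
    """Return a hex digest of the file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known(path: Path) -> bool:
    """True only if the image's hash appears in the database of known hashes."""
    return hash_image(path) in known_hashes
```

The key point the sketch illustrates: the matching code never needs the original pictures in the database, only their hashes, and a photo that is not an exact (or, in Apple’s case, near-exact) copy of a database entry produces no match at all.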

** Unfortunately it will have almost no impact on the creation and spread of these despicable images.

The problem is that as long as an image’s hash isn’t in the database, it will not be recognized. The system only applies to known existing content that a user stores in the Photos app with iCloud enabled. Newly created images will not be detected!

Frankly, I think this entire system exists to protect Apple from being liable for distributing CSAM images via its servers.

Apple is not alone when it comes to scanning for hashes of offending images; Google, Facebook, Microsoft and others are doing the same, but only after the images have been uploaded to their servers. Note that Apple’s CSAM software scans the hashes of your pictures locally on your iPhone before the images go into the cloud. More on this rather confusing distinction further below.

** What are the risks of Apple’s proposal:

Risk of false positives:

Hashes are not absolute, and false hits (called “collisions”) can happen, but instances of collisions are extremely rare. In fact, the only actual collisions found to date have been in lab proof-of-concept experiments set up specifically to test the potential for collisions.

In the event of a spurious collision, it (hopefully) will be caught and corrected when the image is referred for human inspection at Apple. The risk of a collision is thus vanishingly small, and the risk of a collision slipping past a human inspector undetected is even smaller. The possibility of an error getting past both Apple’s software and its human inspectors is so small that it is hard to come up with an adequate metaphor, but my non-technical take is that it would be comparable to being attacked by a great white shark while flying in a commercial airliner at 35,000 ft at the very moment the plane is also struck by an incoming meteor.
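For a rough sense of scale, the standard “birthday bound” estimates the odds of any accidental collision among n random hashes. The sketch below assumes an idealized 256-bit cryptographic hash, which is an assumption on my part; Apple’s perceptual hash is shorter and behaves differently, so treat this as intuition about random collisions, not a claim about Apple’s actual false-positive rate.

```python
# Birthday-bound approximation: the probability of ANY collision among n
# uniformly random b-bit hashes is roughly n^2 / 2^(b+1).
def collision_probability(n: int, bits: int = 256) -> float:
    return n * n / 2 ** (bits + 1)

# Even a trillion photos yield an astronomically small chance of a random collision.
print(collision_probability(10**12))
```

The approximation also shows where the cliff is: collisions only become likely once n approaches 2^(b/2), which for 256 bits is far beyond any conceivable photo library.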

** The risk of surveillance creep:

A bigger concern is surveillance creep. The news is full of abuses of software (like Pegasus) intended for limited law-enforcement use. It is thus worrisome that Apple’s CSAM software could be abused by governments seeking to hash-match images unrelated to child pornography in order to identify and silence dissidents, journalists, etc.

It is clear that something needs to be done to cut off the flow of child pornography. I thus agree with the general principle of scanning hashes to look for offending images. What makes me uneasy is Apple’s specific approach of scanning files on our phones, whether or not the images are uploaded to iCloud. Apple states that they will not scan images of users who opt out of using iCloud, but the technical ability exists to do so, and thus there is real potential for abuse by Apple, law enforcement authorities, foreign agents – or criminals.

** And then there is the slippery slope:

What happens with pictures today could happen tomorrow with conversations, or with any of the other sensor inputs the iPhone maintains. Mobile-phone location data is already being used in law-enforcement investigations to track down and question people whose phones were in the vicinity of a crime they might not even be aware of.

The success of existing detection schemes will create a strong temptation for law enforcement to expand what it wants to inspect. Will voice mails be scanned? Will caller and address book metadata be analyzed?

Will smart phone mics always be listening for indications of domestic violence and call police? Or will smartphones be unwillingly turned into a “shot-spotter” network, with microphones listening for gunshots and cameras watching for muzzle flashes?

This all borders on paranoid improbability, but recent history is replete with examples of theoretical risks becoming actual problems. This is why the metaphor of the slippery slope is so compelling. As Cicero observed centuries ago, events once underway “… glide readily down the path of ruin once they have taken a start.” Or more simply: once one steps onto the slippery slope, one inevitably ends up at the bottom.

** Two more initiatives that are under-reported!

The big advantage is that the same on-device scanning technology is also being used in a way that will have a much bigger impact on the protection of children.


Apple is adding image monitoring to iMessage. The idea is that when a child is about to send or receive sexually explicit images, iMessage will first flash up a warning message on their device, and if “Family Sharing” is switched on, the alert will also be sent to the child’s parents. Also, offending images will be blurred, and the software will display an additional warning if the child attempts to view the un-blurred image. If the child opens the image, the parent(s) will also receive an additional notification.

** This technology will be much more effective at protecting children!

This feature will be implemented as an opt-in-only feature performed locally on the affected iPhone. There will be no uploads to the cloud, and only family members specified in the Family Plan settings will receive alerts. Law enforcement will not be involved in any way, and parents will have full control. I believe that this feature will be of far greater benefit to watchful parents than the CSAM image monitoring, especially as both settings and notifications are left in the full control of parents.

** Siri & Search:

Siri and Search are also being updated to intervene when users search for queries related to child pornography. Apple hasn’t provided the details of how this will work, but whatever the final configuration, there remains the possibility (and, for law enforcement, the temptation) of feature creep. And once the feature exists, there is the risk that certain state actors will demand that Apple extend it to detect what others consider protected political speech. For example, it is easy to imagine a foreign government demanding a feature extension as a condition of selling Apple products in its country.

** Looking ahead:

It is still early in this process and much remains to be learned about Apple’s plans. Whatever one thinks of the details, Apple is clearly motivated by a sincere desire to counter what it sees as critical risks in the rapidly evolving digital world. But even good intentions and a sincere desire to help are not without risk.

** Is this a self-inflicted Catch-22 for Apple?

Failure to stem the spread of child pornography on its platforms goes against Apple’s long-standing commitment to providing a safe, family-friendly walled garden for its customers. However, adding protections like those contemplated in its recent announcement opens Apple to accusations that it is watering down its promise of strong privacy protections.

** This is a slippery slope!

Apple is the role model for protecting the privacy of its users. Lowering the bar, even for this noble cause, will lower the industry standard – one that already leaves much to be desired, looking at Google, Microsoft and Facebook for starters…

** As users, we are inevitably caught in the middle

As Apple customers, we have little say in what Apple does. Our only choice is whether or not to use iCloud. And also to watch out that we don’t somehow innocently touch a topic that might later be added to the list of things to scan for and report back to Apple and the authorities.

** My bottom line…

Don’t get me wrong – I love Apple’s iCloud service. It’s the glue that ties our Macs, iPhones and iPads together and synchronizes essential data. On balance, I much prefer living in Apple’s walled garden over the wild west of Android and Windows. And I believe the same applies to my clients: more than once I have saved a client’s digital life simply by enabling iCloud services and restoring lost or corrupted data after a Mac was stolen or stopped working.

However, I have adopted a posture of constant vigilance regarding the future. Given the ever more capable artificial-intelligence functions intruding into our phones and computers, prudence dictates that we be prepared to take more dramatic steps should current trends mature into a serious invasion of our privacy.

I am thus constantly evaluating alternatives. So far, the available alternatives come at considerable cost in terms of reduced functionality and increased technical hassle compared to Apple’s offerings. But in a worst-case scenario, trading convenience for privacy might be a price worth paying.

Finally, as this post suggests, as deeply as I have explored the current state of this issue, I am only beginning what I am certain will be a long and complex journey of discovery.

If you are interested in learning more as I deepen my understanding, let me know by signing up for my newsletter dedicated to this topic.


Yeah I’m out as well.

I’m eventually going to cancel Apple One because iCloud was a key component of my reasoning for subscribing. I’ll probably ditch Apple Music. I’m a Tidal subscriber as well, and frankly I get Tidal support on many more devices. I’ll watch the current Apple TV shows and then I’ll ditch that.

I’m not going to give Apple 240 dollars a year to push bs down my throat. You’re not doing anything for kids …you’re a lifestyle computer company. Stop with the delusions of grandeur.

In the next 18 months I’m going to fortify my network. When I believe in Apple’s directions I support it through Apple hardware and services …when I don’t believe in their direction I pull my support …at this point it’s going to be from a services level.

1 Like

I “hear” you, but what are you going to use from here on forward?

Among the people I know, Apple is completely losing its reputation as the “privacy company” right now because of the on-device scans. You just have to think about what this could look like in a few years, when such routines are looking for other violations as well. I don’t want to live in a world where my own devices spend all day collecting evidence against me. But that’s exactly what Apple is laying the groundwork for right now.


Indeed, they’ve pretty much crumpled up their “privacy” advantage and thrown it out of the window. My mother is an attorney, so I’m biased in favor of effecting change through legal frameworks.

Scanning photos does “nothing” to remedy the situation. I’m still fuming over how quickly the news cycle moved on from digging into the “Lolita Express”, or border break-ups of families, or the water quality in Flint, MI, which affects children.

I’m not an overly political guy but I hate when people use “but the children” shields for pushing encroachment on privacy rights when they don’t have a track record of “boots on the ground” activism on the topic.

I’m not too hot-headed on this one. I’ll see how things play out, but Apple’s questionable ethics (Siri monitoring without approval, battery issues, quasi-privacy) are beginning to become harder to defend.

rosemary definitely does not want you talking about this. she will magically appear and close this thread like the others in 3 - 2 - 1…


I was wondering why these threads were closed as well. I know this is a private forum but it would still be good to have some transparency. Are there any rules banning controversial or political discussions?

Those other discussions got completely out of hand and were getting personal. I think my OP is a more balanced explanation, with the pros and cons.

Remember that our opinions are influenced by the people we interact with.

What stuns me is how little people care about online privacy and security. And I am not talking about tinfoil-hat, go-dark privacy practices, but just the basics. Most users don’t care at all.

For example, this post was a copy from my newsletter, which goes to a mailing list of over 500 people, mostly clients of mine. So far this email has one of the lowest open and click-through rates in 10 years of Mac tips-and-tricks newsletters.

Just a few reactions from those who were already aware of this. But that’s it.

As I mentioned in my OP, I still like Apple and iCloud services, BUT with an increasing amount of caution.

I strive to own all my data in a way that lets me access it locally, and I practice using alternatives so I am ready to take it all offline and move to a system I have more control over. This decision will come at a price of convenience.

For example, I just reorganized all my 1Password vaults for work and private use, and made a shared vault with my wife. This is working great and I definitely benefit from this system. However, their recent changes, as discussed elsewhere in the forum, make me regret going all in. Still, the convenience of the shared vaults with my wife and clients (shared vaults regarding IT stuff in their businesses) outweighs my concerns for now.

That being said I am prepared to switch to a more private system at the cost of convenience once the balance tips over.


I love the idea of mixing and matching services but honestly Apple’s iCloud makes things so easy.

That being said, this comfort is why companies and governments believe they can implement what I consider more control than I’m comfortable with.

Today we’re at the opposite end of the Paradox of Choice. We don’t have enough options. If I decide I want to leave iCloud, it means I’m also cancelling a lot of subscriptions where iCloud is the only option.

I agree with @MacExpert: if you ask someone directly, they’ll claim they care about privacy, but the reality is they care far more about comfort and security.

I’m OK if a vendor like Apple says, “I don’t wish my services to be used for illegal activity, so steps have been taken to ensure proper compliance.” I should have the option not to use their services (this is a bit of a sticky issue currently).

I think we’re going to see a bit of a migration off the cloud. Local storage is often more reliable, and you control your pricing. Synology is getting a bit of attention with DSM 7 and the Photos app. Even before this CSAM stuff I’ve been wanting to start moving my photos from mobile devices and cloud infrastructure to local storage with offsite backup. I’m actively looking for a good photo-management and editing solution that works well with a library on a NAS.

Quite honestly, I don’t think Apple’s software is all that great anymore. My Final Cut Pro will be joined by DaVinci Resolve soon. Logic Pro may be joined by Luna or PreSonus Studio One.

I love macOS… it’s a nice framework for developers to do good things with, and that’s where my attention may go in the near future.

1 Like

Yep, I am already utilizing my Synology. It offers iCloud-like capabilities.
I am not even taking it all the way to the internet, only accessing it locally or via my own VPN connection.
This eliminates the need for opening various ports, managing certificates, etc.
Just a WireGuard VPN via my Untangle router that is keeping a close eye on my network.
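For anyone curious what this looks like in practice, here is a minimal sketch of a WireGuard client profile for reaching a home NAS over a VPN. Every key, address and endpoint below is a hypothetical placeholder, not taken from my actual setup; the router generates the real values, and the QR code it shows is simply this file encoded for the WireGuard mobile app.

```
[Interface]
# Hypothetical client key and tunnel address (replace with your generated values).
PrivateKey = <client-private-key>
Address = 10.6.0.2/32
DNS = 192.168.1.1

[Peer]
# The router's WireGuard server. AllowedIPs restricts the tunnel to the home LAN,
# so only NAS and camera traffic goes through the VPN, not all internet traffic.
PublicKey = <router-public-key>
Endpoint = home.example.com:51820
AllowedIPs = 192.168.1.0/24
PersistentKeepalive = 25
```

Limiting AllowedIPs to the LAN subnet is the design choice that makes this feel “just as fast as local”: normal browsing bypasses the tunnel entirely.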

This. This is why people don’t care. Where are they going to go that’s as simple as using iCloud? At least on Apple devices, there’s nowhere to go. :confused: I’ve personally migrated everything off of iCloud onto my Synology and have switched to using Mylio with encrypted backups on OneDrive. But it was a pain to get set up. Your average person just won’t bother with it. Which is a shame, because it really is a nice service.


Speaking of Untangle… I’ve been vacillating between pfSense on a Netgate appliance or Untangle for my first firewall. I’m leaning Untangle because I want to be up and running more easily with the UI.

I, like most people on this forum, was concerned by this announcement from Apple. The underlying reason, children’s safety, obviously resonated with me considering I recently became a parent.

However, after viewing Craig’s interview, I feel better. Obviously, your mileage may vary.


Hey have you posted your thoughts about Mylio anywhere? I’d love to read your experiences if you have or eventually get around to it.

The setup on my router was very easy. Next I generated the QR code to set up the VPN profile using the WireGuard app. The connection is very stable and fast.
For example, when I take the dog for a walk I can instantly see the security cameras that are running on my Synology, just as fast as on my LAN, despite the relatively poor T-Mobile “5G” connection (that’s just a marketing badge shown on the phone despite it being 4G LTE, but that’s a different topic…).

1 Like

Yep, it’s a very personal choice where you strike the balance between convenience and privacy.
As for now I still use and recommend iCloud, but with my eyes wide open and prepared to leave at a moment’s notice.

Personally, I’m not happy with Craig’s “non-answers” and “non-apologies”. He’s giving contradictory answers about whether Apple is scanning photos on your iPhone.

The core problem is that most people are happy with the current policy but not the technology, because once the technology is in place, policies can change rapidly under pressure. Look at how Apple changed its policy from using only NCMEC’s database to using the intersection of databases from at least two independent organisations, within a week, under widespread criticism.
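That intersection change is easy to picture in code. A minimal sketch (the hash values are made-up placeholders, not real database entries): only entries vouched for by both organisations end up in the deployed database, so no single organisation, or a government leaning on it, can unilaterally inject a hash.

```python
# Hypothetical hash lists from two independent child-safety organisations.
ncmec_hashes = {"hash_a", "hash_b", "hash_c"}
second_org_hashes = {"hash_b", "hash_c", "hash_d"}

# Only hashes present in BOTH lists are deployed to devices.
deployed_hashes = ncmec_hashes & second_org_hashes

print(sorted(deployed_hashes))  # → ['hash_b', 'hash_c']
```

Of course, this only mitigates the risk if the organisations really are independent, which is exactly the policy question the technology cannot answer.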

1 Like

That remark is totally uncalled for, and not the tiniest bit on-topic.


Agreed. In a contentious topic within a community, a statement made about a community member is likely counterproductive, whereas a statement made about the topic itself, if it’s not qualified with language that denigrates proponents of an opposing view, is contributing to meaningful dialog.

I think that we need voices that don’t agree; it’s the only way to truly understand the depth of a given issue, but it’s incredibly important to keep things from becoming personal.

1 Like

your remark is totally uncalled for. it was a joke. I post there.