NeuralHash - Apple Scanning Your Photos

Apple will begin scanning photos uploaded to iCloud against a database of known images, using a perceptual hash of the image's content rather than a simple file hash, in an effort to help prevent child abuse. They say the inspection is done on device, but once an image is flagged it can trigger a "manual review" process. This is a really bad idea with serious potential for misuse. They're going to start by scanning unencrypted iCloud photos because it "doesn't violate anyone's privacy", then move on to inspecting "encrypted" photos, rolling this out in iOS 15. This is a slippery slope…
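For context on how hash-based matching of this kind typically works: a perceptual hash fingerprints an image's *content*, so a re-encoded or lightly edited copy still matches. Below is a toy sketch, not Apple's actual NeuralHash (which derives its descriptor with a neural network); the tiny 3×3 "images", the blocklist, and the distance threshold are all made up for illustration:

```python
# Toy illustration of perceptual-hash matching. NOT Apple's NeuralHash;
# just a simple "average hash" to show the general idea.

def average_hash(pixels):
    """Hash a grayscale image (2D list of 0-255 ints): each bit records
    whether a pixel is brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A known image and a slightly re-compressed copy (two pixels nudged).
original = [[10, 200, 30], [220, 15, 240], [25, 210, 20]]
recompressed = [[12, 198, 30], [220, 15, 240], [25, 210, 20]]

blocklist = {average_hash(original)}  # hypothetical known-bad hashes

h = average_hash(recompressed)
# An exact or near match (small Hamming distance) triggers a flag,
# even though the uploaded file's bytes differ from the original.
flagged = any(hamming(h, known) <= 2 for known in blocklist)
print(flagged)  # True
```

The point of the fuzziness is robustness to re-encoding, but the same fuzziness is what makes the false-positive and collision concerns below more than hypothetical.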

Apple confirms it will begin scanning iCloud Photos for child abuse images


The EFF’s take on this:

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change. Take the example of India, where recently passed rules include dangerous requirements for platforms to identify the origins of messages and pre-screen content. New laws in Ethiopia requiring content takedowns of “misinformation” in 24 hours may apply to messaging services. And many other countries—often those with authoritarian governments—have passed similar laws. Apple’s changes would enable such screening, takedown, and reporting in its end-to-end messaging. The abuse cases are easy to imagine: governments that outlaw homosexuality might require the classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand the classifier be able to spot popular satirical images or protest flyers.

Apple’s Plan to “Think Different” About Encryption Opens a Backdoor to Your Private Life | Electronic Frontier Foundation

You cannot cure one moral outrage by perpetrating another.


I agree; that news shocked me a bit yesterday evening. Of course, it's only for the children, so who could speak up against this? That's an even stronger "argument" than fighting terrorism.

There is so much wrong in what seems like such a noble cause. This is basically the equivalent of saying "You have to allow the police to search your house, your computer's hard drive, your diaries, your everything (which is what a smartphone has become to many of us) daily", and you're not even allowed to supervise the cop.

  1. A closed database (obviously, nobody gets to see its contents, and sharing them is forbidden) which might contain innocent false positives (weren't there reports of false-positive flags from other services such as OneDrive?).

  2. A closed "AI" algorithm to find close matches, from the company that brought us Siri. I'm sure that's not going to increase the rate of people falsely accused of one of the very worst crimes.

  3. Once this infrastructure is set up (to scan first just your photos, then everything; and then to upload matches to a "safe destination" for further inspection), even if Apple has only noble motives, it will be hard or impossible for Apple to refuse requests to use that infrastructure to fight other really, really bad crimes. It will start with suspected terrorism, but then there will be scope creep.

It's not like there were some interesting leaks by that Snowden guy. Nah, it's probably going to be okay.

And in a few years' time, you will be jailed for privately sharing funny Trump memes, because the re-elected president decides this is illegal propaganda.

Not to mention what kind of information certain other countries would like to have checked, because, you know, you still want to sell your stuff in our country, right? And we really feel you shouldn’t be gay around here, or shouldn’t talk badly about our government.

  4. There will be security holes, data leaks, and new attack vectors, like sending innocent-looking pictures crafted to fool the AI into thinking they're a match.
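On the attack-vector point: because perceptual matching is deliberately fuzzy, an image only has to land on (or near) a blocklisted hash to be flagged, regardless of what it actually depicts. Here is a toy sketch of the failure mode, using a simple average hash as a stand-in for the real, closed algorithm (all pixel values are made up); against a neural hash, the analogous attack would in principle perturb an image gradient-by-gradient toward a target hash:

```python
# Toy demo: two visually unrelated images can share a perceptual hash,
# because the hash only keeps a coarse brighter/darker-than-mean pattern.
# (Simple average hash as a stand-in for the real, closed algorithm.)

def average_hash(pixels):
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Hypothetical blocklisted image.
target = [[10, 200, 30], [220, 15, 240], [25, 210, 20]]

# A crafted "innocent" image with totally different pixel values but the
# same brighter/darker-than-mean layout: identical hash, false match.
crafted = [[90, 130, 95], [140, 80, 150], [85, 135, 70]]

print(average_hash(target) == average_hash(crafted))  # True
```

A crafted collision like this, mailed to a victim who auto-saves images, is exactly the kind of "send in the feds on an innocent person" scenario the list above worries about.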

And let's not forget that at the same time (really, just to make sure your kids are safe!) Apple is implementing filters to detect porn and sexting in iMessage. That's probably a whole different story, and who could argue against saving your children from making a terrible mistake, but then again, read points 1-4.

And those are just my random thoughts on this. I'd better not read that EFF article, because I'm already shocked enough by Apple's new definition of privacy and user rights.

If this rolls out (and if it doesn't… how would I know? Until now, at least there were good reasons to think Apple wouldn't implement secret backdoors on our phones), I will probably stop buying Apple products. Not because I would have anything to worry about even in escalation stage 4 or 5 or 6 (how very comforting it can be to know you're not part of a repressed minority…), but because this is basically the beginning of the end of privacy and civil rights even in countries such as the US or Germany.

People living in less democratic countries certainly won’t have much of a chance to fight for these rights and stand up against those plans, so who, if not we, could and should voice our opinions?

And as a side note, even in the US and in many EU countries, who knows what type of government we'll have in 5 or 10 years' time.


Apple’s public explanation of the change.

Child Safety - Apple


I see no reason to stop buying Apple products because of this. You can be certain that Google, Facebook, etc will be quick to follow, if they aren’t already doing something similar. OTOH, the illusion that Apple believes in user privacy no longer exists.


Facebook already does this, according to today's episode of Dithering.


I’ve never used FB, but that’s not surprising.

Well, this is definitely one step up from the routine scanning of unencrypted files at rest on your Google Drive, or even the automated analysis of unencrypted messages sent via a certain messaging service. (Or did FB already switch to "end-to-end encryption" and implement a sort of upload filter, as the EU wants to have it? I'm not up to date on this.)

While those mechanisms are, in my personal opinion, also tools of mass surveillance that could easily be used for evil (by governments forcing Google, FB, etc. to utilize them according to their standards, by hackers turning the functionality against the company's will, or just by accident, because the algorithm decides it's time to send in the feds and leaves it up to you to explain that to your neighbours), Apple explicitly creates the infrastructure required to scan your local files, which are locally encrypted.

It's bad enough that you need to take all of this into account (or, simply put, cannot take privacy for granted at all) when choosing your online sync services, but knowing that basically everything you put on your everything-device (even if it's just photos to be uploaded to iCloud right now) is not going to be private sucks.

I don't know yet if an Android phone with a custom ROM is a proper answer to this problem (considering overall platform security), but once this has started, we can be sure that versions of macOS will also receive the "you've got nothing to hide, so why do you complain" treatment. Hopefully, some macOS app designers will be appalled enough by this to bring some proper user interfaces to Linux applications. :wink:


The case of Google and Facebook looks totally different to me. They scan what you store on their servers. They are not installing spyware on your devices that's capable of scanning your local files.


There’s been a lot of misinformation lately.

If you want the facts, take a look at this news article from MacRumors and maybe read into the comments.

That’s only half the story. :slight_smile:


This is not about being misinformed.
As kennonb mentioned, there are two new features (not just one):

  1. iMessage scanning for unwanted image material, as a parental control.
  2. Automated local scanning of your photos (before they're uploaded to iCloud) by a perceptual hashing system called "NeuralHash", looking for matches against known illegal content.

Very nice and agreeable ideas in principle, right? But the fact of the matter is, this is a very slippery slope.

Once you have set up iMessage image filtering, including automated iCloud- (or whatever-) based alerts to parents, it's more or less only a matter of configuration to add more depth to those filters in order to scan your messages for illegal content and then alert the police instead of your parents.

And once you have set up automated scanning of all photos, it's easy to roll this out a) to all users, including those who do not wish to upload their photos to iCloud, and b) to additional local resources, e.g. all your files in the Files app.
Plus, it's easy to add new categories of "unwanted" or "illegal" material.

Checking all your files and messages, and checking for more illegal content than initially announced, is the first stage.

The second stage will be to scan for stuff which we would probably no longer agree should be deemed illegal. That's where less democratic regimes come in, forcing Apple to use this nice and democratically sound toolset (it's only for the good of the children, and, while we're at it, against terrorism) against what they consider "terror" (i.e. anyone voicing an opinion the president does not like) or "illegal" (being a member of a certain religious group, being homosexual, …).

Once this infrastructure exists, it's not as easy as it is today to simply deny such requests.

Plus there are many more reasons to be worried about this development, from undermining general security to being open to potential misuse through third parties, to name just some examples.

There’s probably a lot more problems ahead which we currently can’t even imagine.


I already disagreed with the Apple bubble's canon, which repeated the marketing line that Apple is a truly privacy-concerned company, once we learned that they had dropped backup encryption.

Furthermore, a lot of the data stored in their cloud is not properly end-to-end encrypted at all. Encryption in transit and again at rest, with keys managed by the system, is half-baked at best.
Without an option for the user to deliberately pick the keys, which never get transmitted anywhere, and without external audits in place to verify that the encryption is actually happening as proclaimed, I would always call for caution.
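To make the distinction concrete: with user-held keys, the encryption key is derived on the device from a secret only the user knows and is never transmitted, so the provider cannot decrypt even under pressure. Here is a minimal sketch using Python's standard library; the passphrase, salt, and iteration count are illustrative, not a recommendation:

```python
import hashlib

# Sketch of user-held key derivation: the key exists only on the device.
# Passphrase, salt, and iteration count here are purely illustrative.
passphrase = b"correct horse battery staple"   # never leaves the device
salt = b"per-user-random-salt"                 # stored with the user, not secret
key = hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000)

# The server only ever sees ciphertext produced with this key. Without
# the passphrase it cannot reconstruct `key`, which is why "encryption
# at rest with provider-managed keys" and true end-to-end encryption
# are very different guarantees.
print(len(key))  # 32, i.e. a 256-bit key
```

When the provider manages the keys instead, it can hand over or scan plaintext at will, which is exactly the half-baked middle ground criticized above.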

The fact that Apple is still sticking to a USB 2.0 wired connection on phones that often hold 128 or 256 GB seems like another deliberately placed inconvenience to get reluctant users to switch to the faster and more comfortable iCloud backup. All just to pave the way to finally removing any physical connection whatsoever, and with it, most likely, the local backup option in general.

Putting all users under general suspicion and scanning all photos is the most invasive way to circumvent end-to-end encryption, ignore user opt-out and the user's privacy in general, and it does irreparable damage to the trust one might have had in Apple. This will be a future negative textbook example in marketing classes of how you can lose a "privacy concerned" facade, carefully crafted over years if not decades, with just one dire move.

Even if Apple backpedals and provides an "opt-out switch" for on-device scanning, the trust is already lost, and I am not even sure whether it's possible to ever repair it.
Not to mention that this opt-out, if it ever exists, will most likely come at a high inconvenience cost: the switch will probably deactivate any on-device (Siri) smarts (Spotlight indexing, face recognition, text recognition), creating a strong pull to keep it activated.

And no, this is nothing against potentially good intentions to solve a despicable problem.
Everybody with a glimmer of technical knowledge should be able to put the puzzle pieces together and figure out all the different angles for abuse of these systems. We openly criticize China for its "social credit" system, yet we are putting all the building blocks in place ourselves.

I can recommend a book on that topic:


I don’t disagree with your assessment. I’ve been expecting Apple to begin eroding our privacy ever since Australia passed a law in 2018 “that will allow the country’s intelligence and law enforcement agencies to demand access to end-to-end encrypted digital communications. This means that Australian authorities will be able to compel tech companies like Facebook and Apple to make backdoors in their secure messaging platforms, including WhatsApp and iMessage.” - 12.07.2018

Once Australia actually uses this law I have no doubt Apple will comply as they have in China, and then with every other country that demands the same access to our data.

It isn't. At best, you can choose not to put anything online that you consider private, and not keep any sensitive data on your Mac that isn't individually encrypted, because you can no longer rely on FileVault if you chose to "use your iCloud account to unlock your disk and reset your password".

On my last trip to rural Mexico I took several photographs as I and my companions, and several local friends shared a meal outside at a private home. Later when reviewing the photos I noticed a couple of young mothers in the group nursing their infants, as is their custom. Should I now be concerned about a false positive getting me in trouble? Probably not, but why risk it? I’m done with Apple photos.

A slippery slope argument is a fallacy.

This is a great idea by Apple. Your phone can already find pictures of snow, dogs, trees, beaches, etc. using on-device processing, so it isn't a big stretch to flag images for child porn. If an image is flagged, having it reviewed by a person would catch false positives (and I feel sorry for the people at Apple having to do this work).

This is really worrying. I agree with most of the things already said.

I’d like to add that fighting crime is not the role of a tech company. In any way at all.


We're already slipping down that slope, and gaining a lot of speed…


Linked to this discussion and potentially of interest:

A fallacy doesn’t gain validity

I wasn’t a big fan of iCloud before this new nanny policy was announced so it only took a few minutes to remove all my files and photos from Apple servers.

Everything was already duplicated on both my Google Workspace account and my Microsoft 365 OneDrive. AFAIK, neither Google nor Microsoft currently scans the photos, files, or email of their commercial customers.