Parents who use Google Photos, be extremely careful what you upload

NY Times article about a dad who sent photos of his child’s genitalia to his pediatrician for medical advice.

As a criminal defense attorney who is a father of two small children, I am very thankful to not have had many cases involving the recorded sexual assault of children. These cases are brutal and the problem is extremely serious. But this case could have gone horribly wrong for the parent and child involved. The law enforcement investigator concluded that the father DID NOT COMMIT A CRIME. Nevertheless Google maintained its position and is deleting the father’s account. Had the government entity involved taken Google’s position, the child in this case would have likely been removed from his home for however long it took the government bureaucracy to process the case (as I tell my clients, the wheels of justice turn slowly but grind exceedingly fine). In such a case, the child could be exposed to irreparable harm.

This is scary. My wife is a pediatrician and I immediately sent this to her. I know her official position when these requests are made to her is that such communication does not comply with HIPAA, but when friends and family call at 2 am worried about their children, it’s hard not to lend a helping hand. Hopefully this article will give her ammunition to say no kindly.

Parents beware. Those pictures of your baby’s first bath that upload to Google automatically, because you set it up that way once, could harm your family. Exercise extreme caution.

9 Likes

Woah, so they essentially mistook it for underage pornography? I’m definitely not storing any sensitive images in Google Photos from now on…

1 Like

It’s all a matter of perspective. As the article notes, absent context that’s exactly what it is. And of course, even if you didn’t get flagged, if the account got hacked and the photo spread around the Internet, you’d have “known CSAM” on your device, which would be a whole different can of worms.

I wonder if this is how Apple’s proposed CSAM detection would work too, since that one utilizes on-device scanning rather than cloud scanning (Google’s approach).
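
As I understand it, Apple’s proposal was a different mechanism: compute a perceptual hash of each photo on the device and compare it against a database of hashes of already-known material, rather than running a classifier over brand-new photos in the cloud. Here’s a rough sketch of the hash-matching idea, purely for illustration; this is not Apple’s actual algorithm or API, and the imagehash library and the known_hashes.txt file are stand-ins I made up:

```python
# Illustrative sketch of perceptual-hash matching against a set of known hashes.
# NOT Apple's NeuralHash or Google's pipeline; just the general idea.
from PIL import Image
import imagehash

# Hypothetical file: one hex-encoded perceptual hash per line for known images.
with open("known_hashes.txt") as f:
    known_hashes = {imagehash.hex_to_hash(line.strip()) for line in f if line.strip()}

def matches_known(photo_path: str, max_distance: int = 4) -> bool:
    """True if the photo's perceptual hash is within max_distance of a known hash."""
    candidate = imagehash.phash(Image.open(photo_path))
    # Perceptual hashes survive resizing/re-encoding; similarity is measured as
    # the Hamming distance between the two hashes, not an exact byte match.
    return any(candidate - known <= max_distance for known in known_hashes)
```

The difference matters for this story: a brand-new photo of your own kid wouldn’t be in any known-hash database, so matching alone wouldn’t flag it; catching it requires a classifier that tries to recognize new material, which is apparently what tripped things up here.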

Agree with you here, anything that is stored in the cloud or online can be hacked and stolen. Best to always exercise caution. This is my thought as well after watching “The Most Hated Man on the Internet”.

1 Like

I think the way Google (or any other company in the same situation) is handling those events is absolutely correct.
The first guy in the article even wrote parts of the code that finds those pictures, and that didn’t stop him from taking a picture of his son’s genitals and uploading it?!
Someone who acts in a way like this will probably do it again, and the company has no obligation to spend its resources on people like that!

It is pretty simple: don’t take pictures of naked children, and you won’t get into trouble for doing so!!

Are you serious? Did you read the background and see the context?

Next you’ll be trotting out the old “think of the children” and “if you’ve nothing to hide, you’ve nothing to worry about” rubbish.

6 Likes

As the parent of a child with a gastrointestinal problem, I can say photos can be extremely helpful for the doctors, and parents, in understanding whether a problem is getting better or worse. It’s much better than memory, or even a notebook, because it shows what was actually happening.

We haven’t had to use photos much recently, but I still have an album of poop/blood/mucus mixes in various forms in case we need it in the future. Hopefully we never need to take photos of the exit point, but if we do, I guess I’m using my fancy camera? (Which won’t seem suspicious at all…)

1 Like

Agree with @aardy. The issue doesn’t sound like intentional uploading, but rather like a “hey Google, back up all my photos to the cloud” setting. Then when he took photos as requested by his doctor, Google “helpfully” uploaded them to his cloud account, flagged them, and nuked his account.

I think this illustrates one of the dangers of some of the “automatic” decisions technology makes for us. If you have your device set up to back up your photos to the cloud, that’s usually a very good decision. But if you’re stressed out because your kid is in serious pain and you need a doctor’s opinion, you’re not going to be in the place to think “gee, wait a minute…how will taking a photo for the doctor possibly be misinterpreted by my cloud provider in light of my backup / cloud replication strategy?”

To use a potentially-more-common (although hopefully not) example…

Say you have an early teen that’s just getting to dating age. By my understanding, it’s not all that uncommon for teenagers’ peers / potential dating partners to ask them for inappropriate pictures. If the kid takes such photos, and they’re under whatever age is required for consent, that’s child pornography by definition. Imagine Google flagging that and nuking a family account due to a bad decision on the part of one kid.

And before we say it would never happen, take a look at all the various times Google has flagged and removed accounts without an appeal process. In the case of OP’s article you not only have a parent asserting their innocence but a police department and a doctor on the parents’ side as well. “The automated system triggered us, we made a context-free decision, and that decision is final.” And it’s not just a “what do you expect from a free service” question, as one of the people in the article was a paying user.

Given how much of our lives is tied to these cloud provider accounts, that’s pretty scary.

5 Likes

As I wrote, it is pretty simple: just don’t take pictures like that in general, and then you don’t have to think about the consequences.
And if you want privacy, don’t use Google; it is as simple as that, too!

This guy knew about the searches; he participated in the programming! So he should have known better, yet he did not observe those basic rules even though he knew them!

Also very simple: talk with your kids about that problem!
Tell them the consequences of taking pictures like that, not only the criminal ones, but also the social ones, if pictures like that get out into the wild at school or anywhere else in the community!

The provided pictures and video were serious enough to get a judge to issue a search warrant! That says everything about what was likely visible in those materials! It was surely not as “innocent” as presented in the press, which obviously wanted to make a point with that article.

Using digital utilities does not relieve the user of the need for common sense.

If you use Google, or something similar, for storing your photos, you should think twice about that…

See, to me this is the core of the issue. “In general” neither of these people took “pictures like that”. And “common sense” tells me that “taking a picture for my kid’s doctor” is nowhere near the same as “child pornography”. :slight_smile:

Google possibly made the correct initial decision. But the fact that there’s no mechanism to get things restored after everybody in the appropriate position to determine whether a crime had taken place determined that, in fact, nothing bad had happened? That’s a problem.

“In general”, detectives don’t drop clear-cut cases of child pornography.

7 Likes

I would use the “big” camera to avoid auto-uploads and give me complete control over where the files go, so I can make sure they stay as far as possible from cloud-based workflows. Perhaps going so far as to create a backblaze-exclude folder and exclude it from my Backblaze backup.

In the past I’ve shown photos to doctors in-person on my phone. Going forward I guess I’ll need to figure out a way to exclude some part of my phone from iCloud backups.

1 Like

Respectfully, I disagree with you. I am not sure whether you read the article closely or not. A parent was distraught over their child’s medical issue; it was 2021 per the article, the height of COVID. As a parent, I fully understand what the father was thinking and why he acted the way he did. We had a similar incident with our own child a few years ago: we spent nights in the hospital, and upon release the doctors wanted daily photos to track the recovery until his next appointment (we lived a few hours away).

Does Google have a good system in place? It sounds like it, yes, it does the job. Does Google have a system to say ‘oops, we messed up’? Clearly not. Does Google have a better way to explain anything? Not at all, based on the response in the article when it said, ‘we noticed the photo didn’t have the redness you were claiming’ - so suddenly Google also has a medical degree, and that is an issue. This is also a big concern with Big Tech: there is no balance from the consumer’s perspective. The police cleared it, but Google stuck with its own decision; that’s another issue. So even if you are innocent and the police say you are innocent, someone else clearly has more authority, digitally speaking.

8 Likes

One of the problems in that area is that even pictures taken for some other (“valid”) reason have a good chance of getting into the wild and ending up in places where they don’t belong!

A lot of the pictures “used” by pedophiles could be explained like that, so where do you make the cut?
And how much effort should a company have to make to get somebody who clearly violated the company’s regulations “back into the system”?
And why should the company have to make that effort? There is no right to a contract with a company like Google, especially after you have violated their rules and regulations in such a serious manner!?

Oh, and by the way, for communication with a doctor there are special channels (video calls, dedicated apps) for doing so without getting into trouble like this.

1 Like

But see, that’s the whole issue here - you’re assuming the conclusion in your argument.

In a statement, Google said, “Child sexual abuse material is abhorrent and we’re committed to preventing the spread of it on our platforms.”

Google has banned CSAM. And if this guy had uploaded CSAM, I would be siding with them. The issue is that Google decided something was CSAM that - according to every relevant and competent jurisdictional authority - was not.

So Google has banned this guy for violating their policies by uploading CSAM, which he didn’t do. To the extent he has a contract with Google based on a ToS (and at least one of the people in the article was a paying user, so he certainly did), he’s not in violation of the CSAM provision.

I realize this is an issue, but as for where I’d make the cut - when a competent legal authority with jurisdiction determines there’s no problem, the ISP should concede that there isn’t a problem. :slight_smile:

5 Likes

It is fully OK that you disagree with me about that, and yes, I read the complete article, but from my point of view that is not an excuse for taking pictures like that and uploading them into a system like Google.
And even if a parent is scared by something like that, it was not a life-threatening situation, and as a parent you should rather stay calm in a situation like that, because your child needs you with a cool head!

Google is a privately owned company!
They are simply not obligated to contract with everybody, and if you violate their rules, you are out.
That has nothing to do with “innocence” or “authority”; it simply reflects a valid company decision.
You also won’t be served if you walk into a diner without shoes and a shirt; it is as simple as that.

Make the cut by simply not taking pictures like that in general!
Why should this be different from other areas where you have to follow standard rules to keep your environment safe?

Maybe you should just be banned from this forum for being completely unreasonable, even though you just have a different opinion and are completely innocent? This is a private forum after all.

4 Likes

Frustrating situation. Our doctor-patient communication app lets us take photos from within the app and attach them immediately, so they don’t get saved to iCloud Photos. However, there is nothing indicating that this bypasses iCloud, so most users wouldn’t be aware. The staff at most medical practices would be just as unaware. Since it’s an awkward workflow, I can also see how parents would end up taking multiple photos with the camera app and mass-attaching them later, causing trouble.

1 Like

This reminds me of when my 3 year old, while watching a kids’ show on his Kindle Fire, decided to take naked selfies in the bathroom mirror! We found them a year later, in my Amazon cloud storage, when I was setting up an Amazon FireStick and choosing a profile photo.

My heart sank when I saw them as they’d been there a year. I’d had no idea that for a year I’d had naked selfies of my son on my Amazon account. I obviously deleted them immediately but I sought advice from the Parenting in a Tech World group on Facebook who told me all the various places these photos might have been backed up. I felt sick.

I work in tech as a digital manager, I am a nerd, I have friends who investigate child pornography for the police, but even I was caught out. So I bet this happens LOADS.

The issue here isn’t “just don’t take photos like that and you’ll be fine” - this happens for lots of nuanced reasons.

It isn’t “Google shouldn’t have flagged it” - they should.

It’s the behaviour of Google afterwards that concerns me. The fact that the account is now permanently deleted. These men are still suffering the consequences when they should’ve been helped to restore things to how they were before, minus the “offending” images. I also don’t like how Google seems to have taken on the commonly-used phrase “Dr Google” as an actual professional identity.

I’ve been considering moving away from Google Photos myself, and I think I probably will when Apple Photos allows family sharing. I’m also a Google Drive, Gmail and Google Calendar user, and it feels a little like a “too many eggs in the same basket” sitch for me…

5 Likes