iOS 13 Photos - machine learning and People album

If you’ve upgraded to iOS 13.x, you might have noticed two places in the Photos app where it tells you to plug in your iOS device and lock the screen so it can update your Photos library. So far, I’ve found no reviews that cover what I’m about to describe.

The first place where this (machine learning?) happens is in the Photos tab, where it is creating galleries and layouts. This takes a while depending on how many photos you have. I have about 100,000 photos (80K already processed and 20K to go), and it took several nights of plugging in my iPhone and iPad, opening the Photos app, and locking the screen. (I don’t know if I had to open the Photos app, but I skipped that step one night and the count of photos remaining to process didn’t change. YMMV.)

The second place where this (machine learning?) happens in Photos is more problematic: the People folder in the Albums tab. This seems to be a separate process, and it scans dramatically slower. (This makes sense - the work to recognize people/faces is harder than simply organizing photos by existing metadata.) I also think it only starts when I plug in my iPhone or iPad, open Photos and navigate to this album, and then lock the screen. It does maybe 1,000 photos per night, and I have about 71,000 left to go on the iPhone and 82,000 remaining on the iPad. This could be a looooong process, and so far in my experimentation, it only happens if I “initiate” it. (I’m going to test again tonight by skipping those steps and seeing if the count changes.)

Does anyone know if this can be sped up or optimized? I doubt it, and it seems device-specific since Apple is likely doing this on each device for privacy/security reasons. Also, I’m curious to see if the process will happen automatically, because I don’t expect most users to ever discover these steps, if they are required. I’ll know more tomorrow.

However, I have to say this is one place where I wish Apple would do the processing in the cloud (like Google) and just send the metadata down to my iPhone and iPad. Maybe for security, they could do it with some user-created key or a version of my iCloud password, but I’d rather have this info centrally stored and updated in near real-time on my devices instead of waiting for people to be recognized in photos nightly, or whenever my phone happens to be plugged in and locked.

Hoping someone with more iOS 13 experience can shed some light on this. Thanks.


I switched to iCloud Photos earlier this month, and I think it got through faces on 7,000 photos in about 3 days on an iPhone 7 and a Mac running High Sierra. If I’m not mistaken, it does now transmit face info from one device to another via iCloud even though processing happens on the devices only.

It happened without me taking any action.

Starting in High Sierra, “If you use iCloud Photo Library, the people identified in your photos are synced across your devices.”

You just put into words what has been bothering me for months - I was just too lazy to type it up and find a resolution.

Yes, very slow. In fact, one thing I did to speed things up was to download the original, high-res copies of my iCloud photos to the Mac (by unchecking the space-saver thingy). I read somewhere that when you have all these high-res photos locally, the machine learning is faster because it can act on the local copies. To be honest, I am not sure if it was faster or not, but yeah, it is STILL taking its time.

This was faster for me on the iPhone after upgrading to iOS 13… maybe just a day or two (30K photos).

65,000+ photo library here. The problem I have with the People album in iOS 13 is that I had a perfectly good People album (or Faces - I’ve forgotten when that switch was made), and not all of my favorite people carried over. For example, my parents are missing from the album entirely, and when I tried identifying my father’s face in a newer picture, it added him to my daughter’s album instead. I hope it shakes out soon.

It is going faster, about 10,000 pictures each night, and seems automatic. I do wonder how I will tell it to merge people - sometimes it has both “John” and “John Smith”.

In the “People” album drag the tile for “John” onto the tile for “John Smith” and it will ask “Is this the same person?”.


Also, oddly, you use the share sheet to indicate if a photo is not matched to the correct person.


Apple does some odd things with the share sheet, like letting you like/dislike News articles from within it.

My Mac finally scanned 29K of my iCloud Photo Library’s people/faces. Just 1K to go. However, when I look at the People folder on my iPhone, there are still 13K to be scanned. I’d think this info is synced across devices, isn’t it?

I think the scanning happens locally but the results are shared.

It does not look like the results were shared, though. My phone still shows 13K to be scanned.

I just updated my other MacBook to Catalina and this is what I get in the Photos app. Wow, it would be dumb to scan these all over again! It took my other Mac (on Mojave) around 20+ days to scan 30K photos. My iPhone still has 13K to be scanned, and this other Catalina MacBook looks like it’s starting all over!

This is not what I am expecting…


That’s … suboptimal. I bought a new MacBook Air two months ago, the scanning process started, I dumped coffee on it three weeks later, got it back from AppleCare 8-9 days after that, restored from a Time Machine backup, and it has finally started scanning. (65,000+ photos). By the time I’m ready to upgrade to Catalina, it will be finished just in time to start over.


I do think this is an area where I wish Apple would let me choose to sync this data. My iPad Pro processed my pictures in a few days, my iPhone 11 Pro processed them in about two weeks, and Photos on my Mac is yet a third silo. This is a lot of management. If I start editing people, am I going to have to make those edits on each device? I think each device ends up with somewhat different metadata.

Still 50K+ photos to go…


Not in my experience.