Why does Live Text only work in the Photos app?

The Live Text feature in iOS 15 is very useful and can OCR sheets of written text. As far as I can make out, this only works when you take a photo via the Photos app.

Why then isn’t this feature included when you scan a document, either in the Notes app or directly into the Files app?

It’s not the taking of the picture via the Camera app; it’s the processing of the image within the OS.

For example, you can view images that are part of a website and copy text from those image files.

Is it possible to process scanned documents? After all, they are images, although they may be saved as PDFs. I can’t see any way to do this.

Live Text is system-wide on my devices, which is to say it seems to work in almost every Apple app, from Notes to Files to Mail, for images I’ve saved off the internet or been sent via Messages, and for photos or scans that I have taken.

It’s definitely working outside of the Photos app.


I see what you mean. I can scan a document and highlight the text in the Notes app, so it clearly is doing OCR on it.

However, when I open that document in Preview, or export it to DEVONthink, it no longer seems to be a searchable PDF and you can’t highlight the text.

I am not sure how to resolve this.
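
One way to confirm what Preview or DEVONthink is seeing is to check whether the exported PDF actually contains an embedded text layer. Here is a minimal Swift sketch using PDFKit, just as an illustration; the file path is a placeholder:

```swift
import PDFKit

// Quick check: does the exported PDF contain an embedded text layer?
// The file path below is a placeholder.
let pdfURL = URL(fileURLWithPath: "/path/to/exported-scan.pdf")

guard let document = PDFDocument(url: pdfURL) else {
    fatalError("Could not open PDF at \(pdfURL.path)")
}

for index in 0..<document.pageCount {
    guard let page = document.page(at: index) else { continue }
    // page.string is nil or empty when the page is image-only (no text layer).
    let text = page.string ?? ""
    print("Page \(index + 1): \(text.isEmpty ? "no text layer" : "\(text.count) characters of text")")
}
```

If every page reports no text layer, the scan was exported as a plain image-only PDF, which would explain why you can’t highlight or search it.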

The system identifying text in a document is not OCR in the traditional sense of the term, where a one-off analysis is completed at the point of scanning and the results are saved as part of the document.

When the system finds text in a photo, that text isn’t saved as part of the photo. I’d guess the recognition is done at the point the photo is opened, and that’s why that information isn’t searchable either.

If the identified text is not saved within the document, it can’t be searchable.
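
If you want to run that one-off analysis yourself and keep the results, Apple’s Vision framework exposes on-device text recognition through VNRecognizeTextRequest. Here is a minimal Swift sketch, only as an illustration: the file path is a placeholder, and it just prints the recognized text rather than writing it back into a PDF.

```swift
import Vision
import CoreImage

// Minimal sketch: run on-device text recognition (Vision framework) on an
// image file and print the recognized strings, which you could then store
// alongside the document. The file path below is a placeholder.
let imageURL = URL(fileURLWithPath: "/path/to/scanned-page.png")

guard let ciImage = CIImage(contentsOf: imageURL) else {
    fatalError("Could not load image at \(imageURL.path)")
}

// Configure the text-recognition request; .accurate favours quality over speed.
let request = VNRecognizeTextRequest { request, error in
    guard error == nil,
          let observations = request.results as? [VNRecognizedTextObservation] else {
        print("Recognition failed: \(String(describing: error))")
        return
    }
    // Take the top candidate string from each detected text region.
    let lines = observations.compactMap { $0.topCandidates(1).first?.string }
    print(lines.joined(separator: "\n"))
}
request.recognitionLevel = .accurate
request.usesLanguageCorrection = true

// Run the request on the image.
do {
    let handler = VNImageRequestHandler(ciImage: ciImage, options: [:])
    try handler.perform([request])
} catch {
    print("Could not run text recognition: \(error)")
}
```

Saving that recognized text back into the PDF (so it becomes searchable in other apps) would still need a separate step, which is essentially what traditional OCR tools do for you.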