Unanticipated use for Apple Intelligence

I came up with a good use for Apple Intelligence (coming to macOS Sequoia, probably next week). We got a medical testing report (don’t worry – it was OK) full of medical jargon. I got the idea to select the text and use Apple Intelligence to rewrite it in the Friendly style. It cleared up all the jargon and made the report readable. Highly recommended!

11 Likes

Until my family doctor retired last year, he would give me a printout of any tests, then explain each item in layman’s terms and make notes on the page about items that might need attention. Now I get a text telling me when I can download my test results. I never considered using AI to fill in the blanks.

Great idea, thanks.

1 Like

I had a similar experience when my family doctor forwarded on a dictated letter from a surgeon.

I would normally ask my wife (who is a doctor) to translate it for me, but I took a photo and uploaded it to ChatGPT, and it explained it in easy language.

It also told me, in an unrelated episode, that the weird flashing symbol on my car dashboard meant that one of the tyres needed a little more air in it.

1 Like

ChatGPT APIs make for pretty good OCR for test results, too. When Apple Intelligence launches deeper integration with app intents and (hopefully) some third party integration, I’m hoping that sending data to Apple Health, analyzing/collating with AI, and sending to another app will be viable.
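For anyone curious what that looks like in practice, here is a minimal sketch of sending a photo of a test result to the Chat Completions API as a base64 data URL. The model name, prompt wording, and file name are my assumptions; the request-building part runs offline, and the actual API call (which needs an API key and the `openai` package) is shown commented out.

```python
import base64

def build_ocr_request(image_path, model="gpt-4o"):
    """Build a Chat Completions request asking the model to transcribe
    text from a lab-report image. Illustrative sketch only: the prompt
    and model name are assumptions, not a recommendation."""
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("ascii")
    return {
        "model": model,
        "messages": [
            {
                "role": "user",
                "content": [
                    # Plain-text instruction for the model
                    {"type": "text",
                     "text": "Transcribe all text and values from this test result."},
                    # Image attached inline as a base64 data URL
                    {"type": "image_url",
                     "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
                ],
            }
        ],
    }

# To actually send it (requires OPENAI_API_KEY and the openai package):
#   from openai import OpenAI
#   client = OpenAI()
#   resp = client.chat.completions.create(**build_ocr_request("report.png"))
#   print(resp.choices[0].message.content)
```

Note this goes through OpenAI's servers like any other ChatGPT use, so the redaction point raised below still applies.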

Did you redact it before uploading it?

3 Likes

I wonder if it can turn my doctor’s typed notes into English.

To be clear, they are already English but I guess she is very short on time, given the number of abbreviations and typos.

1 Like

Yes, using ChatGPT is risky for personal data (like your medical records!). One great thing about Apple Intelligence is that the information is kept secure. In fact, at the moment it never even leaves your device, as Apple’s secure AI servers aren’t up yet.

1 Like

Given the amount of data that is constantly being leaked, I doubt I need to upload anything. ChatGPT, etc., probably already have all my info.

August 2024 Healthcare Data Breach Report

1 Like

If you use ChatGPT Plus there is an option to turn off data sharing.

If you access ChatGPT via API then the data is not used for training.

You do not need to write your own code to use the API; there are many front-end UIs for that, of which I recommend TypingMind.

1 Like

That would be interesting. I could discuss a technical issue for a client and then translate it.

Here in the UK we have an NHS app that, amongst other things, allows patients to read doctors’ notes and test results. There are abbreviations aplenty in those documents, so the NHS provides a glossary with expansions and explanations of the terms. Of course that list doesn’t help with typos, but with a little bit of imagination it becomes clear what the doc meant.

1 Like

I don’t feel bothered sharing stuff like that with ChatGPT.

If they can use the fact I have a blocked eustachian tube, good on them!

Do you think I should be?

Yes, you absolutely should be worried about who gets your health data. It is your private information, and you don’t want companies, or others, using it against you. In your scenario, imagine if a hearing aid company had access to your blocked Eustachian tube diagnosis. You could be spammed from here to breakfast time with emails for hearing aids (useless in this case), or your insurance company may exclude hearing aids from your policy (this might be a problem when you are older).
What if the medical record you decided to present to the internet has more sensitive or stigma-laden diagnoses? HIV, syphilis, herpes, schizophrenia, terminations of pregnancy, etc.?

1 Like

I’d be very careful here. AI could produce nuance to the data that could be misleading or downright wrong (the devil, as always, is in the detail).
AI is being used to “read” X-rays and ECGs (or EKGs for those folks in the US). It mostly does a good job, except when it doesn’t. AI misses subtle or rare things. As an example, AI is used to read chest X-rays, but it can miss incidental findings outside the lungs (e.g. an osteoid osteoma in the proximal humerus).

Your family doctor might be using AI medical transcription software for the consultation; if so, they should obtain your consent first. The software will typically generate a patient information summary of the consultation (written at a reading age of about 9-11). These are mostly good, clear summaries, and some patients find them helpful.