Ah… I totally misunderstood the message then…
And that's incredibly insensitive and stupid of me.
One possible solution, which I've been working on with my mother, who has also lost her sight, is to look at Microsoft's "Seeing AI" app and have it describe the picture I mentioned:
There are two people behind microphones in the picture, having fun. One is about 50 with brown hair and glasses. The other is about 33 (my comment: that's fairly specific; @ismh, is it in the right area?) with glasses, brown hair and a beard.
I've found Apple's implementation of the accessibility features robust when you have sight or are fairly computer savvy. Being neither puts my mother at a distinct disadvantage.
I can usually work with her to get what she needs out of her iPhone and iPad, but each OS update sends her into a panic. What if this…? What if that…?
Example from the iOS 14 update:
She used to be able to answer a phone call on speaker by pressing the top of the bright blob on the screen. Now she must aim for the dead centre of that amorphous blob she can't quite make out, or she's answering the call on the iPad in the next room, since Handoff now shows up there.
Any change in interface, options, image colour or sound makes the world a terrifying place for her. I understand us worrying about automation options, scripting and the like. But sometimes we forget that the environment we like is the only one a lot of people have to interact with their world through.
Can she also double-tap to answer the call in speaker mode? Is that the problem here?
She can still see the swipe area a little, so she swipes to answer, but she would like to switch the sound to speaker at that point. And that is what has changed.
Might it help to turn on speaker mode automatically? In VoiceOver settings, under Audio: 'Automatically switch to the speakers during audio calls when you are not holding the iPhone to your ear'.
It works both ways: if you're holding the phone to your ear, it switches back to the default mode.