[Trainer-Talk] iPhone with no home button
Brian Vogel
britechguy at gmail.com
Tue Nov 19 18:37:21 UTC 2024
Raul Gallegos wrote, in part: "If it takes an entire lesson to get somebody
to get comfortable with unlocking the phone or activating the home gesture
to get out of an app, then this is what we will do. The phone is utterly
useless if somebody cannot perform the home gesture. Don't get me wrong, I
understand using a modified VoiceOver gesture to do something similar, but
I still believe that this is an essential skill to master."
I respectfully have to disagree here, but only on the "how long we try"
part. I have had certain students who seem to be able to instantly "get"
virtually any gesture while, for others, it's a real struggle. I will go
through the "real struggle" period for longer than a single session so that
I can attempt to assess what it is they're not getting, and why.
Sometimes it becomes obvious what's "off" and other times, not.
I also can't count how many students of touch devices I've had who are just
insanely frustrated at times, often for weeks, when first working with this
type of UI, and all the more so if they were formerly sighted users. It's
often more difficult to unlearn what you know to be accurate when a screen
reader is not active than it is to learn what is accurate when one is
working with a screen reader for access.
Trying to get certain students, particularly formerly sighted ones, to
understand "explore by touch" can be very challenging, along with the idea
that, generally speaking, "everything you used to do with a single tap is
now done with a double tap." I also find it interesting how many students
(both formerly sighted and those who've never had vision) persist in trying
to target buttons with perfect accuracy, often screwing things up badly in
the process: the slowness this induces causes what should be a double tap
to register as, effectively, two single taps, which triggers accidental
exploration by touch and shifts the focus. I often have to "beat it out of
them" [metaphorically speaking] and get them to understand that one of the
main purposes of a screen reader in a touch UI is to separate gaining focus
on the thing you want from activating that same thing. Many carry over the
concept of "point and click," which requires that whatever is being clicked
or tapped be directly under the finger. That's not true when using TalkBack
or VoiceOver. Your first step is finding the target you want. The next is
activating it, completely separate from any concern about where on screen
it happened to be when you found it.
All of the above being said, there have been instances where it becomes
apparent that the touch UI and a given individual will never work well
together. But I never come to that conclusion until multiple sessions have
been conducted.
I also find it fascinating, and have still never figured out why this
happens (nor had any student who could explain it), how after weeks of
frustration and struggle there will be one thing that finally works, and as
of that moment all of the various puzzle pieces instantly fall into place.
It's a technological Road to Damascus moment. It's very often accompanied
by some variant on the observation, "I have no idea why I didn't get this
long before now."
Brian