[Trainer-Talk] What’s New in iOS 10 Accessibility for Blind, Low Vision and Deaf-blind Users

Scott Davert scott.davert at hknc.org
Tue Sep 13 17:14:32 UTC 2016


http://applevis.com/blog/apple-braille-ios-news/what-s-new-ios-10-accessibility-blind-low-vision-and-deaf-blind-users

Another fall is upon us, which means football season has started, temperatures are starting to drop, and a new version of iOS is coming out. Another seemingly established tradition is that an article discussing the new accessibility features is in order. Just like in the past several years, there are many mainstream changes to iOS that will be welcome. Some of the new features include a revamped Music app, a new Home app, new 3D Touch functionality in Apple apps, and much more. Many articles will cover these changes, but the aim of this piece is to cover accessibility changes that are specific to blind, low vision, and deaf-blind users.

Mainstream Stuff Impacting Accessibility
Wow, Siri Can Do That?

One of the major enhancements in iOS 10 is the development of
SiriKit.
Specifics aren’t really known about this exact functionality just yet, as SiriKit-enabled apps are only being released alongside the public version of
iOS 10. However, what we do know is that there are many potential benefits to SiriKit. For example, if Uber were to implement SiriKit, one could simply
tell Siri, “book an Uber for me to go to Central Park”. In theory, at least, Siri should now be able to carry out that exact action. According to
this article,
apps that will utilize SiriKit the day iOS 10 is released are WhatsApp, LinkedIn, WeChat, Pinterest, Vogue Runway, Pikazo, Square Cash, Monzo, Slack,
Looklive, Lyft, Fandango, and The Roll. While it's not yet possible to know exactly how each app will utilize SiriKit, I'm sure someone will cover it in great detail by the time you are reading this article.
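
For the developers reading this, here is a rough, hypothetical sketch of what adopting SiriKit looks like for a messaging app. The class and identifier names are made up, the handler is simplified, and the exact protocol method names vary by intent domain and SDK version:

import Intents

// A hypothetical SiriKit Intents extension for a messaging app. The principal
// class hands Siri an object that can handle the user's spoken request.
class IntentHandler: INExtension, INSendMessageIntentHandling {

    override func handler(for intent: INIntent) -> Any {
        // Route every supported intent to this object; larger apps often
        // return a dedicated handler per intent type.
        return self
    }

    // Called when the user says something like
    // "Send a message to Alex with ExampleChat saying I'm running late."
    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // A real app would pass intent.recipients and intent.content to its
        // own messaging back end here.
        let activity = NSUserActivity(activityType: NSStringFromClass(INSendMessageIntent.self))
        completion(INSendMessageIntentResponse(code: .success, userActivity: activity))
    }
}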

Describe it for Me, Please?

Another major change is that iOS will now attempt to add automatic image descriptions to your photos. According to a presentation as part of the
Worldwide Developers Conference,
iOS performs eleven billion calculations per picture to determine what objects are in your photos. It can not only detect objects but also perform facial recognition, which can compare pictures against your list of contacts to tag people automatically. Indeed, this information is now included as part of the details on each photo in my library. While Apple indicated that your photos are not sent to a server for recognition, meaning that it's all done on your device, it's not clear to me how long it takes for these descriptions to show up.
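
For the technically curious, here is a tiny illustration of the kind of on-device analysis involved. It uses the long-standing CIDetector face detector rather than Apple's private Photos pipeline, and the function name is made up for this example:

import CoreImage
import UIKit

// Counts faces in an image entirely on the device. This is roughly the kind of
// local analysis the Photos feature performs, not Apple's actual pipeline.
func countFaces(in image: UIImage) -> Int {
    guard let ciImage = CIImage(image: image) else { return 0 }
    let detector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
    return detector?.features(in: ciImage).count ?? 0
}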

Press Home to Unlock

In iOS 10, you can now unlock your iDevice simply by pressing the Home button. While this was always possible on Touch ID-equipped devices, it is now possible
on non-Touch ID devices as well. If you do not have a passcode set up, pressing the Home button, waiting for a second, and then pressing it again will
now land you on the Home Screen of your device. If you find that you prefer the older way of unlocking your device, you can still do this. Head over to
Settings>General>Accessibility>Home Button, and turn "Rest Finger to Open" on.

I Read You Loud and Mostly Clear

Another new feature in iOS 10 is the introduction of voicemail transcription. For those who can't hear their voice messages, or who just don't want to listen to them, listening is now mostly optional. It is, however, still necessary to play the message for the transcription to show up. You will find the transcribed text next to the "More Info" button. The transcription is fully accessible with VoiceOver and braille, but comes with the same caveats as any automatic transcription: it's not 100% accurate, and any kind of noise or accent will greatly decrease accuracy.
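
There is no public API for voicemail transcription itself, but iOS 10 does introduce the Speech framework, which performs the same kind of automatic transcription and gives a sense of how this works. A minimal sketch, with the file URL and locale as placeholders:

import Speech

// Transcribes a recorded audio file using the Speech framework introduced in
// iOS 10. Voicemail transcription is a separate system feature; this only
// illustrates the same general capability.
func transcribe(fileAt url: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")) else {
            return
        }
        let request = SFSpeechURLRecognitionRequest(url: url)
        _ = recognizer.recognitionTask(with: request) { result, _ in
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            }
        }
    }
}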

Raise to Shut up?

There is a new feature in iOS 10 called "Raise to Wake" which wakes up your phone each time you pick it up; it will also sometimes wake up when you don't want it to. While this may be a welcome feature for many, it may be annoying, especially for VoiceOver users utilizing speech. You can turn "Raise to Wake" off if you would like; you will find the toggle under Settings>Display & Brightness.

VoiceOver
Moving Apps is No Longer a Drag!

When you get to your Home screen with iOS 10 and VoiceOver is enabled, you will now hear that there are Rotor actions available; these options relate to
moving apps. Rotor over to the "Actions" item, then flick up or down to the "Arrange app" action and double tap. This puts you in “Screen Edit” mode, just
as if you had used the old "double tap and hold" method to begin editing apps. Next, find an app you wish to move, then flick up or down once to specify
your desire to move this app. (You can also perform a two-finger double tap on an app to mark it as being ready to be moved.) Now, you can freely move
around your home screen, even changing pages if you prefer, until you find exactly where you would like to move the app you chose. Once you locate the
place where you would like to put the app, flick up or down to see different options: you can move it to the right of the app VoiceOver currently has focus
on; move it to the left of VoiceOver's current position; create a folder with those 2 apps; or cancel the move entirely. Canceling will place you on the
page you are currently editing, and the app you chose to cancel the move for returns to its original place. (Pressing the Home button to exit Screen Edit
mode will also cancel any app move you had in progress.) Even better still is the fact that this system works great with both Bluetooth keyboards and braille
displays!
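
The Home screen's arrangement actions are a system feature, but the same "Actions" rotor mechanism is available to third-party apps through UIAccessibilityCustomAction. A small, hypothetical example of how an app could expose its own action there:

import UIKit

// A hypothetical collection view cell that exposes a custom action in the
// VoiceOver Actions rotor, similar in spirit to the Home screen's app
// arrangement action.
class AppIconCell: UICollectionViewCell {

    func configureAccessibility() {
        isAccessibilityElement = true
        accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Arrange",
                                        target: self,
                                        selector: #selector(beginArranging))
        ]
    }

    @objc private func beginArranging() -> Bool {
        // Enter this app's own reorder/edit mode here.
        return true
    }
}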

You Say 'Tomotto', I Say 'Tomato'

However you'd like VoiceOver to pronounce words, you can now customize this with the new Pronunciation Dictionary. Find it under Settings> General> Accessibility>
VoiceOver> Speech> Pronunciation. If you have not created any pronunciation entries yet, you will only find an "Add" button. Double tap this to begin creating
your first entry. You will then land on the "Phrase" text field. This is where you type the actual word that VoiceOver doesn't pronounce the way you would like. For example, the Alex voice pronounces my last name without the "a" sound, so I would type "Davert" in this field. Next to the "Phrase"
field you will find the "Substitution" text field. This is where you can try to make VoiceOver say the word in the way in which you wish, so in my case
I would type "Davvirt". Flicking one more item to the right, you will find an option to dictate how you would like the phrase to be pronounced by saying
or spelling it out. To the right of this option, you will see a "Languages" button; double tapping this will allow you to choose which languages the pronunciation
applies to. Going back to the previous screen and continuing to the right of the "Languages" button, you will find a "Voices" Button; double tapping this
button will present you with all of the available voices for that language that you can apply the pronunciation to, or you can choose all. In the above
example, Alex is the only voice which mispronounces my name, so I can double tap on that voice, and it will only change this pronunciation for Alex. It's
also possible to adjust whether the substituted phrase applies to upper- or lowercase letters, and in which apps the pronunciation is used. This could come in handy in certain circumstances, as you can adjust what VoiceOver says based on context. If you turn "Apply to All Apps" off, a list of the apps
installed on your device will appear below this under the "assign to" heading. Note that once you add something to your Pronunciation Dictionary, it will
then be available on all devices connected to your iCloud account once everything has been synced. When you are done creating an entry, activate the “Back”
button. Back in the main Pronunciations screen, after the “Add” button, you will now see all of your entries listed in alphabetical order. You can then
double tap on any entry to edit it, or flick down and double tap to delete it.
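
The user-level Pronunciation Dictionary has no public API, but iOS 10 gives developers a similar capability: an AVSpeechUtterance can be built from an attributed string whose ranges carry an IPA pronunciation. A rough sketch, with the IPA value being my own approximation and purely illustrative:

import AVFoundation

// Keep the synthesizer alive beyond the function call so speech isn't cut off.
let synthesizer = AVSpeechSynthesizer()

// Speaks "Scott Davert" with a custom pronunciation attached to "Davert",
// using the IPA notation attribute added in iOS 10.
func speakWithCustomPronunciation() {
    let text = "Scott Davert" as NSString
    let attributed = NSMutableAttributedString(string: text as String)
    attributed.addAttribute(
        NSAttributedString.Key(rawValue: AVSpeechSynthesisIPANotationAttribute),
        value: "ˈdæ.vɚt",
        range: text.range(of: "Davert"))
    let utterance = AVSpeechUtterance(attributedString: attributed)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    synthesizer.speak(utterance)
}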

Tom? present! Fred? accounted for! Allison? ready for action!

iOS 10 brings a host of new voices to the speech-based accessibility features such as Speak Screen and VoiceOver, as well as to third-party apps, like Voice Dream Reader, which can use the built-in iOS voices. These voices will be familiar to Mac users, who have had them available for a few years. The new U.S. English voices on iOS include Fred, Tom, Allison, Ava, Victoria, and Susan. To check out these new voices, go to Settings> General> Accessibility> VoiceOver> Speech. Once you find the "Speech" button, activate it and you will see a button labeled "Voice," followed by the name of the voice you are currently using. Flick to the right, and you will find the "rotor languages" heading, below which are listed any secondary languages you have configured. If you only wish to have one voice onboard and quickly available, you only need to set up the default option. If you find a file size listed along with a voice name, this means that voice is not currently downloaded to your device. Double tap the download button to the right of the voice name to begin the download. As has been the case in the past, you will need to be connected to Wi-Fi to download any voices not currently on your device, with the exception of Fred. Once the download has completed, double tapping the name of the voice will present you with a few options. Next to the heading for the voice you will find an "Edit" button, which allows you to delete the voice you have put on your system. You can also achieve this by flicking up wherever VoiceOver indicates that there are actions available. Flicking down one more time will allow you to speak a sample. Sadly, this option is only present when you have already downloaded that particular voice; it would be nice if there were a way to hear a sample prior to downloading.

Continuing to flick right, you will also find the rotor actions described above presented as options on the screen itself. Choosing the name of the voice just to the right of the "Edit" button selects the default version of the voice. After the "Speak Sample" button for the default voice, you will find the "Enhanced" flavor of the voice. At the time of this writing, it's not possible to use the default voice after deleting the enhanced version.
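
Third-party apps reach these same built-in voices through the AVSpeechSynthesisVoice API. Here is a quick sketch that lists the installed U.S. English voices and speaks a short sample with the first one found; nothing in it is specific to any one app:

import AVFoundation

// Keep a reference so the sample isn't cut off when the function returns.
let sampleSynthesizer = AVSpeechSynthesizer()

// Lists the U.S. English voices currently installed on the device and speaks
// a short sample with the first one found.
func speakSampleWithInstalledVoice() {
    let usVoices = AVSpeechSynthesisVoice.speechVoices().filter { $0.language == "en-US" }
    for voice in usVoices {
        print(voice.name, voice.identifier)
    }
    guard let voice = usVoices.first else { return }
    let utterance = AVSpeechUtterance(string: "This is a sample of \(voice.name).")
    utterance.voice = voice
    sampleSynthesizer.speak(utterance)
}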

VoiceOver Settings, Expanded and Reorganized

Many of the options in the Settings app have been moved around, and this is also true of the VoiceOver settings. One new submenu is called “Verbosity”; this series of options now contains the familiar “Speak Hints” option. It also has a new feature called “Emoji Suffix,” which speaks the
word “Emoji” after telling the user what the Emoji is. Another new VoiceOver settings submenu in iOS 10 is called “Audio”. In the "Audio" menu, you will
find some familiar options, as “Use Sound Effects” and “Audio Ducking” have both been moved here. A new feature, called “Auto-Select Speaker in Call,”
will allow you to control whether call audio automatically gets routed to your speaker when you move your phone away from your ear. Turning this off will
prevent the audio from auto switching to the speakerphone.

If you have headphones either connected or plugged in to your device, you will also find another option under this submenu called “Channels”. This allows you to determine which channels will be heard through your connected Bluetooth or other audio output method. You can choose to have either your speech or
the system sounds routed to one channel or another. It’s not possible to deselect both channels. Also, if you have a mixer or DJ controller that supports
it, you now have the ability in this menu to configure whether VoiceOver uses your mixer or another sound source for speech output.

Auditory Validation

In iOS 10, there are new sounds specifically associated with VoiceOver for several events. When you lock your screen, there will now be a sound instead
of the verbal confirmation that the screen is locked. (As an aside, there is also a new screen lock sound in iOS 10.) If you press the home button, and
Touch ID does not recognize your fingerprint, there will also be a different sound played than when it does recognize it.

Jumping Through Messages Means Action!

The way in which message threads in the Mail app are displayed has changed. Instead of having each message in its own window in Portrait view, now all
of the messages in a thread show on the same screen. A new rotor option called “Messages” allows VoiceOver users to quickly jump from message to message and navigate threads more efficiently. This rotor option only shows up when you have messages organized by thread. Braille users should be aware of a bug, which you can find listed below in the “Braille” section of this review.

Speaking of Mail, the way in which previews are shown has changed. Prior to now, VoiceOver would simply read the previews of messages. As of iOS 10, you
now have to perform a 3 finger single tap to get VoiceOver to read the preview.

Braille
3D Braille!

With iOS 10, one can now press space with dots 3-5-6 on a braille display to perform a 3D Touch action. This comes in handy when you wish to launch the menu for an app while your iPhone is stored in your pocket.
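
For context, the menu this command brings up on a Home screen icon is the app's set of quick actions, which developers define roughly as follows; the bundle-style identifiers and titles below are made up:

import UIKit

// Registers two hypothetical Home screen quick actions; these are the menu
// items that 3D Touch, and now space with dots 3-5-6 on a braille display,
// can bring up on the app's icon.
func registerQuickActions(for application: UIApplication) {
    application.shortcutItems = [
        UIApplicationShortcutItem(type: "com.example.compose",
                                  localizedTitle: "Compose Message"),
        UIApplicationShortcutItem(type: "com.example.search",
                                  localizedTitle: "Search")
    ]
}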

You Can Work with This

Contracted Braille users may have noticed that this heading only contains words that are contractions. This is in recognition of the fact that Braille
Screen Input users will now receive spoken feedback when they type using contracted Braille. This was not possible in either iOS 8 or iOS 9, where only
uncontracted Braille feedback was given.

Yes, I Remember Who it’s From!

With the release of every new piece of software come bugs, or what the technology community affectionately refers to as unwanted features. For Braille users who use the Mail app, there is certainly one unwanted feature present. Whenever you organize messages by thread, the number of the message you are currently reading is displayed on each line, along with the name of the sender. This includes the header information as well as the body of the email. If you disable “Organize by Thread”, you will only see the information about who the message is from. For example, if I have my messages organized by thread, and the first line of the message says “hello Jim,” braille users will see the following: “message 1 of 1 Scott Davert Hello Jim”. All of the information preceding the word “hello” will be shown on each line of the message.

Ironically, I have started using Outlook for iOS to work around this issue when reading email messages. However, Outlook has a bug which prevents the proper editing of emails you are writing, so I have decided to read my email in one app and reply in the default Mail app. Outlook works quite well with Braille, though there seems to be no efficient way to navigate among messages when sorted by thread. I hope that Apple will fix this bug soon, so that I don’t have to use 2 apps to manage my email.

A Bug Squashed!

In iOS 9, VoiceOver users with speech could still tell which Wi-Fi network they were currently connected to; braille users, however, only saw the signal strength followed by “SSID” instead of the actual name of the network. This has been resolved in iOS 10.

Low Vision

While I have taken the information presented in this section from the feedback of a few low vision testers, I cannot personally attest to its accuracy. I am sure that any errors are on my part, and I would strongly encourage users with low vision to check out iOS 10 on another device before installing it on their own to ensure the new operating system will work for them.

More Magnification

Among the enhancements to iOS 10 for low vision users is the addition of a magnifier. Find it under Settings> General> Accessibility> Magnifier. As the
name implies, you can use your device’s camera to magnify items in your environment. It also joins the features you can turn on and off with the Accessibility
Shortcut (triple-clicking the Home button), though I’m not able to turn it on with Siri. The software behind the Magnifier seems to resemble that of the
Photos app; as such, auto-stabilization appears to be only as good as in the aforementioned Photos app, and the same is true of autofocus. This could have
interesting implications for the dual cameras found on the back of the new iPhone 7 Plus, as all of these features could be affected by having dual lenses.

There is also an auto brightness function which can automatically adjust contrast and brightness according to lighting conditions. It’s also possible to
use any sort of filtering or other visual settings you have applied to iOS, which opens up a lot of possibilities. My conclusion from all of this information
is that the new Magnifier feature is going to be a welcome one, but will not replace a dedicated video magnifier.
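
There is no public API for the Magnifier itself, but the underlying capability is ordinary camera zoom, which any app can drive through AVFoundation. A rough sketch of that piece alone, with error handling kept minimal:

import AVFoundation
import UIKit

// Digitally zooms the device's default camera, clamped to what the current
// format supports. This is only the zoom portion of what a magnifier app does;
// it is not how Apple's Magnifier is implemented.
func setCameraZoom(to factor: CGFloat) {
    guard let camera = AVCaptureDevice.default(for: .video) else { return }
    do {
        try camera.lockForConfiguration()
        camera.videoZoomFactor = min(factor, camera.activeFormat.videoMaxZoomFactor)
        camera.unlockForConfiguration()
    } catch {
        print("Could not configure the camera: \(error)")
    }
}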

More Mixing of the Old and New

A new menu under the “Vision” heading is the Display Accommodations menu. This contains several existing options as well as new features. Invert Colors and Grayscale have been moved here, but otherwise appear to be unchanged. New in iOS 10 is a series of color filters designed to assist individuals who are color blind in differentiating text on the screen. It’s also possible to adjust the intensity and hue of the filter to further accommodate this need.

An older option, Reduce White Point, was moved from the Increase Contrast menu and placed under Display Accommodations. In iOS 9, you could turn this on
or off, but now White Point is an adjustable item which gives you more control over just how much it is reduced.

Covering More Highlights

When “Speak Selection” is turned on, there are new options for the highlighting of content as it is spoken. In older versions of iOS, highlighting could only be turned on or off; now there are several ways to specify what is highlighted. You can choose to highlight words, sentences, or both words and sentences. You can also choose whether you’d rather have the highlight style be underlined text or a background color.

I Need Further Feedback

Once you navigate to Settings>General>Accessibility>Speech, you will find a new option called “Typing Feedback”. Within this menu, you can get feedback on what you type independent of VoiceOver. You can have each character, each word, and auto-text spoken. While auto-text was previously an option, the other 2 are new. Also added to this menu is the ability to have predictions spoken if this feature is enabled. If you tap and hold on a prediction, it will be spoken
aloud.

Now for a Special Announcement

The Pronunciation Dictionary I covered in great detail in the “VoiceOver” section of this document applies to the “Speak Screen” functionality as well. Any entries you add in one place will also apply to the other.

Hearing
TTY Comes to the iPhone!

iOS 10 brings a new set of features to the iPhone for users of TTY. This means that the user can not only call from TTY to TTY, but can also use relay services such as 711. Find the settings for this under Settings> General> Accessibility> Hearing> TTY. For more general information on what TTY is,
please see
this article.

TTY on the iPhone has 2 components: Software and Hardware. The Software TTY option allows the user to utilize the TTY software built in to iOS 10. The
hardware function allows the user to connect a TTY they may already have and to use the iPhone to make TTY and text relay calls. Note that an
adapter
is required to connect the iPhone to an external TTY. It’s worth noting that the iPhone 7 does not have a built-in 3.5 mm headphone jack, which is needed to use the external TTY functionality; it’s unclear at the time of writing whether the included Lightning adapter will allow for the same type of connectivity with an external TTY.

Looking at the software option, you have the ability to set up a number to call to access relay services; the ability to send text as you type it or to
send as one larger block of text; and the ability to set up your iPhone so that it always answers with TTY, regardless of whether the person calling you
also has a TTY. However, if this option is selected and one receives a voice call, it will not be forwarded to the relay number configured in the appropriate
settings. If the Software TTY option is turned on, but the feature to always answer calls as TTY is not, there appears to be no way, with VoiceOver at least, to accept an incoming TTY call.

Making a TTY call could be easier. To make a call, after turning the Software TTY option on, launch the Phone app and dial the number. Tap Call when you are ready, and a pop-up will appear asking if you would like to place a voice, TTY, or relay call. After selecting TTY or relay, the call will then go through, with the keypad and the time elapsed on the call shown. To begin using the TTY functionality, it’s necessary to hide the keypad, then select the TTY button to begin the conversation. It would seem to make more sense to begin the TTY call immediately, instead of making the person hide the keypad first and then select TTY, but maybe there is a reason for this that I’m not aware of.

Once the call is connected via TTY or relay, the text experience is, as far as I can tell as a visual user familiar with TTY, much the same as on a traditional TTY, as long as you have the option to send the text immediately enabled. If it is disabled, you will receive messages in larger chunks, but they will still continue to appear in the same block of text until the other person in the conversation types something. There is also a set of abbreviations at the bottom of the screen which may be helpful for new TTY users.

Since I was working with an eighteen-cell Braille display, I turned the option to send immediately off. As noted above, when you receive multiple messages using this format, they will appear immediately after the last message. For example, suppose I type “good afternoon” and send it, followed by “how’s the weather in Charlotte today?” Even though the messages were sent at different times, if the other party on the line has not typed anything, the 2 messages will appear on the same line. The problem with this, for a Braille user, is that if you are reading along and do not get the cue to go ahead, once you have panned past what has already been written, you will be back in the area where you can type a message. If you need to scroll back to see if further messages have been sent, you first have to scroll past everything you have already read to get to the new content. If you send a short text, such as “ok”, the new block of text will appear separate from the old one.

The other issue specific to Braille users is that, while you are notified when the other person or relay operator is typing, reading the text as it arrives may be easy visually, but it is not on a Braille display. This is because when the other person is typing, you see “top of document” or “bottom of document” shown on the Braille display. This makes reading what is being typed an issue, which can further slow an already slow process of using the phone system. I can tell you from experience that, even under the most ideal conditions, business owners and people not familiar with how relay works tend to hang up on relay calls if they are slow.

I have provided the above feedback to Apple with the hope that it will lead to a better user experience for Braille users. I commend Apple for taking an
old technology (TTY) and modernizing it to make it a solution again.

Conclusion

Just like previous iOS releases, whether you should upgrade or not depends on whether the bugs present in the new release will impact you on a greater
level than you can tolerate—and whether you feel the new features are worth the upgrade. To check out a list of bugs related to VoiceOver and Braille,
check the
AppleVis
website. To download the update over the air, go to Settings> General> Software Update, and follow the prompts onscreen. Alternatively, you can update
your device through iTunes.


Scott Davert, MA, VRT
Coordinator, New York Deaf-Blind Equipment Distribution Program
Helen Keller National Center for Deaf-Blind Youths and Adults (HKNC)
141 Middle Neck Rd.
Sands Point, NY 11050
scott.davert at hknc.org
516-393-7561 (Voice)
http://www.icanconnect.org/new-york :: HKNC: http://www.hknc.org




