[Trainer-Talk] What's new in iOS 18 accessibility for blind and DeafBlind users | AppleVis
Christopher Murphy
cjohnmurph at gmail.com
Fri Sep 13 17:49:55 UTC 2024
Thanks for passing this along!
I’ll be taking a look at this.
--
Christopher Murphy
Email: cjohnmurph at gmail.com
From: Trainer-Talk <trainer-talk-bounces at nfbnet.org> On Behalf Of Jesse Johnson via Trainer-Talk
Sent: Friday, September 13, 2024 12:31 PM
To: List for teachers and trainers of adaptive technology <trainer-talk at nfbnet.org>
Cc: Jesse Johnson <jayjohnson66 at me.com>
Subject: Re: [Trainer-Talk] What's new in iOS 18 accessibility for blind and DeafBlind users | AppleVis
Hardy, I am Jesse Johnson. Thanks for the info. I would like to know if there's any way to transfer information from the iPhone to the braille display. Thanks in advance.
Sent from my iPhone
On Sep 13, 2024, at 11:35 AM, Scott Davert via Trainer-Talk <trainer-talk at nfbnet.org> wrote:
Passing along FYI.
https://applevis.com/blog/whats-new-ios-18-accessibility-blind-deafblind-users
What's new in iOS 18 accessibility for blind and DeafBlind users
Introduction
Fall is almost here in the northern hemisphere, which means it's time for another major release of iOS. There are certainly mainstream changes of note, such as a new Passwords app; an expanded Control Center; changes to the Messages app like the addition of support for <https://www.sinch.com/blog/what-is-rcs-messaging/> RCS messaging; and many more. <https://www.idownloadblog.com/2024/08/01/top-ios-18-features/> This article from iDownloadBlog discusses their top 75 features coming in iOS 18. Apple Intelligence was also introduced during the <https://www.applevis.com/blog/wwdc-2024-recap-ai-more-ai-some-not-ai> Worldwide Developers Conference in June. Apple Intelligence features will only be available on the iPhone 15 Pro and Pro Max models along with the iPhone 16 line-up. Additionally, Apple Intelligence will not be available at the launch of iOS 18.0; beta testing of Apple Intelligence <https://www.macrumors.com/2024/07/29/apple-releases-ios-18-1-apple-intelligence/> began with iOS 18.1. According to this <https://www.macrumors.com/2024/06/10/ios-18-compatible-with-these-iphone-models/> MacRumors article, any iPhone that supports iOS 17 will also support iOS 18; the article lists all compatible devices.
While there are many mainstream changes in iOS 18, there are also many changes and enhancements to accessibility features for those who are blind and DeafBlind. Apple <https://www.apple.com/newsroom/2024/05/apple-announces-new-accessibility-features-including-eye-tracking/> announced some of these features on Global Accessibility Awareness Day, while others were discovered through first-hand use of the software.
As has become standard practice, I've done my best to outline the new accessibility features for blind and DeafBlind users in this version of iOS. While I have worked extensively with iOS 18 since the first beta release in June, there will inevitably be things I have missed. Here is what I have learned, and I look forward to learning what others find as they work with this release.
VoiceOver
What's New?
After upgrading your device to iOS 18, if you have notifications enabled for Accessibility Settings, you will be presented with a notification inviting you to check out what is new with VoiceOver and braille. Activating this notification takes you to a summary of the various changes. However, the list offers few details about how each of these features actually functions. Because of this, my readers get to deal with me for at least one more year. You can also find this section by activating the "What's New" link in Settings>Accessibility>VoiceOver>What's New in VoiceOver.
What's This Thing Do?
iOS 18 comes with an interactive tutorial for touchscreen users of VoiceOver. The very basics are covered in the first lesson, with other concepts added as you go along. There are tutorials available for navigation, scrolling, the VoiceOver Rotor, text entry, and more. For new VoiceOver users of the touchscreen, these tutorials are a good place to start learning how to use VoiceOver most effectively. You can find the tutorial by going to Settings>Accessibility>VoiceOver>VoiceOver Tutorial. Note that because the tutorial is interactive, some standard gestures will be unavailable during certain parts of it.
New Speech Features Bring a New Rotor Option
As part of the continued expansion of different TTS options, iOS 18 brings a new Rotor item. This is the Voices Rotor, which replaces the Language Rotor we've come to know. The basic difference between the Language Rotor found in iOS 17 and the Voices Rotor in iOS 18 is that you can choose any number of speech options in a given language. Prior to iOS 18, users had to specify which language the chosen voice belonged to. Now, you can freely add as many voices in as many languages as you wish. For example, I conduct technology assessments as part of my job. Many of the clients I work with have a form of both vision and hearing loss, so they sometimes have very specific speech needs. I'm able to have all of the U.S. English voices installed on a demonstration device and can now cycle through them more effectively with each use case. These settings can also be reached from within VoiceOver's Quick Settings, if configured to appear there. For further details on VoiceOver's Quick Settings feature, read more in this post covering <https://www.applevis.com/blog/whats-new-ios-15-accessibility-blind-deafblind-users> what's new in iOS 15. Note that voices in the iOS 17 Language Rotor do not carry over to the Voices Rotor in iOS 18. One of the things I had to do when upgrading was choose which voices I wanted in my Voices Rotor: though the voices I had previously downloaded were still there after the upgrade, I had to add them to the Voices Rotor even though they had previously been in my Language Rotor. To do this, head to Settings>Accessibility>VoiceOver>Rotor and add Voices to your list of available rotor options.
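As a side note for anyone who tinkers with the developer side of speech, the pool of voices the Voices Rotor draws from is the same one exposed to apps through the public AVSpeechSynthesisVoice API. The short Swift sketch below simply lists the installed voices grouped by language; it is illustrative only and does not configure the rotor itself.

import AVFoundation

// List every speech voice currently installed, grouped by language code.
// This is the same pool of voices the new Voices Rotor lets you pick from,
// though the rotor itself is configured in Settings, not in code.
let voicesByLanguage = Dictionary(grouping: AVSpeechSynthesisVoice.speechVoices(),
                                  by: { $0.language })

for (language, voices) in voicesByLanguage.sorted(by: { $0.key < $1.key }) {
    print("\(language):")
    for voice in voices {
        // .quality distinguishes default, enhanced, and premium variants.
        print("  \(voice.name) (quality: \(voice.quality.rawValue))")
    }
}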
More Variability and More Voices
By navigating to Settings>Accessibility>VoiceOver>Speech, the user will find a familiar menu. The default voice option allows you to modify your default voice, and beyond this the user will find the ability to add new voice options to the Voices Rotor. As noted above, the voices I had chosen to download were still on my device, but I did have to add them from the list of available options. Readers may recall the introduction of a feature called Personal Voice in <https://www.applevis.com/blog/whats-new-ios-17-accessibility-blind-deaf-blind-users> iOS 17. For those who have created a personal voice, it will be available as a VoiceOver voice in iOS 18. I find Personal Voice with VoiceOver to be somewhat less flat than when trying to use Live Speech, but it's also a little creepy to hear a synthesized version of my own voice. Among the English TTS options, a premium version of the U.S. voice Zoey was added along with a premium version of the Australian voice known as Lee. Though there don't appear to be other additions in terms of voices, there are certainly new ways the user can modify each downloaded voice. After choosing a voice, there will be some new options the user can control independently. In addition to the per-voice customization options <https://www.applevis.com/blog/whats-new-ios-17-accessibility-blind-deaf-blind-users> introduced in iOS 17,
users can now set a volume minimum and maximum, as well as use an equalizer to boost or suppress certain frequencies. Siri voices can now be customized with presets: Default, Even Inflection, Faster Pace, Even Inflection and Faster Pace, or Narration. Another smaller enhancement is that there is now a preview of the changes you are making right on the screen for each individual voice. It's not possible to explain in writing how some of these tweaks come across on each voice; interested users should have a listen to Thomas Domville's upcoming podcast demonstrating these changes to get the full auditory experience.
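VoiceOver's per-voice settings are not something apps can control, but the same kinds of knobs exist in Apple's public speech API for an app's own output. The hedged Swift sketch below shows rate, pitch, and volume being adjusted on an AVSpeechUtterance, which may help illustrate what the new sliders are doing; it is not the VoiceOver implementation.

import AVFoundation

// VoiceOver's per-voice settings are not exposed to apps, but the same ideas
// (rate, pitch, volume) exist on AVSpeechUtterance for an app's own speech.
let utterance = AVSpeechUtterance(string: "Testing a customized voice.")
utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
utterance.rate = AVSpeechUtteranceDefaultSpeechRate * 1.2  // a bit faster
utterance.pitchMultiplier = 0.9                            // slightly lower pitch
utterance.volume = 0.8                                     // 0.0 through 1.0

// In a real app, keep a reference to the synthesizer until speech finishes.
let synthesizer = AVSpeechSynthesizer()
synthesizer.speak(utterance)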
Duck it Your Way
Audio Ducking has been around since <https://www.applevis.com/blog/what-s-new-ios-8-accessibility-blind-low-vision-deaf-blind-users> iOS 8, but has gone largely unchanged since its introduction. The feature has always been either on or off. iOS 18 brings a more versatile experience, allowing users to choose whether Audio Ducking is off, active only when VoiceOver is speaking, or always active.
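For the curious, "ducking" is the same mechanism apps request from the system when they want other audio lowered while their own sound plays. Below is a minimal Swift sketch, assuming an app playing its own speech or sound effects through AVAudioSession; VoiceOver's new three-way setting is the user-facing equivalent of deciding when ducking applies.

import AVFoundation

// "Ducking" simply means lowering other audio while your own audio plays.
// An app can request the same behavior for its own sounds via AVAudioSession.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playback,
                            mode: .spokenAudio,
                            options: [.duckOthers])
    try session.setActive(true)
    // ... play speech or a sound effect here ...
    // Deactivating with .notifyOthersOnDeactivation lets other audio return
    // to full volume when you are done.
    try session.setActive(false, options: [.notifyOthersOnDeactivation])
} catch {
    print("Audio session configuration failed: \(error)")
}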
Go Beyond 100%
Speaking of audio enhancements, it is now possible to raise the VoiceOver volume beyond 100%. This lets you control the ratio of speech to other audio more effectively: if you set the VoiceOver speech volume to 115%, for example, VoiceOver's audio will always be 15% louder than whatever else is playing on your phone. To set this up, go to Settings>Accessibility>VoiceOver>Rotor and add Volume Control to the list of available rotor items.
Delays are Sometimes Good
One frustration with the touchscreen is that it's easy to bump, which can move focus away from where you want it. With iOS 18, you can now control how long a touch must last before it is processed. This setting can be found under Settings>Accessibility>VoiceOver>Delay Before Selection. By default, it is set to 0 seconds, but you can increase it if you wish.
Start VoiceOver with Confidence
If haptics are enabled, VoiceOver will now confirm with a vibration when it has started. This is particularly helpful for users who cannot hear speech, since it confirms that VoiceOver is running. It can be toggled by going to Settings>Accessibility>VoiceOver>VoiceOver Sounds & Haptics.
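Third-party apps cannot control VoiceOver's startup haptic, but they can observe the public VoiceOver status notification and respond in a similar spirit. Here is a hedged Swift sketch of that idea.

import UIKit

// A third-party app cannot control VoiceOver's own startup haptic, but it can
// observe the public status notification and respond in a similar spirit.
// Keep a reference to the observer so the observation stays alive.
let voiceOverObserver = NotificationCenter.default.addObserver(
    forName: UIAccessibility.voiceOverStatusDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    if UIAccessibility.isVoiceOverRunning {
        // Mirror the new built-in behavior: confirm with a vibration.
        UIImpactFeedbackGenerator(style: .medium).impactOccurred()
    }
}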
New Languages
VoiceOver adds 2 new supported languages in iOS 18: Lithuanian and Kazakh. You can find the new voices for these languages under Settings>Accessibility>Speech>Add Rotor Voices. These languages also have braille tables available, which can be added by visiting Settings>Accessibility>VoiceOver>Braille>Braille Tables>Add Braille Table.
Live Recognition Comes to the Rotor
When <https://www.applevis.com/blog/whats-new-ios-16-accessibility-blind-deafblind-users> iOS 16 was introduced, one of the new features was called Detection, available through the Magnifier app at that time. iOS 18 gives users the option to add these features to the Rotor. The range of things detectable through Live Recognition has also expanded to include furniture and scene descriptions. After adding Live Recognition to the rotor, navigate to it and then swipe up or down with 1 finger to change the type of recognition; double tap the desired option to toggle it on or off. I found the scene description feature interesting, but as a braille user found it difficult to hold my phone, braille display, and cane while traveling. For speech users, however, this is another tool that has been expanded to provide access to more information. As was the case in iOS 17, the user can also perform a 4 finger triple tap to activate Live Recognition.
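Live Recognition itself is not exposed as a developer API, but the Vision framework's image classification request gives a rough sense of how a scene description pipeline might begin. The Swift sketch below is illustrative only, and the cgImage source is assumed to come from the camera.

import CoreGraphics
import Vision

// Live Recognition has no public API; this only shows how Vision can turn an
// image into candidate scene labels. `cgImage` is assumed to come from the camera.
func describeScene(in cgImage: CGImage) {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    do {
        try handler.perform([request])
        let observations = (request.results as? [VNClassificationObservation]) ?? []
        let topLabels = observations
            .sorted { $0.confidence > $1.confidence }
            .prefix(5)
            .map { "\($0.identifier) (\(Int($0.confidence * 100))%)" }
        print("Possible scene contents: \(topLabels.joined(separator: ", "))")
    } catch {
        print("Classification failed: \(error)")
    }
}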
Braille Screen Input Gets New Gestures and Major Upgrades
iOS 18 contains several noteworthy changes for Braille Screen Input (BSI), a feature that allows a user to type in braille on their touchscreen. It is my view that iOS 18 may be the biggest upgrade yet for users of this feature. One of the many changes is that you can now invoke BSI from anywhere by double tapping with 1 finger at each edge of the screen. Before this will work, though, it may be necessary to enable the gesture: head to Settings>Accessibility>VoiceOver>Braille>Braille Screen Input>Use Activation Gestures and turn it on. To exit Braille Screen Input, slide 2 fingers toward each other. If you still prefer to use the rotor to activate and deactivate BSI, that remains an option.
BSI in Any Text Field
Continuing through the BSI menu, there is a new option to start BSI in Braille Entry Mode whenever a text field is encountered. I found this to work as expected, and it would almost always follow me from one text field to another. For those who use BSI heavily, this option can save a large amount of time, since BSI is always ready when you are.
BSI Takes Command
For many years, braille display users have enjoyed a large set of shortcuts for navigating the operating system; there are many more shortcuts available on a braille keyboard than there are touch gestures. iOS 18 brings similar functionality to BSI users with a new feature called Command Mode. To activate Command Mode while BSI is active, swipe left or right with 3 fingers until you locate this mode. With the BSI gestures enabled as outlined above, you can also triple tap with 1 finger at each edge of the screen to activate this mode. When Command Mode is enabled, instead of writing text, the device interprets each entry as a Space plus dots combination, what the AT industry has long called a chord. For example, pressing dot 4 in this mode will produce the command Space with dot 4, which performs the touchscreen equivalent of swiping right with one finger. There are many commands available; to familiarize yourself with the possibilities, this webpage lists <https://support.apple.com/en-us/118665> a large set of commands to start with. Regarding the Rotor in Command Mode, swiping left with one finger will take you to the previous Rotor option, while swiping right with one finger will move the Rotor to the next option. To move by that specific Rotor item, swipe up or down. There are also options to add commands to BSI itself. If you do not wish to exit Command Mode automatically, there is another menu option called Keep Active Until Dismissed. There is also an option called Visual Text Feedback; when enabled, each command you enter is shown as text on the device's screen.
Quickly Launch Apps Using Command Mode
When on the Home Screen, and when BSI is active in Command Mode, you can type the first few letters of an app and then swipe right with 2 fingers to launch that app. The more letters you type, the more accurate the result. For example, typing "spot" on my Home Screen first brings up an app called Spot The Station. However, if I type "spoti" and then flick right with 2 fingers, this would launch the Spotify app.
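Apple has not documented exactly how Command Mode matches app names, but the behavior described above is consistent with a simple case-insensitive prefix match. The tiny Swift sketch below is purely hypothetical and only illustrates why typing more letters narrows the result.

// Hypothetical illustration only: Apple has not documented the matching rule.
// This sketch assumes a simple case-insensitive prefix match over app names.
let installedApps = ["Spot The Station", "Spotify", "Safari", "Settings"]

func matches(for typed: String, in apps: [String]) -> [String] {
    apps.filter { $0.lowercased().hasPrefix(typed.lowercased()) }
}

print(matches(for: "spot", in: installedApps))   // ["Spot The Station", "Spotify"]
print(matches(for: "spoti", in: installedApps))  // ["Spotify"]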
Give Me Some Feedback
When using BSI, VoiceOver can provide both audio and haptic feedback while typing; under the Typing Feedback heading, there are options for both. There is also a setting for mode announcements, where the user can have this information spoken, indicated with a sound, or both each time they switch modes. It is also now possible to use any braille table VoiceOver has access to with BSI.
A New Cursor Movement and Text Selection Method
Another new feature in BSI is a different method of moving the cursor and selecting text. These gestures require that you keep one finger held down on the touchscreen. While doing this, you can move the cursor around by swiping left and right with 1 finger of the other hand. To specify whether you wish to move by character, word, or line, swipe up or down with 1 finger. To select text by the movement unit specified, swipe left or right with 2 fingers. Remember, a finger of the other hand must stay held down on the screen while performing these gestures.
Braille Display Users Get Some Love Too
Reconnecting
As anyone who has used a braille display over Bluetooth for any length of time can tell you, the connection is not always stable. For a blind user who can access their device via speech, reconnecting the braille display manually may be an inconvenience, but not a showstopper. For an individual who is DeafBlind, however, there may be no visual or auditory information available at all. iOS 18 brings a command which will attempt to reconnect your braille display to your iOS device. It does need to be set up first by assigning a gesture to this function. To do so, go to Settings>Accessibility>VoiceOver>Commands>Braille>Reconnect Braille Display, then choose Add Gesture and assign a gesture of your choice. For example, I don't use Grouped Navigation, so I don't need the gestures to move in or out of groups; I chose the 2 finger swipe left and found that it did allow my display to reconnect successfully, though this gesture may not be the most appropriate choice for some users.
Searching for Something?
Apple added the ability to quickly launch apps from a braille display in iOS 17 by pressing Enter on the Home Screen and then typing the name of the app you wish to launch; iOS 18 brings this functionality everywhere. Pressing Enter will now take you to items containing the text you are searching for, when available. This works similarly to Space with F, the VoiceOver Find function, though I've found that VoiceOver Find will place me exactly where I want to be on a web page, whereas the new method only places me on the line where the text was found. Still, it's another option that sometimes works when Space with F doesn't.
More Variable Input and Output
Many iOS versions ago, Apple separated braille input from output, meaning users could write in one code, such as uncontracted braille, while reading in contracted braille or vice versa. However, both still had to come from the same braille table. So if you wanted, for example, to transition from the pre-2016 North American standard to Unified English Braille, you were not able to read in one table and write in another. It is now possible to use any added braille table as your input and an entirely different one for output. Since these are now separate rotor options, it is also faster to switch tables. This also means that Space with dots 2-3-4-5 will cycle your output through all of the installed tables, and Space with dots 2-3-6 will cycle through the input options. If the tables are configured to change in sync, either of the previously mentioned commands will change both the output and input table.
Now Being a Cursor Doesn't Require Blinking
I'm not referring to a foul-mouthed Scott Davert, but to the cursor on the braille display, which is represented by dots 7 and 8. Though this isn't a setting specific to braille display users, it impacts the braille experience in the same way it does the visual one. It is now possible to turn off the blinking cursor. If this is your preference, head to Settings>Accessibility>Motion>Prefer Non-Blinking Cursor and turn it on.
Multi-Line Braille Support
Though the “what’s new in VoiceOver” section makes reference to the support for multi-line displays, I was unable to get the <https://www.orbitresearch.com/product/orbit-slate-520-multi-line-refreshable-braille-display/> Orbit Slate to display more than one line of braille at a time. Since it’s the only multi-line display I have access to, I’m unable to cover any further details in this section.
Low Vision
As I often write, I'm not a low vision user and do not have any visual frame of reference other than what AI tells me. The following information was gathered from discussions with low vision users and from talking to colleagues who are more familiar with this part of accessibility; any errors below are mine. I would strongly encourage anyone, but especially low vision users, to examine iOS 18 on another device before upgrading.
Hover Over These Words
One of the issues I've encountered while training people with low vision is the difficulty of visually orienting to the onscreen keyboard. A new feature called Hover Typing may further support low vision users with different typing needs on the touchscreen. It can be found by going to Settings>Accessibility>Keyboards & Typing>Hover Typing. Hover Typing displays larger text while typing, and the user can choose where this text is displayed. By default, the text docks with the on-screen keyboard. If that is not your preference, there are also options to display the text at the top of the screen, or in-line with the typing itself. When choosing Inline, the text is displayed in a box above what you are typing; it does not change the actual text, it only shows what you are typing in the box for easier viewing. It is possible to control the text size, font, and color, including the color of the insertion point, the background, the overall text, and the border. It's also possible to adjust the colors used to flag misspelled and autocorrected words. There are quite a few options available, which will hopefully bring a more colorful and visually comfortable typing experience.
Reader Mode
One of the upgrades to the Magnifier app in iOS 18 is the addition of Reader Mode, which can convert text captured in images into plain text. After opening the Magnifier app, the user will need to acquire an image that Reader Mode can use. Once this happens, "Reader Mode Available" is temporarily displayed on the screen and you can tap the Reader button, then select the text icon on the screen. Once Reader Mode is activated, the text acquired from the image is displayed in a larger font. There are also options to listen to the acquired text: tap the Text Formatter button, then tap Listen. By default, iOS highlights each word visually as it is spoken. The user can also adjust the speaking rate.
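Reader Mode is a user-facing feature rather than a developer API, but the Vision framework's text recognition request illustrates the underlying idea of lifting text out of an image and handing it to speech. A hedged Swift sketch follows; the image source is assumed.

import CoreGraphics
import Vision
import AVFoundation

// Reader Mode is a user-facing feature, but Vision's text recognition request
// illustrates the underlying idea: lift text out of an image, then speak it.
// `cgImage` is assumed to come from a Magnifier-style camera capture.
let synthesizer = AVSpeechSynthesizer()  // kept alive so speech can finish

func readAloud(from cgImage: CGImage) {
    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else { return }
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        synthesizer.speak(AVSpeechUtterance(string: lines.joined(separator: "\n")))
    }
    request.recognitionLevel = .accurate  // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}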
Detect the Action
For those who utilize the Magnifier's Detection Mode and have a device with an Action Button, detection mode can be activated simply by pressing this button once it is configured accordingly. You can set it up by going to Settings>Action Button. It's important to note that, like Live Recognition, this feature works from the live camera feed; you are not taking still photos.
Dimming it Down
Brightly flashing lights during videos can be annoying for some users and can cause access issues for others. iOS 18 brings the ability to dim these bright flashes. Head to Settings>Accessibility>Motion and turn on "Dim Flashing Lights" to take advantage of this functionality.
Hearing
New Background Sounds
Upgrading to iOS 18 will give the user 2 new background sounds to choose from. Night and Fire are the 2 new options. Find them under Settings>Accessibility>Audio & Visual>Background Sounds.
Feeling the Music
For those who wish to add an element to their music beyond the auditory, there is a feature in iOS 18 which will allow the user to take in the music through haptics. It can be located by going to Settings>Accessibility>Music Haptics and turning it on. At launch, the only app supported is Apple's own Music app. As a Spotify subscriber, I'm hoping they get it together and support this feature.
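Apple has not said how Music Haptics generates its patterns, and the sketch below is not that implementation; it only uses the long-standing Core Haptics framework to show, in Swift, the general idea of a rhythmic pattern of taps rendered by the Taptic Engine.

import CoreHaptics

// Not the Music Haptics implementation -- just a minimal Core Haptics sketch
// of the general idea: a rhythmic pattern of taps played on the Taptic Engine.
func playSimpleBeat() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // Four evenly spaced transient taps, alternating strong and soft.
    let events: [CHHapticEvent] = (0..<4).map { beat in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity,
                                       value: beat % 2 == 0 ? 1.0 : 0.5),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6)
            ],
            relativeTime: Double(beat) * 0.5
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}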
Another Year, Another New Set of Features for Siri
Only Siri Can Interrupt
Sometimes in loud environments, background noise triggers Siri to start listening even as it is responding to a question. iOS 18 brings a setting which requires the user to say "Hey Siri" or "Siri" to interrupt Siri's responses. Find this option under Settings>Accessibility>Siri and then enable Require "Siri" For Interruptions.
Siri Can Understand More Speech
While in this Siri menu, there is another option which may be helpful to some, called "Listen For Atypical Speech." A friend of mine from the Carolinas who has a rather strong accent reported that Siri recognizes his voice much better than it did in iOS 17. I wouldn't consider a southern accent to be "atypical," but he reports "decent improvements" when dictating.
Vocal Shortcuts
Another handy feature in iOS 18 is the ability to add vocal shortcuts that invoke specific actions. This feature allows you to set up Siri shortcuts with custom phrases. After deciding on a unique phrase for the iPhone to listen for, you will need to repeat it 3 times to train it. While iOS will listen for these shortcuts most of the time, it will not do so while the screen is dimmed unless the phone is connected to a power source.
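Vocal Shortcuts themselves are configured entirely by the user, but the actions they can trigger are often exposed by apps through the App Intents framework. The Swift sketch below shows a minimal intent with a suggested Siri phrase; the intent name and phrasing are my own illustration, not from Apple's documentation, and initializer details can vary between SDK versions.

import AppIntents

// A minimal App Intent an app could expose; a user could then attach their own
// Vocal Shortcut phrase (or use the suggested phrase below) to trigger it.
// Names and phrasing here are illustrative only.
struct StartReadingSessionIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Reading Session"

    func perform() async throws -> some IntentResult {
        // In a real app, kick off the reading session here.
        return .result()
    }
}

struct ReadingShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: StartReadingSessionIntent(),
            phrases: ["Start a reading session in \(.applicationName)"],
            shortTitle: "Start Reading",
            systemImageName: "book"
        )
    }
}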
Conclusion
Whether your access modality is speech, braille, or print, Apple has made some improvements to the user experience in iOS 18. The new mainstream functions are welcome changes, and the new accessibility features are just as plentiful. As I've written in so many ways since the first edition of this post with iOS 5, whether you should upgrade depends on your specific use case; I would recommend checking out AppleVis' list of new and fixed bugs before doing so. iOS 18 is a free download that will be available for all supported devices on September 16. If needed, more information on how to update the software on your device is available on <https://support.apple.com/en-us/118575> this Apple Support page.
Sent from my iPhone