<html><head><meta http-equiv="content-type" content="text/html; charset=utf-8"><title>What's New in iOS 17 Accessibility For Blind and Deaf-Blind Users | AppleVis</title></head><body dir="auto"><div dir="ltr"><div class="original-url"><a href="https://www.applevis.com/blog/whats-new-ios-17-accessibility-blind-deaf-blind-users">https://www.applevis.com/blog/whats-new-ios-17-accessibility-blind-deaf-blind-users</a></div><div id="article" role="article">
<div class="page" style="text-align: start; overflow-wrap: break-word; max-width: 100%;"><h1 class="title" style="font-weight: bold; font-size: 1.95552em; line-height: 1.2141em; margin-top: 0px; margin-bottom: 0.5em; text-align: start; hyphens: manual; display: block; max-width: 100%;">What's New in iOS 17 Accessibility For Blind and Deaf-Blind Users</h1><p style="max-width: 100%;">It's difficult to believe that iOS will be turning into a legal adult next fall, though the iOS 17 upgrade is far from being minor. Many new features and functions will be available for everyone. To check out some of the mainstream changes, <a href="https://www.apple.com/ios/ios-17-preview/" style="color: rgb(65, 110, 210); max-width: 100%; text-decoration: underline;">Apple's official iOS 17 preview page may help.</a> Alternatively, our main post announcing iOS 17 offers a large list of mainstream enhancements as well as a list of changes in accessibility. Not all devices which supported iOS 16 will support iOS 17; for a list of devices which will, <a href="https://www.cnet.com/tech/mobile/apples-ios-17-wont-work-on-every-iphone-these-models-will-get-the-boot-on-monday/" style="color: rgb(65, 110, 210); max-width: 100%; text-decoration: underline;">check out this CNET article</a> The goal of this article is not to repeat the information widely available from other sources, but to look at it in a bit more detail where new accessibility features are concerned. I would like to thank my colleague Juan Ramos for his contributions to the article from a visual perspective.</p>
<h3 style="font-weight: bold; font-size: 1.25em; max-width: 100%;">Mainstream Changes</h3>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">Siri Again</h4>
<p style="max-width: 100%;">The VoiceOver iterations found in iOS 17 now have the ability to be sped up or to be slowed down. Also, the word "hey" is no longer necessary when summoning Apple's virtual assistant. Finally, it is possible to speed up and slow down the speech rate of Siri when it is not being utilized with VoiceOver. This setting is found under Settings > Accessibility > Siri, and is in the form of an adjustable slider.</p>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">Transcription of Audio Messages</h4>
<p style="max-width: 100%;">iOS 17 also brings the ability to have your audio messages transcribed as well as any type of voicemail. While the transcripts are useable with braille and VoiceOver for voicemail, transcripts of audio messages are currently not useable with VoiceOver. Once you receive and audio message, you can select it and then find the "more" button. This will bring up a transcript of the audio message on screen. However, VoiceOver still reports "Transcript not available".</p>
<h3 style="font-weight: bold; font-size: 1.25em; max-width: 100%;">VoiceOver</h3>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">Get Notified Only When You Want</h4>
<p style="max-width: 100%;">For many years, VoiceOver users have had the option of whether they prefer to have VoiceOver always speak notifications or not. Even though the setting was there, if I had speech on when this was set to off, notifications were still being read out loud. In iOS 17, this has been improved and expanded to include several options. To check them out, head to Settings > Accessibility > VoiceOver > Verbosity > System Notifications. Here you will find options to set how you would like VoiceOver to handle notifications in different situations. When the screen is locked and a notification arrives, VoiceOver is now able to speak the notifications as they arrive or speak the amount of notifications since the device was last unlocked. It is also possible to show these notifications in braille only, or to have the device do nothing. One can also configure what happens with banner notifications in this submenu. The options are to speak, play a haptic, braille, and do nothing. Finally, it is possible to silence the output of VoiceOver when notifications come in altogether by turning the Ring Switch on or off.</p>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">More Speech Customization</h4>
<p style="max-width: 100%;">iOS 17 also brings new speech options to all supported VoiceOver voices. These new settings can be found by navigating to Settings > Accessibility > VoiceOver > Speech and then selecting the language and desired TTS voice. After selecting the voice, flick up or down with one finger, or press space with dot 3 or dot 6 on a braille display, to find a new feature called Per-Voice Settings. The available options in Per-Voice Settings vary based on the synthesizer. For example, Vocalizer voices offer options for customizing sentence pause and timbre; while the Alex voice offers customizations for pitch range and WPM minimum and maximum. In all instances, Thankfully, included in Per-Voice Settings is a "Reset to voice defaults" button in case the changes you have made are not what you prefer. All screens of the per-voice settings also have a "preview" button, so it is possible to listen to what you have created to determine whether you've created a monster, a perfect sounding voice for your needs, or a monstrously good sounding voice to your ears.</p>
<p style="max-width: 100%;">In <a href="https://www.applevis.com/blog/whats-new-ios-16-accessibility-blind-deafblind-users" style="color: rgb(65, 110, 210); max-width: 100%; text-decoration: underline;">iOS 16,</a> Apple added the Eloquence text-to-speech engine to its list of available speech options. With the 17th iteration of iOS, many options have been added to the Eloquence voices. After navigating to Settings > Accessibility > VoiceOver > Speech > Voice > Eloquence > the preferred voice and then flicking up twice, the user will find an option called "Open per-voice settings. Among the options, the user can now configure: Rate Multiplier, head size, pitch, pitch range, breathiness and roughness. It is quite difficult to explain each of these parameters in writing, so I would advise those with interest to check out the AppleVis podcast done by Thomas Domville which demonstrates these features. It should be available shortly after the release of this article. Also on this screen is a toggle for a higher sampling rate. Other new options for Eloquence include phrase prediction, whether you would like the abbreviation dictionary to be in use, and the community-driven pronunciation dictionary.</p>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">Predictably Better!</h4>
<p style="max-width: 100%;">This is true, at least where it concerns how VoiceOver will communicate the presence of predictive text. One can locate these settings by going to Settings > Accessibility > VoiceOver > Verbosity > Predictive Text Feedback. There are 2 new series of options: one is for communicating when predictive text appears, and the other will communicate when predictive text has been successfully entered. With both adjustable options, you have the ability to have the information communicated by playing a sound, speaking, through braille, by changing the pitch of the speech, any combination of those options, or to do nothing. It may be worth noting that some VoiceOver users report that recommendations are silent in certain apps, even though they have selected speak. Others have found that with the "predictive text" setting configured to not speak recommendations, VoiceOver will still read out the recommendations in some apps like Mail. So these options are a great idea, albeit a work in-progress.</p>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">New Haptic Feedback</h4>
<p style="max-width: 100%;">In iOS 17, haptic feedback is used by VoiceOver to communicate more things. For example, when the Lock Screen is open and the display goes to sleep, the user will feel a gentle Haptic as well as hearing the high-pitched sound to indicate the display has dimmed. Haptics are also now played when an element is activated, similar to the familiar sound that has accompanied element activation since the very early days of iOS. There is now also an option to customize haptic intensity. To customize all things related to VoiceOver haptic feedback, go to Settings > Accessibility > VoiceOver > Audio > VoiceOver Sounds & Haptics.</p>
<h3 style="font-weight: bold; font-size: 1.25em; max-width: 100%;">Braille</h3>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">Connection Concerns</h4>
<p style="max-width: 100%;">It has been reported by <a href="https://www.applevis.com/forum/apple-beta-releases/rcs-are-now-available-ios-17-macos-14-watchos-10-more" style="color: rgb(65, 110, 210); max-width: 100%; text-decoration: underline;">several users</a> that the connection process in the public release of iOS 17 has become a bit unpredictable. In this author's experience, there are significant connection issues with the Focus displays as well as the Braille Sense 6, though both the Brailliant BI 20X and Mantis Q 40 connect very reliably. Other users have reported no issues with the Focus, so your experience may vary.</p>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">Sound Off With Sound Curtain!</h4>
<p style="max-width: 100%;">There are quite a few new options for braille users in iOS 17, some of which are only applicable to the braille experience. For example, one new feature is called Sound Curtain. Much like Screen Curtain darkening the screen, Sound Curtain mutes all sounds other than emergency alerts. To enable Sound Curtain, move to Settings > Accessibility > VoiceOver > braille and turn the switch. If you are sure that's what you want to do, you will be asked to confirm. Music, VoiceOver speech and other sounds from your iPhone will be silent. This can be helpful to many, but especially for those braille users who do not have any hearing to tell when their speech is on. Sound Curtain brings more assurance that the user's speech and sounds are all turned off. Like the Screen Curtain, Sound Curtain doesn't turn the sounds off, it covers them up. For the moment at least, turning speech on will still cut into hearing aids that have VoiceOver set to go to them, but it will be silent. If you have hearing aids connected and do not wish to have your hearing interrupted if you accidentally turn on speech, it has been my experience that routing audio to the internal speaker of the iPhone resolves this issue.</p>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">Synchronized Startup</h4>
<p style="max-width: 100%;">Continuing in the braille submenu, there is also a new feature which will allow the user to turn Bluetooth on automatically whenever voiceOver starts. This means that if a user turns Bluetooth off by accident, for example, restarting VoiceOver will turn it back on. The good news is that this is a toggle, so the user can turn it on or off as needed.</p>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">No Naps in iOS 17</h4>
<p style="max-width: 100%;">Traveling to Settings > Accessibility > VoiceOver > Braille > [your braille display model] > more info, you will encounter a new setting which allows you to control whether a braille display stays connected when your phone is locked or not. It is on by default, which is the behavior we have sene in the past, but always keeping the braille device in range and connected insures there are less Bluetooth connection issues. The draw-back, of course, is that having a braille display always connected can drain the battery quite quickly.</p>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">Faster Launching for Braille</h4>
<p style="max-width: 100%;">iOS 17 also brings a new way to launch applications from the Home Screen in braille. To use this, from the Home Screen, press dot 8, or space with dot 8 if in 8 dot braille. Then type the app you are looking for; Then, press dot 8, or space with dot 8 when in 8 dot mode. For those using either contracted or uncontracted braille, a list of matching apps will appear with a full cell at each end of the display. Move through the results of matching items by pressing space with dot 1 or space with dot 4. When you find the app you want, press dot 8 again or a cursor routing button if you prefer--and the app will open. If you wish to cancel your search without launching an app, you can press space with b to return to your Home Screen.</p>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">New Commands!</h4>
<p style="max-width: 100%;">There are a couple of new commands available for braille display users as well. Navigate to Settings > Accessibility > VoiceOver > Braille > [your model of braille display] > more info > Commands and under the "keyboard" category, one of the new options is to toggle text selection. Though it appears this function mostly works as it should, it is a bit challenging to use, since VoiceOver does not often announce in any way when Text Selection is turned on or off.</p>
<p style="max-width: 100%;">There have always been commands to go to the first item, last item and so many more choices. iOS 17 adds a command to move to the center of the screen. This seems to work as advertised, and if you know the layout of an app, this tool can also be a productivity gainer.</p>
<h3 style="font-weight: bold; font-size: 1.25em; max-width: 100%;">Low Vision</h3>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">Freeze Frame!</h4>
<p style="max-width: 100%;">For those who are sensitive to rapid animations, you can now control the frame rate minimum and maximum of content. Check it out under Settings > Accessibility > Motion > frame Rate. it's also possible to turn these images off in both Safari and Messages. To do so, head over to Settings > Accessibility > Motion > Auto-Play Animated Images and turn them off altogether if that is preferred.</p>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">Grayscale Gets Mor Variable</h4>
<p style="max-width: 100%;">If you are a user that has some type of color filtering enabled, there is a new setting which allows the user to control the intensity of the Greyscale on any of the color filters which already exist. Previously when Grayscale was applied it was intensified to a maximum level. Now you are able to control the amount of Grayscale applied to your screen. Head on over to Color Filters under Display & Text Size in Accessibility. Enable Color Filters, tap on Grayscale to apply and now scroll towards the end of all the filters. You will notice a heading labeled, “Intensity.” Use this to apply less or more intensity. By default, it is set to 100%. </p>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">Point and Speak</h4>
<p style="max-width: 100%;">Available on the Pro and Pro Max models of iPhone 12 and later, Point And Speak allows the user to identify text by physically pointing at it with their finger. It can be found in the Apple Magnifier app as part of Detection. Point And Speak allows you to move your finger around the view finder and if lined up properly, it will identify the text closest to your finger. It is useful for when working with things such as keypads with touchscreens or appliances. You can also point to a specific section of text and have that spoken as well. Like the other function of Detection Mode <a href="https://www.applevis.com/blog/whats-new-ios-16-accessibility-blind-deafblind-users" style="color: rgb(65, 110, 210); max-width: 100%; text-decoration: underline;">Discussed in the iOS 16 article,</a> this feature requires more hands than a deaf-blind person typically has. As a speech user, I find that Point And Speak is useable, but I had trouble, for example, figuring out how to use it with a microwave or air frier. While one hand holds the phone about a foot from the back facing camera, the other hand must point at the text. This takes some coordination and guesswork without any visual reference point, but can be done. For braille users, even if you have a 3rd hand, my testing found that the output sent by VoiceOver is not helpful. Instead of getting a flash message on the display of what is verbalized, the braille user reads "[speak]{prosody}." Though this technology on a phone may be extremely challenging for braille users, it is highly likely that it would be of much higher value in a head-worn device such as the <a href="https://www.apple.com/apple-vision-pro/" style="color: rgb(65, 110, 210); max-width: 100%; text-decoration: underline;">Apple Vision Pro</a> as one could point with a hand and use the other to read their braille display.</p>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">All of the Text Detected</h4>
<p style="max-width: 100%;">The other new functionality the Magnifier gains in iOS 17 is Text Detection Mode. Unlike Point and Speak, this will work similarly to the live text option in the Photos app. It will read any text that is detected within the viewfinder. This can help in certain circumstances, where Point and Speak may not be helpful such as when trying to read a document or sign.</p>
<h3 style="font-weight: bold; font-size: 1.25em; max-width: 100%;">Hearing</h3>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">A New Form of Control</h4>
<p style="max-width: 100%;">For those cochlear implant or hearing aid wearers that use an <a href="https://support.apple.com/en-us/HT210386" style="color: rgb(65, 110, 210); max-width: 100%; text-decoration: underline;">mFI compatible</a> device, there is a new way to more quickly take control of what you will find in the Hearing Control Center. To check out the new options for yourself and set it up, move to Settings > Accessibility > Hearing Control Center. The options themselves aren't exactly new, but what is new is the ability to control which of the available options you prefer to be present when selecting the "hearing devices" option in your Control Center. This can speed up productivity, since you can remove the options you don't want. For example, I would prefer not to have the Background Sounds in my Control Center options since I do not often utilize that feature. Now, those who want it can have it, while I can remove it.</p>
<h3 style="font-weight: bold; font-size: 1.25em; max-width: 100%;">Speech Features</h3>
<p style="max-width: 100%;">Speech has now earned its own category which has 2 new features. Live Speech, available on all devices supporting iOS 17, and Personal Voice, which is only supported on the iPhone SE 3 as well as the iPhone 13 and later models. Each will be discussed in turn.</p>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">Live Speech</h4>
<p style="max-width: 100%;">Live Speech allows the user to utilize a chosen voice which may support them in communicating over video calls or in-person. After enabling Live Speech under Settings > Accessibility > Live Speech, there will be options to set up favorite phrases and to also choose the voice you would prefer to use. All of the voices available in VoiceOver are options. After setting it up, you can then use the Accessibility shortcut to launch Live Speech. The icon will be located below the status bar. For braille users, you will need to find it by pressing space with dots 4-5-6 twice. After activating Live Speech, there is a part of the screen which allows you to toggle between your favorite phrases and a keyboard. Moreover, if you use multiple keyboard languages, you can change the voice output to another language by toggling between the keyboards you have enabled. For example, if English US is your default and you added Spanish US as a secondary keyboard, switching to the Spanish keyboard will make the output speech switch to the voice you have set for that language. The voices do not seem to have the same inflection that is found when using VoiceOver or the voices created for Siri, but it's a potential tool that could come in handy.</p>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">Personal Voice</h4>
<p style="max-width: 100%;">iOS 17 also brings a new feature for users of the iPhone SE 3 and later called Personal Voice. This feature can be used in conjunction with Live Speech to allow an individual who has preserved their voice to continue using a cloned version of it in phone calls. However, for conversational purposes, the output of the speech is a bit flat. To Set up a Personal voice, with one of the phones specified above, go to Settings > Accessibility > Personal Voice, and follow the prompts. You will be given several hundred phrases to speak into your iPhones microphone. If iOS detects that your audio level is too low or too high, it will also inform you of this. Though it has been written that this process takes roughly 15 minutes to complete, it took me around 30 minutes. After the phrases are recorded, you will then need to leave your screen locked and preferably, connected to electricity. One can still use their phone while the voice is created, but it does take several hours to complete. After setting your newly created Personal Voice as the output under Live Speech, it is then possible to begin typing and pressing return or enter to have the text spoken. Though the clone of my voice does seem to <a href="https://www.dropbox.com/scl/fi/lb77gfyk6tiwc0m9r8lzv/Scott_Davert.wav?rlkey=2fm0lbtsq1nz7qm89cg3jomu3&dl=1" style="color: rgb(65, 110, 210); max-width: 100%; text-decoration: underline;">sound</a> like me, it is my hope that it can become more animated with future updates. I have typed the following into the text field exactly as follows. "Hello, this is Scott Davert. Or is it really a fake version of me? I HAVE NO IDEA!!! Do you???" Though it is not entirely flat, I would have expected a bit more inflection from a sentence written in all caps and with 3 exclamation points. That said, this version of voice cloning is free, while the more advanced models of cloned voices require a <a href="https://elevenlabs.io/pricing" style="color: rgb(65, 110, 210); max-width: 100%; text-decoration: underline;">monthly subscription</a> and limit you to the number of characters in text each month. Personal Voice is free, and there doesn't appear to be a limit to the amount of usage.</p>
<h3 style="font-weight: bold; font-size: 1.25em; max-width: 100%;">Other New features</h3>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">Assistive Access</h4>
<p style="max-width: 100%;">Assistive Access allows for the customization of certain apps. What can be configured will depend largely on the app itself. This function allows the user to configure which apps will be included with Assistive Touch and also provides the ability to add only certain apps to a Home Screen. Assistive Access has been set up with a form of a start up wizard which will guide you through the process. After setting up how icons should appear, the user is presented with a list of apps that they can choose to include on the Home Screen. The configurability of each app is highly contextual. For example, with Messages, one can limit the user of the iPhone it is being set up on to only contacting and showing messages from certain contacts. There are also options for hearing messages you tap being spoken, whether the user guess details like the status of a message and when it was sent, and the ability to limit input to the keyboard, video selfie or to use emojis. The options for calls are somewhat similar to the Messages app, such as the ability to control both received and incoming calls from specific contacts, though it also offers the ability to hide the keypad. In calls, you can also show or hide the keypad, and control whether the speaker will take the call or not. It is also possible to set up a passcode so that one can be assured they will not exit the Assistive Access mode unintentionally.</p>
<p style="max-width: 100%;">Once the user has configured Assistive Access to their preference, they can save the changes and then use the Accessibility Shortcut to enable and disable it. It is my hope that this type of customization could be continued with other types of features considered. For example, I work with many individuals who are deaf-blind who may only wish to learn how to text and use Mail. It would be helpful in some situations to be able to set up the Messages app to only show the Back Button, keyboard, history and send as an option. This would cut down on the unnecessary clutter a slow braille reader must navigate through to get to what they wish to do. It's also worth noting that some testers are reporting that Assistive Access makes the device run more slowly. Swiping or tapping takes a few seconds to respond, Even more so when VoiceOver is enabled.</p>
<h4 style="font-weight: bold; font-size: 1em; margin: 1em 0px; max-width: 100%;">Voice Control Gets Guides!</h4>
<p style="max-width: 100%;">For those who use Voice Control, there have been some frustrations since some users weren't aware of all of the things this feature can do. With iOS 17, there are now all kinds of guides which can help the user learn how to most effectively make use of Voice Control. These are done from the perspective of an individual who does not use VoiceOver, so if you are a VoiceOver user, you will need to take this into account.</p>
<p style="max-width: 100%;">There has also been a new feature added which allows the user to differentiate between words which sound the same; for example, there, they're and their.</p>
<h3 style="font-weight: bold; font-size: 1.25em; max-width: 100%;">Conclusion</h3>
<p style="max-width: 100%;">Like always, Apple continues innovating for all users to promote a more inclusive society. For inclusion to happen, communication needs to be accessible for all; and Apple has once again taken steps toward that goal through the new features and enhancements in iOS 17. Whether you should upgrade or not depends on your specific use case. I would recommend checking out <a href="https://www.applevis.com/blog/ios-17-ipados-17-accessibility-report-voiceover-braille-low-vision-issues-improvements" style="color: rgb(65, 110, 210); max-width: 100%; text-decoration: underline;">AppleVis' list of new and fixed bugs</a> prior to doing so.</p>
<p style="max-width: 100%;">iOS 17 is a free download available for all supported devices. if needed, More information on how to update the software on your device is available on <a href="https://support.apple.com/en-us/HT204204" style="color: rgb(65, 110, 210); max-width: 100%; text-decoration: underline;">this Apple Support page.</a></p></div></div></div><br id="lineBreakAtBeginningOfSignature"><div dir="ltr">Dontee L. Wrenn</div></body></html>