[NFBCS] FW: [tech-vi Announce List] Apple’s Splintered Accessibility. Why I use Android and Windows | by Devin Prater | May, 2022 | Medium

Curtis Chong chong.curtis at gmail.com
Wed May 25 15:25:00 UTC 2022


Greetings:

I thought you folks might find the article below of some interest.

I am by no means an expert on using the Apple Macintosh. I can muddle my way through it if I absolutely have to.

Cordially,

Curtis Chong

From: tech-vi at groups.io <tech-vi at groups.io> On Behalf Of David Goldfield
Sent: Wednesday, May 25, 2022 3:56 AM
To: List <tech-vi at groups.io>
Subject: [tech-vi Announce List] Apple’s Splintered Accessibility. Why I use Android and Windows | by Devin Prater | May, 2022 | Medium

https://medium.com/@r.d.t.prater/apples-splintered-accessibility-f0d95b100c42


Apple’s Splintered Accessibility


Why I use Android and Windows


Devin Prater <https://medium.com/@r.d.t.prater>

Apple, for now, is the leader in mobile accessibility. iOS is a really nice system to use, if you like walled gardens and the occasional persistent bug or two. I’d say about 90% of all blind people who have a phone own an iPhone. However, Apple doesn’t just want you on their phone. They have an entire ecosystem ready to go, to make having an iPhone, Apple Watch, and AirPods even more worth it. Since the iPhone was so nice, people started investing in Mac computers after the release of Tiger and Leopard. Things really picked up around 2010 or so.

But now, those 90% of blind people don’t own a Mac; they own a Windows computer. In this post, I’ll explore some of the reasons why Apple’s ecosystem just isn’t sticking so easily for blind people.

When I use the letters VO in a keyboard command, that means hold down either Caps Lock or Control + Option.


VoiceOver, a Fall into Disrepair


The Mac was the first Apple technology to be graced with VoiceOver, the screen reader that gives blind people access to Apple devices. VoiceOver started a few really nice trends in screen readers, like sounds to show where objects are on the screen, great-quality voices, and reading based on the objects from the accessibility APIs rather than mostly using system navigation. We’ll talk about the pitfalls of that in a while.

After a while, the iPhone came along, and VoiceOver, which already had support for the Mac trackpad, gained touchscreen support on the iPhone. To this day, VoiceOver is the only screen reader that supports using the trackpad for navigation. Those features are cool, though, but they aren’t enough to keep people on a Mac.


Object-based Accessibility


In VoiceOver, everything is an object. A paragraph on a website isn’t read line by line, but all at once, as a “chunk” of text. Apple has tried to work around this by letting users read line by line through a web page, and select text like they’re in a document, but this has some downsides and is a patch onto an already-established system. Within apps, this usually works fine. On the web, however, there are some nasty complications.

Let’s say you’re reading an article, in Apple’s reading view. You press VO + a to begin reading, but somewhere down the article, VoiceOver gets stuck. You may have to press Control a few times to get VoiceOver unstuck, or turn it off and back on.

This is the problem with object-based screen readers: if something changes and they aren’t aware of it, or they don’t ask about it, something is going to break. Screen readers with virtual buffers, which store the contents of web pages, have something to fall back on if the web document breaks accessibility. VoiceOver, and screen readers like it, trust the browser and web developer, so they sit too close to the document to be able to ignore those kinds of problems.
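
To make that concrete, here is a minimal sketch in Swift of how an object-based reader pulls content: it asks the macOS accessibility API for the focused element and reads its value back as one chunk. The API calls are real, but the illustration is my own, not VoiceOver’s actual code, and it assumes a process that has been granted the accessibility permission.

    import ApplicationServices

    // Object-based reading in miniature: ask the system-wide accessibility
    // API for the focused element, then read that single object's value.
    // Requires the accessibility permission (see AXIsProcessTrusted()).
    let systemWide = AXUIElementCreateSystemWide()

    var focusedRef: CFTypeRef?
    let err = AXUIElementCopyAttributeValue(
        systemWide, kAXFocusedUIElementAttribute as CFString, &focusedRef)

    if err == .success {
        let element = focusedRef as! AXUIElement
        var roleRef: CFTypeRef?
        var valueRef: CFTypeRef?
        AXUIElementCopyAttributeValue(element, kAXRoleAttribute as CFString, &roleRef)
        AXUIElementCopyAttributeValue(element, kAXValueAttribute as CFString, &valueRef)
        // A whole paragraph comes back as one AXValue "chunk". There is no
        // private snapshot to fall back on if the page changes underneath.
        print("role: \(roleRef as? String ?? "?"), value: \(valueRef as? String ?? "")")
    }

A buffer-based screen reader, by contrast, copies the document into its own snapshot up front, which is why it can keep reading even when the live page misbehaves.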


The Interaction method


Since VoiceOver reads one object at a time, it can be really tedious to move through a table one cell at a time, especially if you don’t care at all about that table. So, VoiceOver has a method to skip over container objects. As you navigate through objects, you’re on, let’s say, level 0 of the interface, the top level. Most apps have a toolbar near the top of the screen. If you want to see what’s in it, you must go to level 1, inside that toolbar. There, you can only access the objects inside that toolbar. When done exploring the toolbar, you can come out of it, back to level 0, and continue exploring the main interface.

This is a nice concept, but it has its downsides. First, a lot of things are containers. A book in the Books app is a container, which I suspect has something to do with how awful it is to read books on the Mac; I’ll get into that later, though. Second, there is no limit to how deep containers can go. Xcode is terrible about this, making the user drill through a good four levels of interaction before even reaching the code text field.
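
Here is a rough sketch of what those levels look like, again using the public macOS accessibility API rather than anything from VoiceOver’s internals: the interface is a tree, and every container is one more level to drill into.

    import ApplicationServices

    // Hypothetical sketch: print the accessibility tree under an element,
    // one line per object, indented by its "interaction level".
    func walk(_ element: AXUIElement, level: Int) {
        var roleRef: CFTypeRef?
        AXUIElementCopyAttributeValue(element, kAXRoleAttribute as CFString, &roleRef)
        let indent = String(repeating: "  ", count: level)
        print("\(indent)level \(level): \(roleRef as? String ?? "unknown")")

        var childrenRef: CFTypeRef?
        AXUIElementCopyAttributeValue(element, kAXChildrenAttribute as CFString, &childrenRef)
        // Each container (a toolbar, a table, a book in Books) is one more
        // level the user must interact with; nothing caps how deep it goes.
        for child in (childrenRef as? [AXUIElement]) ?? [] {
            walk(child, level: level + 1)
        }
    }

Pointed at an Xcode window, a walk like this would show why it takes a good four levels of interaction before the code text field ever appears.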


Using the Terminal


I still don’t understand how this is still a thing. It shows either that Apple’s developers don’t use VoiceOver, or that they never have to use the Terminal. Using the Terminal with VoiceOver is really embarrassing. I’ll list some of the bugs below.

*	Output is interrupted, so when a lot of text is printed, users only get a sense of text scrolling by.
*	There is no cursor tracking, so using a program like nano, Vim, or Mutt is practically impossible.
*	If the screen refreshes, the whole screen is read over again. And again. And again. And again.

If a lot of blind people hadn’t bought into the Mac in the early days, after they saw how good the iPhone was, the story would end there. MacOS would be just as poorly supported by the community as Linux. Fortunately, someone created a terminal screen reader that people can use in place of VoiceOver. That will be discussed in the “Community support” section.


MacOS Keyboard Commands


When I use a Mac with VoiceOver, I’m reminded of just how many keyboard commands VoiceOver has to include to make the Mac accessible. From entering system dialogs to moving items around on the screen, VoiceOver has to do a lot. This, of course, means that if any of that changes within MacOS, VoiceOver has to be adapted to that change. And if, thanks to Apple’s natural secrecy, the Mac accessibility team doesn’t know about a change, that feature will break. This has happened several times over the years to the drag-and-drop feature.

Another issue was created by the Touch Bar. VoiceOver not only had to be taught how to handle it, but also had to have all the commands that rely on the function keys moved. The awful workaround they came up with was to move those to holding down the Fn key and using the number row, now at the very top of the keyboard. This means that, on a MacBook, a user has to hold Option, Control, and Fn with the left hand, while the right hand finds the number 2, for example. This is an ergonomic nightmare.

Then, when I’m done with the Mac, and ready to lock it when I’m about to leave my desk at work, or just step away for a moment, the keyboard command is Command + Control + q. This sounds simple enough, but if you look down at a MacBook keyboard, you realize that there isn’t a control key on the right side of the keyboard. So, you again have to scrunch up your hands and press Command and Control with your right hand, while your left hand presses q. Again, this is an ergonomic nightmare. Not even Linux has this issue.


Everything New is Hidden


When I was a teenager, I used to enjoy finding new things in iOS, things not talked about by most people or news outlets. When iOS 7 or 8 came out, I’d always go looking through the VoiceOver settings to see if I could find anything new, from voices to features. Sometimes it paid off. Usually it didn’t.

Now, though, I’m 28. I have a job, a place to live, and other interests besides diving into operating systems (although of course that pops up often enough). I don’t get that satisfaction of finding “hidden” features and settings in VoiceOver anymore. Instead, I sigh in frustration at the lack of documentation. When I went to a “tech talk” phone conference for blind people once, and attendees were given a chance to learn about something, they chose the iPhone. And I don’t blame them; Apple makes it almost necessary to have someone train a blind person on using it. In the back of people’s minds, they’re always wondering: “What feature do I not know about? How can I select and copy text? How can I easily answer a phone call?” Yes, I’ve had to teach that before. And it’s sad, not because of the user who knows nothing about their phone, but because Apple will not take the time to teach them through onboard help or tutorials.

The same can be said about the Mac. New features, like getting a description of an image, are not in the VoiceOver tutorial on the Mac. They are in the manual, which is great, but users don’t want to have to read a manual to figure out what will help them and what’s just more stuff they won’t use. Another issue is that until very recently, the What’s New section of the VoiceOver user guide included updates from High Sierra. This was updated in MacOS Monterey <https://support.apple.com/guide/voiceover/whats-new-in-voiceover-vo15627/mac>. I mean, I just learned one can mark up documents with VoiceOver. I wonder what else I’m missing, or have missed, in the years of poor documentation.


Busy or Not Responding


In the blind Mac community, the most dreaded word a user could hear was “busy.” This is what VoiceOver says when a program, usually Safari, becomes unresponsive, sometimes even because of VoiceOver itself. VoiceOver, admirably, doesn’t just stay silent like a lot of Windows screen readers, but announces “busy,” which is admittedly just as frustrating.

Recently, Apple has changed the terse “busy” into “not responding”. This is more explanatory, but still doesn’t help, especially if VoiceOver is the reason why an app goes unresponsive. This doesn’t happen as much on M1-based Macs, but not everyone has one of those.
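
For the curious, here is a small Swift sketch of one plausible mechanism; this is my assumption about how such an announcement could be produced, not confirmed Apple behavior. Accessibility requests to a hung process eventually time out with an error, and a screen reader can map that error to speech instead of going silent.

    import ApplicationServices

    // Sketch (my guess at the mechanism): probe an app's accessibility
    // element, obtained via AXUIElementCreateApplication(pid), and turn a
    // timeout into an announcement. AXUIElementSetMessagingTimeout caps how
    // long we wait; a hung app typically yields .cannotComplete.
    func announcement(for app: AXUIElement) -> String {
        _ = AXUIElementSetMessagingTimeout(app, 1.0) // wait at most a second
        var focusedRef: CFTypeRef?
        let err = AXUIElementCopyAttributeValue(
            app, kAXFocusedUIElementAttribute as CFString, &focusedRef)
        switch err {
        case .success:
            return "focus retrieved; keep reading"
        case .cannotComplete:
            return "not responding" // the dreaded announcement
        default:
            return "error \(err.rawValue)"
        }
    }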


Writing


Writing is the one thing I love to do on a Mac. Apple has perfected the art of showing things like spelling errors and auto-corrections in an accessible, but not too noisy, way. I usually use the Ulysses writing app <https://ulysses.app/>, since Pages is really complex to use, and I like writing in plain text anyway. VoiceOver can’t change its speech parameters to show formatting like Emacspeak or Narrator can, but it at least can play a little beep to show a style change.


Reading


Reading, on the other hand, is pretty bad. When you open a book in the Books app, it’s hard to know how to have it continuously read the book, flipping pages. On the iPhone, you just find the book content and swipe down with two fingers, and the book is read, flipping pages automatically. On the Mac, you have to interact with several layers of the interface, and then when you perform the “say all” command, VO + a, it rarely flips the page and continues reading.

Even if you do get the page to turn, it sometimes doesn’t read the next page. So you have to navigate away from the book content, and back to it. I’m used to reading a page of content at a time from Emacs with Emacspeak. But having to navigate around to read the next page is just more than I’m willing to deal with. I guess blind people are only supposed to read on their iPhones. That’s unfortunate.


Faster isn’t Better


When Apple released their M1 chips, a lot of blind people wondered how it would affect VoiceOver, and thus their use of the Mac. It turns out that it barely did. VoiceOver on an M1 Mac is still the same VoiceOver. There are fewer instances of “busy,” but it’s still the same. If VoiceOver gets stuck in Safari somewhere, it’ll still be stuck, no matter how fast the chip is. If VoiceOver can’t read a book, it still will not read that book, no matter how fast the chip is. If VoiceOver can barely work with the Terminal, it will barely work with the Terminal, no matter how fast the chip is.

The inclusion of iPad apps was a nice change to MacOS, which has had a rather stagnant app ecosystem. But if you use VoiceOver, iPad apps are sluggish to use unless you turn off sounds. Again, it does not matter how good the hardware is unless the software is just as good.


Community support


After the uptick in blind people owning Mac computers, a community of users arose to discuss it. These weren’t the type to dig into the terminal; they usually just used Safari, Mail, Pages, and the usual Apple apps. As time went on, developers tried using the Mac, but were rebuffed by Xcode’s bad accessibility. Those developers, however, had to do something with this expensive Mac they now no longer used.

So, one developer made a screen reader called TDSR <https://github.com/tspivey/tdsr>. This stands for “two day screen reader,” since it was made primarily in two days. It fixes everything wrong with VoiceOver’s awful terminal reading skills. Other developers have built apps like TweeseCake <https://tweesecake.app/>, an app for connecting to several social networks. This really isn’t enough, though. We need book readers, Braille translation and embossing software, games, and speech synthesizer support for more voice options. The fact is, Windows is just where it’s at with the blind community, and Apple may never get back the level of hype they had in 2010.


An Alternative Ecosystem


Most blind people are thus stuck with an Apple phone on one side, and a Windows PC on the other. This means that they cannot text from their computer. They cannot receive notifications from their phone on their computer. They cannot make and answer calls. They cannot run apps they create on their phone. But there is another way.

Android may not have screen or image recognition yet. Its Braille support is, for now, a slap in the face of every Braille lover out there. Google TTS sounds robotic and low quality in TalkBack. But it at least allows one to build an ecosystem with it. One can text from messages.google.com. One can make and receive phone calls and read notifications using Link to Windows, although this part is limited to particular phones, like those from Samsung. One can even use the Android phone from the computer keyboard, with the Windows screen reader speaking what the phone is saying.

With Windows, one can run Android apps natively, using a screen reader that doesn’t have embarrassing bugs created by running on a different OS than it’s used to. It’s a very nice, solid experience, unlike Apple’s.

Are there bugs in Windows? Yes, plenty. Is Windows as nice when it comes to writing? No, hardly. It doesn’t even have a built-in dictionary, or if it does, it’s not accessible. Auto-correct doesn’t tell you what it corrects, just like ChromeOS. Word suggestions are noisy and get in the way. But at least you can read a book, with software made by the blind, for the blind. At least you can browse Reddit easily, and using the web is simple, fast, and easy, with no way for a screen reader to get stuck on something in the document.

David Goldfield

Assistive Technology Specialist

Feel free to visit my Web site

WWW.DavidGoldfield.info
