[Nfbktad] Article: Switching back to Windows

Gatton, Tonia (OFB-LV) Tonia.Gatton at ky.gov
Mon Feb 24 15:08:57 UTC 2014


Interesting article and some food for thought.

Tonia



Switching back to Windows
Posted on February 7, 2014 by Marco

Yes, you read correctly! After five years on a Mac as my private machine, I am switching back to a Windows machine in a week or so, depending on when Lenovo's shipment arrives.

You are probably asking yourself why I am switching back. In this post, I'll try to give some answers to that question, explain my very personal views on the matters that prompted this switch, and give you a bit of insight into how I work and why OS X and VoiceOver no longer really fit that bill for me.

A bit of history

When I started playing with a Mac in 2008, I immediately realised the potential of the approach Apple was taking. Bundling a screen reader with the operating system had been done before, on the GNOME desktop, for example, but Apple's advantage is that they control the hardware and software back to front and always know what's inside their boxes. So a blind user is always guaranteed to get a talking Mac when they buy one.

On Windows and Linux, the problem is that the hardware is unknown to the operating system. On pre-installed systems, this is usually taken care of, but on custom-built machines with standard OEM versions of Windows or your Linux distro downloaded from the web, things are different. There may be this shiny new sound card that just came out, which your dealer put in the box, but which neither operating system knows about, because there are no drivers. And gone is the dream of a talking installation! So even though Windows 8 now allows Narrator to be turned on very early in the installation process, in multiple languages even, and Orca can be activated early in a GNOME installation, all of this is of no use if the sound card cannot be detected and the speech synthesizer cannot output its data through connected speakers.

And VoiceOver already had quite a few features when I tried it in OS X 10.5 Leopard: it had web support, e-mail worked, braille displays did, too, and the Calendar was one of the most accessible on any desktop computer I had ever seen, including Outlook's calendar with the various screen readers on Windows, one of which I had even worked on myself in earlier years. Some third-party apps worked as well. In fact, my very first Twitter client ran on the Mac, and it was a mainstream app.

There was a bit of a learning curve, though. VoiceOver's model of interacting with things is at times quite different from what one might be used to on Windows. Especially interacting with container items such as tables, text areas, web pages, and other high-level elements can be confusing at first. If you are not familiar with VoiceOver, interacting means zooming into an element: a table suddenly gets rows and columns, a table row gets multiple cells, and each cell gives details of the contained text as you interact with each of these items consecutively.

In 2009, Apple advanced things even further when they published Snow Leopard (OS X 10.6). VoiceOver now had support for the trackpads of modern MacBooks, and when the Magic Trackpad came out later, it also just worked. The Item Chooser, VoiceOver's equivalent of a list of links or headings, included more item types to list by, and there was now support for so-called web spots, both user-defined and automatic. A feature VoiceOver calls Commanders allowed the assignment of commands to various types of keystrokes, gestures, and more. If you remember: Snow Leopard cost 29 US dollars and, aside from a ton of new features in VoiceOver, it obviously brought all the great new features that Snow Leopard had in store for everyone. A common saying was that other screen readers would have needed three versions for this many features and would have charged several hundred dollars in update fees. And it was a correct assessment!

In 2011, OS X 10.7 Lion came out, bringing a ton of features for international users. Voices well-known from iOS were also made available in desktop formats for over 40 languages, international braille tables were added, and it was no longer required to purchase international voices separately from vendors such as AssistiveWare. This meant that users in more countries could just try out VoiceOver on any Mac in an Apple retail store or at a reseller's place. There were more features, such as support for WAI-ARIA landmarks on the web; activities, which are application- or situation-specific sets of VoiceOver settings; and better support for the Calendar, which got a redesign in this update.

First signs of trouble

But this was also the time when the first signs of problems came up. Some things just felt unfinished. For example: the international braille support included grade 2 for several languages, including my mother tongue, German. German grade 2 has a peculiarity in that, by default, nothing is capitalized. German capitalizes many more words than English does, for example, and it was agreed a long time ago that only special abbreviations and expressions should be capitalized; only in learning material should general orthographic capitalization rules be used. In other screen readers, capitalization can be turned on or off for German and other languages' grade 2 (or even grade 1). Not so in VoiceOver on both OS X and iOS: one is forced to use capitalization. This makes reading quite awkward. And yes, makes, because this remains an issue in both products to this day. I even entered a bug into Apple's bug tracker for this, but it was shelved at some point without my being notified.

Some other problems with braille also started to surface. For some inexplicable reason, when editing documents I often have to press routing buttons twice before the cursor appears at the spot I want it to. And while you can edit braille verbosity, defining which pieces of information are shown for a given control type, you cannot edit what gets displayed as the control type text. A "closed disclosure triangle" always gets shown as such, same as an opened one. On a 14-cell display, this takes two full-length displays; on a 40-cell one, it wastes most of the real estate and barely leaves room for other things.

Other problems also gave a feeling of unfinished business. The WAI-ARIA landmark announcement, working so well on iOS, was very cumbersome to listen to
on OS X. The Vocalizer voices used for international versions had a chipmunk effect that was never corrected and, while funny at first, turned out to be
very annoying in day-to-day use.

OK, I thought, enthusiastic Mac fanboy that I was: let's report these issues and wait for the updates to trickle in. None of the 10.7 updates really fixed the issues I was having.

Then a year later, Mountain Lion, AKA OS X 10.8, came out, bringing a few more features, but compared to the versions before, far fewer. Granted, it was only a year between these two releases, whereas the two cycles before had been two years each, but the features that did come in weren't too exciting. There was a bit of polish here and there with drag and drop, one could now sort the columns of a table and press and hold buttons, and a few little things more. Safari learned a lot of new HTML5 and more WAI-ARIA and was less busy, but that was about it. Oh yes, and one could now access all items in the upper right corner of the screen. But again, not many of the previously reported problems were solved, except for the chipmunk effect.

There were also signs of real problems. I have a Handy Tech Evolution as my desktop braille display, and it had serious problems from one Mountain Lion update to the next, making it unusable with the software and basically turning it into a useless waste of space. It took two or three updates, distributed over four or five months, before that was solved.

And so it went on

And 10.9, AKA Mavericks, again brought only a bit of polish, but it also introduced some serious new bugs. My Handy Tech BrailleStar 40, a laptop braille display, is no longer working at all. It simply isn't recognized when plugged into the USB port. Handy Tech are aware of the problem, so I read, but since Apple is in control of the Mac braille display drivers, who knows when a fix will come, if it comes in a 10.9 update at all. And again, old bugs have not been fixed. And new ones have been introduced, too.

Mail, for example, is quite cumbersome in conversation view now. While 10.7 and 10.8 were at least consistent in displaying multiple items in a table-like structure, 10.9 simply puts the whole e-mail in as an embedded character you have to interact with to get at the details. It also never keeps its place, always jumping to the left-most item, the newest message in the thread.

The Calendar has taken quite a turn for the worse, being much more cumbersome to use than in previous versions. The Calendar UI seems to be subject to constant change anyway, according to comments from sighted people, and although it is technically accessible, it is no longer really usable, because there are so many layers and sometimes unpredictable focus jumps and interaction oddities.

However, having said that, an accessible calendar is one thing I am truly going to miss when I switch back to Windows. I know various screen readers take
stabs at making the Outlook calendar accessible, and it gets broken very frequently, too. At least the one on OS X is accessible. I will primarily be doing
calendaring from my iOS devices in the future. There, I have full control over things in a hassle-free manner.

iBooks, a new addition to the product, is a total accessibility disaster, with almost all buttons unlabeled and an interface that is slow as anything. Even the update issued shortly after the initial Mavericks release didn't solve any of those problems, and neither did the 10.9.1 update that came out a few days before Christmas 2013.

From what I hear, Activities seem to be pretty broken in this release, too. I don't use them myself, but I heard that a friend's activities all stopped working, triggers didn't fire, and even setting them up fresh didn't help.

Here comes the meat

And here is the first of my reasons why I am switching back to Windows: all of the above simply added up to a point where I lost confidence in Apple still being as dedicated to VoiceOver on the Mac as they were a few years ago. Old bugs aren't being fixed, new ones are introduced, and despite beta testers, of which I was one, reporting them, they were often not addressed (like the Mail and Calendar problems, or iBooks). Oh yes, Pages, after four years, finally became more accessible recently, and Keynote can now run presentations with VoiceOver, but these points still don't negate the fact that VoiceOver itself is no longer receiving the attention it would need as an integrated part of the operating system.

The next point is one that has already been debated quite passionately on various forums and blogs in the past: VoiceOver is much less efficient when browsing the web than screen readers on Windows are. Going from element to element is not really snappy, jumping to headings or form fields often has a delay, depending on the size and complexity of a page, and the way Apple chose to design their modes requires too much thinking on the user's part. There is single-letter quick navigation, but you have to turn on quick navigation with the cursor keys first, and enable single-letter quick navigation separately in the VoiceOver Utility. When cursor key quick navigation is on, you navigate only via the left and right arrow keys, sequentially, not top to bottom as web content, which is still document-based for the most part, would suggest. The last used quick navigation key also influences the Item Chooser menu. So if I last moved to a form field via quick navigation, but then want to choose a link from the Item Chooser, the Item Chooser opens to the form fields first. I have to left-arrow to get to the links. Same with headings. For me, that is a real slow-down.

Also, VoiceOver is not good at keeping its place within a web page. As with all elements, once interaction stops and then starts again, VoiceOver starts interaction at the very first element. Conversations in Adium or Skype, and even in the Messages app supplied by Apple, all suffer from this. One cannot jump into and out of the HTML area without losing one's place. Virtual cursors in various screen readers on Windows are very good at remembering the spot they were at when focus left the area. And even Apple's VoiceOver keystroke to jump to related elements, which is supposed to jump between the input and HTML areas in such conversation windows, is subject to constant breakage, re-fixing, and other unpredictability. It does not even work right in Apple's own Messages app in most cases.

Overall, there are lots of other little things which add up to make me feel I am much less productive when browsing the web on a Mac than I am on Windows.

Next is VoiceOver's paradigm of having to interact with many elements. One place where this also comes into play is text. If I want to read something in detail, be it on the web, a file name, or basically anything, I have to interact with the element, or elements, before I get to the text level, read word by word or character by character, and then stop interaction as many times as I started it to get back to where I was before I wanted to read in detail. Oh yes, there are commands to read and spell by character, word, and sentence, but because VoiceOver uses the Control+Option keys as its modifiers, and the letters for those actions are all located on the left-hand side of the keyboard, I have to take my right hand off its usual position to press these keys while the left hand holds the Control and Option keys. Neither MacBooks nor the Apple Wireless Keyboard have Control and Option keys on both sides, and my hand cannot be bent in a fashion that lets me grab all these keys with one hand. Turning the VoiceOver key lock mechanism on and off for this would make the situation even more cumbersome.

And this paradigm of interaction also applies to exploring screen content via the trackpad. You have to interact, or stop interacting, with items constantly to get a feel for the whole screen. And even then, I often feel I never get a complete picture, unlike on iOS, where I always have a full view of a screen. Granted, a desktop screen displays far more information than could possibly fit on a trackpad without being useless millimeter-sized touch targets, but still, the hassle of interaction led to me not using the trackpad at all except for a few very specific use cases. We're talking about a handful of instances per year.

The next problem I see quite often is the output braille gives me. In some cases, it is just a dump of what speech is telling me. In other cases, it is a semi-spatial representation of the screen content. In yet other instances, it may be a label with some chopped-off text to the right or to the left, with the cursor not always in predictable positions. I already mentioned the useless grade 2 in German, and the fact that I often have to press a routing button at least twice before the cursor gets where I want it to go. The braille implementation in VoiceOver gives a very inconsistent impression and feels unfinished, or done by someone who is not a braille reader and doesn't really know a braille reader's needs.

Next problem: word processing. Oh yes, Pages can do tables in documents now, and other stuff also became more accessible, but again, because of the paradigms VoiceOver uses, getting actual work done is far more cumbersome than on Windows. One has, for example, to remember to decouple the VoiceOver cursor from the actual system focus and leave the latter inside the document area when one wants to execute something on a toolbar. Otherwise, focus shifts, too, and a selection one may have made gets removed, rendering the whole endeavor pointless. Oh yes, and one has to turn the coupling back on later, or habit will lead to unpredictable results because the system focus didn't move where one would have expected it to. And again, there is VoiceOver's horizontally centered navigation paradigm: pages of a document in either Pages or Nisus Writer Pro appear side by side, when visually they probably appear below one another. Each page is its own container element. All of this leaves me with the impression that I don't have as much control over my word processing as I have in MS Word or even the current snapshot builds of OpenOffice or LibreOffice on Windows. There, I also get much more information without having to look for it explicitly, for example the number of the current page. NVDA, and probably others, too, have multilingual document support in Word: I immediately hear which spell-checking language is being used in a particular paragraph or even sentence.

There are some more issues which have not been addressed to this day. There is no PDF reader I know of on OS X that can deal with tagged (accessible) PDFs. Even when tags are present, Preview doesn't do anything with them, giving the more or less accurate text extraction that one gets from untagged PDFs. As a result, there is no heading navigation, no table semantics output, and none of the other things that accessible PDFs support.

And the fact that there is no accessible Flash plug-in for web browsers on OS X has also caused me to switch to a Windows VM quite often just to be able to view videos embedded in blogs or articles. Oh yeah, HTML5 video is slowly coming to more sites, but the reality is that Flash is probably still going to be around for a couple of years. This is not Apple's fault; the blame here lies solely with Adobe for not providing an accessible Flash plug-in. But it is one more thing that adds to my not being as productive on a Mac as I want to be on a desktop computer.

Conclusion

In summary: by all of the above, I do not mean to say that Apple did a bad job with VoiceOver to begin with. On the contrary: especially with iOS, they have done an incredibly good job for accessibility in the past few years. And the fact that you can nowadays buy a Mac and install and configure it fully in your language is highly commendable, too! I will definitely miss the ability to configure my system alone, without sighted assistance, should I need to reinstall Windows; as I said above, that is still not fully possible without assistance. It is just that the things I found over the years added up and made me realize that some of the design decisions Apple has made for OS X, the bugs that were not addressed or the things that got broken and were not fixed, and the fact that apps are either accessible or they aren't, with hardly any in-between, are not compatible with my way of working with a desktop computer or laptop in the longer term. For iOS, I have a feeling Apple are still full steam ahead with accessibility, introducing great new features with each release, and hopefully also fixing the braille problems discussed by Jonathan Mosen in this great blog post. For OS X, I am no longer as convinced their heart is in it. In fact, I have a feeling OS X itself may become a second-class citizen behind iOS soon, but that, again, is only my personal opinion.

So there you have it. This is why I am going to be using a Lenovo notebook with NVDA as my primary screen reader for my private use from now on. I will still be using a Mac for work, of course, but for my personal use, the Mac is being replaced. I want to be fast, effective, and productive, and be sure my assistive technology doesn't suddenly stop working with my braille display or fall victim to business decisions placing less emphasis on it. Screen readers on Windows are made by independent companies or organizations with their own business models. And there is choice: if one no longer fits particular needs, another will most likely do the trick. I do not have that on OS X.