[NFB-Science] Multi-Sensory Access to James Webb Telescope Images

Tina Hansen th404 at comcast.net
Wed Apr 3 17:36:08 UTC 2024


I noticed the article in this month's Braille Monitor about image poverty.
Access to visual images has been an eternal challenge in the blindness
community.


This, and the interest in the fantastic descriptions of the images from the
James Webb Telescope, got me wondering about something. Can we find ways to
create multi-sensory approaches to these images, using both touch and audio?
I know some attempts at sonifying the images have been discussed, but I like
the idea of combining all three elements: the descriptions for context, some
kind of sonification, and a tactile model. The descriptions could be narrated
by a skilled voice talent if they stood alone, or read by whatever voice you
have on your computer. I also like the idea of being given a choice; skilled
voice talent appeals to me because many of us get enough of JAWS or VoiceOver
and need a break.
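
To make the sonification idea concrete, here is a rough sketch of one simple
approach in Python, using the numpy and Pillow libraries. It sweeps across a
grayscale version of an image from left to right, letting the bright pixels in
each column set the pitch and the column's overall brightness set the volume.
The file name jwst_image.png is just a placeholder for whatever image you try
it on, and the real sonification projects use far more sophisticated mappings
than this, so take it only as an illustration of the basic idea.

import wave

import numpy as np
from PIL import Image

SAMPLE_RATE = 44100              # audio samples per second
SECONDS_PER_COLUMN = 0.02        # how long each image column sounds
LOW_HZ, HIGH_HZ = 200.0, 2000.0  # pitch range; bottom of image is low, top is high

# Load the image and convert to grayscale values between 0 and 1.
img = np.asarray(Image.open("jwst_image.png").convert("L")) / 255.0
height, width = img.shape

samples = []
phase = 0.0
for col in range(width):
    column = img[:, col]
    if column.sum() == 0:
        freq = LOW_HZ  # a fully dark column plays the lowest tone
    else:
        # Find the brightness-weighted center of the column. Row 0 is the
        # top of the image, so flip the row indices to make "up" in the
        # picture mean "higher" in pitch.
        rows = np.arange(height)[::-1]
        center = (rows * column).sum() / column.sum()
        freq = LOW_HZ + (HIGH_HZ - LOW_HZ) * center / (height - 1)
    n = int(SAMPLE_RATE * SECONDS_PER_COLUMN)
    t = np.arange(n) / SAMPLE_RATE
    loudness = column.mean()  # brighter columns play louder
    samples.append(loudness * np.sin(phase + 2 * np.pi * freq * t))
    # Carry the phase forward so the tone doesn't click between columns.
    phase += 2 * np.pi * freq * n / SAMPLE_RATE

audio = np.concatenate(samples)
pcm = (audio * 32767).astype(np.int16)

# Write the result as a mono 16-bit WAV file.
with wave.open("sonified.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(SAMPLE_RATE)
    f.writeframes(pcm.tobytes())

Playing the result back, bright features come through as louder tones whose
pitch tracks how high they sit in the frame, which gives at least a crude
sense of where things are in the picture.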


But I also wonder if apps could help out, as we all saw at the last total
solar eclipse and will likely see again at this one.


So can something like this be done? I'm curious what is already out there and
how we might build on it. Any thoughts? Thanks.


