[Nfbf-l] How scientists are helping blind people see with their ears.
Alan Dicey
adicey at bellsouth.net
Fri Nov 7 23:13:26 UTC 2014
How scientists are helping blind people see with their ears.
Updated by Susannah Locke on November 7, 2014
@susannahlocke susannah at vox.com
Bats, dolphins, and even some whales all use sonar to determine the location
of objects around them - by sending out sound waves and listening to how
they bounce back. This allows these animals to do all sorts of amazing
things, like hunt in total darkness.
And it turns out that humans can use sonar, too (and not just in
submarines).
Some blind people are capable of using tongue clicks to "see" their
surroundings. They make a sharp sound with their tongue and listen carefully
to how the sound reflects off the objects around them.
So, more recently, researchers have been trying to push this capability
much, much further.
New technologies - lasers, cameras, and earphones - can give people even
greater sonar capacity. More radically still, some researchers have now
developed software that essentially translates the world into music, which
can help blind people avoid obstacles, recognize facial expressions, and
even read letters.
How sonar works - and how humans can use it.
The basic idea of sonar is to send out a sound and then time how long it
takes for the sound to bounce back. That can give a sense of how far away
various objects are. Submarines do this, as do animals (in animals, it's
often referred to as "echolocation").
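To make the timing idea concrete, here is a minimal Python sketch of that
calculation (the 343 m/s figure is the approximate speed of sound in air; the
exact value varies with temperature):

    # A minimal sketch of the sonar timing idea: the echo's round-trip
    # time, times the speed of sound, halved, gives the distance.

    SPEED_OF_SOUND_M_S = 343.0

    def echo_distance(round_trip_s):
        """Distance to an object, given how long its echo took to return.
        The sound travels out and back, so halve the round trip."""
        return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

    # Example: an echo that returns after 20 milliseconds
    print(f"{echo_distance(0.020):.2f} m away")   # -> 3.43 m away

The division by two matters: the measured delay covers the trip out to the
object and the trip back.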
And a few humans have mastered the trick. Take Daniel Kish, who has been
blind since childhood and can echolocate by clicking his tongue. Using this
technique, he says that he can see objects fairly far away, as long as
they're at least the size of a softball. (Kish is president of the
non-profit World Access for the Blind foundation, which among other things
promotes the teaching of echolocation.)
And human echolocation has also attracted the attention of academic
researchers.
One group in Spain determined in 2010 that tongue clicking was more
successful than snapping or clapping. And in 2011, a study led by David
Whitney of the University of California at Berkeley found that six blind
echolocators with at least 10,000 hours of echolocation practice had a
spatial precision that was "comparable to that found in the visual periphery
of sighted individuals."
New technology could enhance human sonar further.
In 2009, a research collaboration including a group at the Polytechnic
University of Valencia, Spain, unveiled a helmet that takes real-time images
of the world, distills essential information from them, combines that with
depth data from a laser range-finder, and presents the result as audio
cues through headphones. This is essentially an enhanced version of human
sonar - one augmented by technology.
Similar projects have popped up elsewhere.
The SmartCane is a combination cane and ultrasound system that sells for
approximately $50 in India. And Tacit, a similar open-source project from
inventor Steve Hoefer, translates distance information into haptic
feedback - vibration on the user's hand.
Some of the most impressive research in this arena, meanwhile, comes from
the laboratory of Amir Amedi, a neuroscientist at The Hebrew University of
Jerusalem. Amedi's group has developed a program called EyeMusic that
essentially translates images into short musical pieces.
And in 2012, the group showed that blind people can use this program to read
letters and even recognize facial expressions - after only tens of hours of
training.
How does it work?
The software that Amedi's group has created scans an image from left to
right over the duration of the musical piece. The higher a pixel sits in the
image, the higher the pitch that is played, and different colors are
represented by different musical instruments. Here are some examples:
(no direct link to EyeMusic video available)
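To illustrate the mapping, here is a small Python sketch of an EyeMusic-style
scan. The pitch range, the color-to-instrument table, and the timing are
illustrative assumptions, not the values the actual EyeMusic software uses:

    # A sketch of an EyeMusic-style scan (not the actual EyeMusic code).
    # Assumptions: the image is a small grid of color names, columns are
    # scanned left to right over time, and row 0 is the top of the image.

    def row_to_frequency(row, n_rows, low_hz=220.0, high_hz=880.0):
        """Map an image row to a pitch: higher rows get higher pitches."""
        span = max(n_rows - 1, 1)
        fraction = 1.0 - row / span        # top row -> 1.0, bottom row -> 0.0
        return low_hz + fraction * (high_hz - low_hz)

    # Hypothetical color-to-instrument table, for illustration only.
    INSTRUMENT_FOR_COLOR = {"white": "choir", "blue": "trumpet",
                            "red": "organ", "yellow": "violin"}

    def image_to_notes(image, column_duration=0.2):
        """Scan the image left to right, one column per time step, emitting
        a (start_time, frequency_hz, instrument) event per lit pixel."""
        n_rows = len(image)
        events = []
        for col in range(len(image[0])):
            for row in range(n_rows):
                color = image[row][col]
                if color is not None:      # None means background: silence
                    events.append((col * column_duration,
                                   row_to_frequency(row, n_rows),
                                   INSTRUMENT_FOR_COLOR.get(color, "piano")))
        return events

    # A tiny 3x3 image: a white diagonal rising from bottom-left to top-right
    W = "white"
    image = [[None, None, W],
             [None, W,    None],
             [W,    None, None]]
    for start, freq, instrument in image_to_notes(image):
        print(f"t={start:.1f}s  {freq:5.1f} Hz  {instrument}")

Because the lit pixels climb toward the top of the image as the scan moves
right, the output is a rising three-note melody.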
EyeMusic is currently available as an app from the iTunes store, if you'd
like to check it out.
And interestingly, the study participants' brains used what had previously
been thought of as visual areas when performing these tasks. "The input is
arising through the ears, but then being delivered into the visual system,"
says Amedi. This suggests that these are task-related brain areas, not
vision-specialized ones.
In 2014, Amedi introduced the EyeCane, a small, handheld device that uses
two narrow infrared beams to detect nearby obstacles and translate them into
either sound or vibration - depending on the user's preference. It was
intuitive enough to require almost no training, and people could use it to
detect an open door about 15 feet away.
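As a rough illustration of how such a device might map distance to feedback
(this is a hypothetical sketch, not the EyeCane's actual firmware; the 5 m
range and the rate and intensity curves are assumptions), closer obstacles
could trigger faster beeps and stronger vibration:

    # Hypothetical EyeCane-style feedback: the nearer the obstacle, the
    # shorter the gap between beeps and the stronger the vibration.

    def distance_to_feedback(distance_m, max_range_m=5.0):
        """Map an obstacle distance to (beep_interval_s, vibration_strength),
        or None if nothing is within range."""
        if distance_m >= max_range_m:
            return None                              # nothing in range: silence
        closeness = 1.0 - distance_m / max_range_m   # 0.0 far ... 1.0 touching
        beep_interval = 1.0 - 0.9 * closeness        # 1.0 s far ... 0.1 s close
        return beep_interval, closeness              # vibration: 0% ... 100%

    for d in (4.5, 2.0, 0.3):
        interval, strength = distance_to_feedback(d)
        print(f"{d:.1f} m -> beep every {interval:.2f} s, vibration {strength:.0%}")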
Going further still: Using sound to see the world in ultraviolet.
This technology won't just benefit the blind. Amedi says his lab is also
exploring the possibilities of using this technology to help people "see"
through walls (by sensing infrared).
And artist Neil Harbisson has already used similar technology to give
himself essentially superhuman capabilities.
Harbisson was born completely color-blind, with only grayscale vision. But
he now has a sensor implanted in his skull that detects the colors of
nearby objects and translates them into different musical notes produced by
a chip in his head. That helps him "see" the colors of the world around him.
But the color sensor also picks up things that no human can naturally see -
light in the ultraviolet and infrared ranges. So this means, for example,
that Harbisson can sense the invisible infrared signal of a TV remote or
motion detector.
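One simple way to build such a mapping (an illustration, not Harbisson's
actual implant logic) is to compute the frequency of the incoming light and
transpose it down about 40 octaves into the audible range, so that infrared
comes out as low tones and ultraviolet as high ones:

    # Illustrative wavelength-to-tone mapping: light frequency transposed
    # down ~40 octaves lands in the range of human hearing.

    C = 299_792_458.0  # speed of light, m/s

    def wavelength_to_tone(wavelength_nm, octaves_down=40):
        """Return an audible frequency (Hz) for a light wavelength (nm)."""
        light_hz = C / (wavelength_nm * 1e-9)    # e.g. 650 nm red ~ 4.6e14 Hz
        return light_hz / (2 ** octaves_down)    # shift into audible range

    for name, nm in [("infrared", 900), ("red", 650), ("green", 530),
                     ("blue", 470), ("ultraviolet", 350)]:
        print(f"{name:11s} {nm} nm -> {wavelength_to_tone(nm):6.1f} Hz")

Under this assumed scheme, infrared lands below the visible colors and
ultraviolet above them.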
Here's his incredible TED talk from 2012:
https://www.youtube.com/watch?v=ygRNoieAnzI
So human sonar might not just help restore sight to the blind. One day, it
could end up allowing everyone else to see things they've never been able to
see before.
Source URL:
http://www.vox.com/2014/11/7/7171119/blind-sonar-echolocation
_______________________________________________
More information about the NFBF-L mailing list