[Art_beyond_sight_educators] Art Beyond Sight Awareness Month Alert 3
Lisa.Yayla at statped.no
Tue Oct 15 13:07:42 UTC 2013
Forgot to mention, forwarding - from Marie Clapot
Art Beyond Sight Awareness Month
CALLING ALL CALENDAR ENTRIES
This is our final call for entries to
be included in the Art Beyond
Sight Calendar! Send us your
organization's name, event date,
time, location, and contact if
pre-registration is required. The
calendar is available on our
Promoting Access to the Arts for All
Building an Inclusive Society
Seeing with Sound
Written by Amanda Treco
Personally, I would never ride a bike in New York City
traffic, but some people will not be so easily deterred. Daniel Kish
confidently swerves through city streets with ease. This is even
more intriguing because Kish was born blind. Today, he has the
ability to avoid traffic, hike and bike through the mountains and
play various sports without the help of any traditional aids or devices
for the visually impaired or blind.
Kish, the Executive Director of World Access for the Blind,
taught himself to "see" using palate clicks when he
was still a young child, a technique called echolocation.
Kish has taught this technique to others, both old and young,
often with success. It might sound
outlandish that humans can learn to see with their
ears just like bats, whales, dolphins, or shrews, but
most humans, blind or sighted, can actually learn the
echolocation technique to sense objects in their surroundings
by using echoes produced by palate clicks.
Training can begin in childhood, through the use of
games to improve sensitivity to sound.
Researchers have found that training in echolocation can
improve awareness and broaden spatial and perceptual horizons
to include things that people never thought they had the ability to
sense without sight. "The Exotic Sensory Capabilities of Humans,"
published in The Psychologist in 2012, describes how a skill as
unusual as echolocation actually has "the very adaptive use to
acclimate to changes in context and availability of perceptual information
for a common function." The article "Human Echolocation:
Blind And Sighted Persons' Ability to Detect Sounds Recorded In
The Presence Of A Reflecting Object" published in Perception,
discusses how people with sight tend to use light to build spatial
conceptions, which is why it may be harder for sighted people to
grasp a spatial perception built on the sounds
in their environment. By clicking their tongue, a person can
learn to identify objects and textures in their surroundings using
the echoes made from the clicks.
Although humans don't have a hearing system as precise
and specialized as bats do, there is still the potential to learn how
to build spatial conceptions using echolocation. Dr. Thomas Stoffregen,
a Professor of Kinesiology at the University of Minnesota,
describes the difference between bats and humans' abilities in
using echolocation by explaining how, "low frequency sound provides
less information about the shapes and sizes of things. High
frequency sound (e.g., ultrasound) provides precise information
about shape and size, but it does not travel very far; it is rapidly
absorbed by the air. It's a trade-off. Bats solved the
trade-off by, in effect, shouting really loud all the
time. Blind people who use echolocation
(humans cannot make or hear ultrasound) don't
need to shout or tap as often as bats, but they
cannot achieve the same level of perceptual
precision." Sound travels through air at roughly
340 meters per second, and from the small
differences in when echoes return we manage to
distinguish between the distances of sounds,
even if we remain unaware of how we perceive them.
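The arithmetic behind that distance judgment is simple: an echo travels out and back, so the distance to the reflecting object is the speed of sound times half the delay. A minimal sketch (the function name and the 343 m/s figure for dry air at 20 °C are illustrative assumptions, not from the article):

```python
SPEED_OF_SOUND = 343.0  # meters per second, dry air at about 20 degrees C

def distance_from_echo(delay_s: float) -> float:
    """Estimate the distance (in meters) to a reflecting object
    from the round-trip delay (in seconds) between a click and
    its echo: the sound covers the distance twice."""
    return SPEED_OF_SOUND * delay_s / 2.0

# A wall about 3 m away returns an echo after roughly 17.5 ms:
print(round(distance_from_echo(0.0175), 2))  # → 3.0
```

An echolocator does not compute this consciously, of course; the point is that millisecond-scale delays carry meter-scale distance information.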
Despite the fact that humans' abilities are not
quite as precise as a bat's or a dolphin's, in "Humans Can Learn to
"See" With Sound" for National Geographic News, the scientist
Juan Antonio Martínez states that even though there are echolocation
aids such as flash sonar devices, the
palate clicking technique is the most effective method of echolocation.
The reasoning, according to Martínez, is that "such devices
are worse than natural echolocation at present, because they don't
reproduce the complete haptic [touch] perception of the echoes."
The fact that humans who seemingly do not have access to all five
of the commonly recognized senses actually have the capability to
navigate through this world without relying on the help of technological
devices is quite a liberating revelation.
Ravilious, Kate. "Humans Can Learn to "See" With Sound, Study Says." National Geographic.
National Geographic Society, 6 July 2009. Web. 29 July 2013.
Rosenblum, Lawrence D., and Michael S. Gordon. "The Exotic Sensory Capabilities Of
Humans." The Psychologist 25.12 (2012): 904-907. PsycINFO. Web. 29 July 2013.
Schenkman, Bo N., and Mats E. Nilsson. "Human Echolocation: Blind And Sighted
Persons' Ability to Detect Sounds Recorded In The Presence Of A Reflecting Object."
Perception 39.4 (2010): 483-501. PsycINFO. Web. 29 July 2013.
"Speed of Sound." ScienceDaily. ScienceDaily, n.d. Web. 17 Aug. 2013.
INTERVIEW WITH DR. THOMAS STOFFREGEN
Dr. Thomas Stoffregen is a Professor of Kinesiology at
the University of Minnesota. He is interested in perception
and action, human factors, control of posture and
orientation, and ecological psychology. He has participated
in research on echolocation. In his article "On
Specification and the Senses," written with Benoît
G. Bardy, a Professor of Human Movement Science,
Stoffregen challenges the "assumption that
perception is divided into separate domains of vision,
hearing, touch, taste, and smell". I interviewed him to
find out more about his experiences and thoughts on the
topic of echolocation.
ABS: What does the work on action and perception, particularly intermodal
perception, have to do with echolocation?
DR. STOFFREGEN: Echolocation is a form of perception. When a blind person taps
their long cane (that is what it is called, not a cane) on the ground, they are using
action (arm movements) to improve their perception. Logically, echolocation is the
same: when an animal generates sounds that bounce off the environment and return to
the ear, the act of sound generation is being used to improve perception.
Intermodal perception is more complex. Bats use echolocation while in flight. Body movement, such as flight, creates
changes in perception. Think about walking; as you set your feet on the sidewalk, the soles of your feet are stimulated by
that physical contact, and the force and timing of those "sole pulses" are related to how fast and hard you are walking.
As the bat maneuvers, the wings beat, the body twists and turns--all these actions stimulate skin, joints, and the vestibular
system (inner ear). If those systems are not stimulated, it means the bat is stationary. Echolocation provides
information about *relative* motion; in the "sound field", "me moving toward bug" is equal to "bug moving toward me".
Stimulation of the body (as described above) tells the bat whether (and how) it is moving. So, the intermodal relation between sound and "feel" allows the bat to know *who* is moving, which is pretty important.
ABS: What do you think are the limitations of echolocation?
DR. STOFFREGEN: Low frequency sound provides less information about the shapes and sizes of things. High
frequency sound (e.g., ultrasound) provides precise information about shape and size, but it does not travel very far; it is rapidly absorbed by the air. It's a trade-off. Bats solved the trade-off by, in effect, shouting really loud all the time. Blind people who use echolocation (humans cannot make or hear ultrasound) don't need to shout/tap as often as bats, but they cannot achieve the same level of perceptual precision.
ABS: Can you speculate on the potential of echolocation for the future?
DR. STOFFREGEN: Several groups of researchers are working to develop echolocation-like systems for vehicles so that, for example, self-driving cars may be able to navigate, avoid collisions, etc. Other people are developing bat-like echolocation systems to aid the blind. Not all blind people use echolocation, and no humans can use ultrasound. So, a gizmo that generates ultrasound and can process the echoes could provide a useful navigation aid.
Imagine, directed and written
by Andrzej Jakimowski in
2012 and starring Edward Hogg, Alexandra Maria Lara, and Melchior Derouet, centers
on a blind teacher, Ian (Hogg), who breaks the rules by integrating
echolocation into his lessons to help his students learn to move with ease through their
lives. This romantic drama is a must-see, and a unique and powerful illustration of echolocation in practice.
ART BEYOND SIGHT
New York, NY 10012
FILM ON ECHOLOCATION