[Artbeyondsightmuseums] Maps, mapping
fnugg at online.no
Sun Sep 29 05:46:19 UTC 2013
Calgary company makes perfect pitch for top prize at tech showcase
Invici (pronounced 'en VEE cee') sprang from Hagedorn's work on his
master's degree in geography at the University of Calgary, where he
learned how difficult maps can be for some people to use and began to
explore other ways to represent geographic features and information.
The name Invici stands for 'non-visual cartography'.
Now Hagedorn plans to change the lives of blind and visually impaired
people with a multi-touch computer interface that lets them interact
with on-screen information in totally new ways.
http://www.calgaryherald.com/health/Calgary+company+makes+perfect+pitch+prize+tech+showcase/8923266/story.html
Mapping a room in a snap
Blind people sometimes develop the amazing ability to perceive the
contours of the room they're in based only on auditory information. Bats
and dolphins use the same echolocation technique for navigating in their
environment. At EPFL, a team from the Audiovisual Communications
Laboratory (LCAV), under the direction of Professor Martin Vetterli, has
developed a computer algorithm that can accomplish this from a sound
that's picked up by four microphones. Their experiment is being
published this week in the /Proceedings of the National Academy of
Sciences/ (/PNAS/). "Our software can build a 3D map of a simple, convex
room with a precision of a few millimeters," explains PhD student Ivan
Dokmanić.
Randomly placed microphones
As incredible as it may seem, the microphones don't need to be carefully
placed. "Each microphone picks up the direct sound from the source, as
well as the echoes arriving from various walls," Dokmanić continues.
"The algorithm then compares the signal from each microphone. The
infinitesimal lags that appear in the signals are used to calculate not
only the distance between the microphones, but also the distance from
each microphone to the walls and the sound source."
http://esciencenews.com/articles/2013/06/17/mapping.a.room.a.snap
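The lag-comparison step described above can be illustrated with a toy sketch (this is not the EPFL team's published algorithm, just the underlying idea): a known pulse is cross-correlated with one microphone's recording, the correlation peaks give the arrival times of the direct sound and of an echo, and the time difference converts to extra path length via the speed of sound. The pulse shape, sample rate, and detection threshold here are all illustrative assumptions.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def echo_delays(recording, pulse, fs):
    """Return arrival times (in seconds) of copies of `pulse` in
    `recording`, found as local peaks of the cross-correlation."""
    corr = np.correlate(recording, pulse, mode="valid")
    # Treat local maxima above half the strongest peak as arrivals
    # (direct sound plus echoes) -- a crude but serviceable detector.
    threshold = 0.5 * corr.max()
    peaks = [i for i in range(1, len(corr) - 1)
             if corr[i] > threshold
             and corr[i] >= corr[i - 1] and corr[i] >= corr[i + 1]]
    return np.array(peaks) / fs

# Synthetic test: a click reaching the microphone directly at t = 10 ms
# and again, attenuated, via a wall at t = 25 ms.
fs = 48000
pulse = np.hanning(64)
recording = np.zeros(fs // 10)
for t, gain in [(0.010, 1.0), (0.025, 0.6)]:
    i = int(t * fs)
    recording[i:i + len(pulse)] += gain * pulse

t_direct, t_echo = echo_delays(recording, pulse, fs)
# The 15 ms lag means the wall path is about 5.1 m longer.
extra_path = SPEED_OF_SOUND * (t_echo - t_direct)
```

With several microphones, lags like these between channels are what constrain both the microphone geometry and the wall positions, as the article describes.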
Echolocating app will let you map a room with sound
Bats, dolphins and even some blind people use echoes to create a mental
3D map of their environment and where they are in it. A smartphone's
chirp could soon let us do the same.
Ivan Dokmanić and colleagues at the Swiss Federal Institute of
Technology (EPFL) in Lausanne have developed a system capable of
reconstructing the shape of a room -- and where you are in it -- using
echoes.
What's key to the trick, say human echolocators
<http://www.newscientist.com/article/mg20227031.400-echo-vision-the-man-who-sees-with-sound.html>,
is sensing the strong early reflections off the walls, rather than the
noisy, confusing mishmash of late-arriving, weaker echoes.
Hoping to computerise this process of 3D visualisation, the EPFL team
has developed a system capable of reconstructing the shape of a room
using these "first order" echoes. To do this, the researchers wrote an
echo-sorting algorithm which can discriminate between the first and
later echoes.
To test their plan they set up a loudspeaker and five microphones in
Lausanne cathedral. The speaker briefly emitted an audible chirp --
sweeping from 200 hertz to 10 kilohertz -- and the reflections were
analysed to successfully reveal the cathedral's 3D shape and the
location of the sound source (/PNAS/, DOI: 10.1073/pnas.1221464110).
http://www.newscientist.com/article/dn23709-echolocating-app-will-let-you-map-a-room-with-sound.html#.Uke7Ur44WRs
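The chirp experiment can be caricatured in a few lines (a hedged sketch, not the published echo-sorting algorithm): a linear sweep from 200 Hz to 10 kHz is matched-filtered against a simulated recording, which compresses the long chirp and its single first-order echo into sharp, separable peaks. The one-echo room model and the gains are made-up minimal assumptions.

```python
import numpy as np

fs = 48000
T = 0.05                      # 50 ms sweep
t = np.arange(int(T * fs)) / fs
f0, f1 = 200.0, 10_000.0
# Linear chirp sweeping 200 Hz -> 10 kHz, as in the cathedral test.
chirp = np.sin(2 * np.pi * (f0 * t + 0.5 * (f1 - f0) / T * t**2))

# Simulated microphone signal: direct path plus one first-order echo.
sig = np.zeros(int(0.2 * fs))
sig[0:len(chirp)] += chirp                  # direct arrival at t = 0
d = int(0.030 * fs)                         # echo arrives 30 ms later
sig[d:d + len(chirp)] += 0.5 * chirp

# Matched filtering (correlating against the known chirp) compresses
# each overlapping arrival into a sharp peak, so echoes that smear
# together in the raw recording become individually resolvable.
mf = np.correlate(sig, chirp, mode="valid")
direct_idx = int(np.argmax(mf))
mf_masked = mf.copy()
mf_masked[max(0, direct_idx - 100):direct_idx + 100] = 0  # hide direct peak
echo_idx = int(np.argmax(mf_masked))
echo_delay = (echo_idx - direct_idx) / fs   # recovered 30 ms lag
```

Note that the echo here begins before the direct chirp has even finished playing; pulse compression is what makes the two arrivals separable despite the overlap.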
Computer algorithm uses echoes to create 'virtual' room maps
A new computer algorithm that can give humans the ability to map their
environments with sound could lead to an app to aid blind people, Swiss
researchers say.
Some animals such as bats, whales and dolphins use echolocation --
emitting a sound and listening to the echo -- to create a mental map of
their environment, and some blind people have learned to use finger
snaps or tongue clicks to create a rough equivalent, they said.
http://www.upi.com/Science_News/Technology/2013/06/18/Computer-algorithm-uses-echoes-to-create-virtual-room-maps/UPI-50171371594122/
A New Echolocation Algorithm Can Map Spaces Based on Sound Alone
http://gizmodo.com/a-new-echolocation-algorithm-can-map-spaces-based-on-so-561634511
How software works out room shape
http://www.iol.co.za/scitech/technology/software/how-software-works-out-room-shape-1.1537948#.Uke8Rr44WRs
Blind Maps: Braille Navigation System Concept
A couple of years ago we learned about Plan.B
<http://technabob.com/blog/2010/11/02/plan-b-concept-map-for-the-blind/>, a
concept for a map for blind people. The idea behind that system was
sound, but I thought the execution left much to be desired. I like this
other concept called Blind Maps
<http://www.industrialdesignserved.com/gallery/Blind-Maps/2951161> much
more. It's supposed to be a Bluetooth add-on for the iPhone that
provides Braille-like turn-by-turn navigation.
http://technabob.com/blog/2013/02/22/blind-maps-braille-navigation/
Plan.B Concept: a Map for the Blind
I'm amazed every time I see blind people walking around on their own
using only a cane to guide them. But what if there was also a way to
make the sight-impaired "see" the surrounding geography? That's the idea
behind Robert Richter's concept device, plan.b. Plan.b is a digital
device that applies a simplified version of the Braille system, making
tactile versions of maps.
http://technabob.com/blog/2010/11/02/plan-b-concept-map-for-the-blind/
Blind Maps
http://www.industrialdesignserved.com/gallery/Blind-Maps/2951161
Autonomous Vehicle Technology Could Help Blind to Navigate
http://www.digitaljournal.com/pr/1476013
Blind Robot Gets Around
http://inventorspot.com/articles/blind_robot_gets_around
Biologists at LMU have demonstrated that people can acquire the capacity
for echolocation
<http://medicalxpress.com/news/2013-08-humans-echo-suppression-differently.html>,
although it does take time and work.
As blind people can testify, we humans can hear more than one might
think. The blind learn to navigate by the echoes of sounds they
themselves make. This enables them to sense the locations of walls
and corners, for instance: by tapping the ground with a stick or making
clicking sounds with the tongue, and analyzing the echoes reflected from
nearby surfaces, a blind person can map the relative positions of
objects in the vicinity. LMU biologists led by Professor Lutz Wiegrebe
of the Department of Neurobiology (Faculty of Biology) have now shown
that sighted people can also learn to echolocate objects in space, as
they report in the biology journal /Proceedings of the Royal Society B/.
Wiegrebe and his team have developed a method for training people in the
art of echolocation. With the help of a headset consisting of a
microphone and a pair of earphones, experimental subjects can generate
patterns of echoes that simulate acoustic reflections in a virtual
space: the participants emit vocal clicks, which are picked up by the
microphone and passed to a processor that calculates the echoes of a
virtual space within milliseconds. The resulting echoes are then played
back through the earphones. The trick is that the transformation applied
to the input depends on the subject's position in virtual space. So the
subject can learn to associate the artificial "echoes" with the
distribution of sound-reflecting surfaces in the simulated space.
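The virtual-echo playback described above can be sketched as a 1-D toy model (one listener, one wall; not the LMU apparatus, whose room simulation is far richer): the recorded click is delayed by the round-trip travel time to the wall, attenuated, and mixed back with the original before being played to the subject. Distances, gains, and the click shape are illustrative assumptions.

```python
import numpy as np

fs = 44100
c = 343.0  # speed of sound, m/s

def virtual_echo(click, listener_x, wall_x, fs=fs):
    """Return the click plus the echo it would produce from a single
    wall at wall_x, heard by a listener at listener_x (1-D toy model)."""
    dist = 2 * abs(wall_x - listener_x)        # out to the wall and back
    delay = int(round(dist / c * fs))          # echo delay in samples
    gain = 1.0 / max(dist, 1.0)                # crude spreading loss
    out = np.zeros(len(click) + delay)
    out[:len(click)] += click                  # the click itself
    out[delay:delay + len(click)] += gain * click  # its wall echo
    return out

click = np.hanning(32)
heard = virtual_echo(click, listener_x=1.0, wall_x=3.0)
# A wall 2 m away gives a 4 m round trip, i.e. an ~11.7 ms echo.
```

Because the delay and gain depend on the listener's position, moving in the virtual space changes what the subject hears, which is what lets the association between echoes and geometry be learned.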
*A dormant skill*
"After several weeks of training, the participants in the experiment
were able to locate the sources of echoes pretty well. This shows that
anyone can learn to analyze the echoes of acoustic signals to obtain
information about the space around him. Sighted people have this ability
too; they simply don't need to use it in everyday situations," says Lutz
Wiegrebe. "Instead, the auditory system actively suppresses the
perception of echoes, allowing us to focus on the primary acoustic
signal, independently of how the space alters the signals on its way to
the ears." This makes it easier to distinguish between different sound
sources, allowing us to concentrate on what someone is saying to us, for
example. The new study shows, however, that it is possible to
functionally invert this suppression of echoes, and learn to use the
information they contain for echolocation instead.
In the absence of visual information, we and most other mammals find
navigation difficult. So it is not surprising that evolution has endowed
many mammalian species with the ability to "read" reflected sound waves.
Bats and toothed whales, which orient themselves in space primarily by
means of acoustic signals, are the best known.
http://phys.org/wire-news/139222561/echolocation-playing-it-by-ear.html
Smartphone-like device that maps surroundings to aid blind
A smartphone-like gadget that senses an entire room's features, builds a
virtual map of it and communicates this to the user may one day replace
the humble white cane in helping the blind sense their surroundings.
Using special multi-sensor array technology, the Indoor Navigation
Project will enable the blind to sense their surroundings beyond the
cane's tip, researchers said.
Project leader Dr Iain Murray from Curtin University said the gadget
would resemble a smartphone and would sense an entire room's features,
build a virtual map of it and communicate this to the user.
http://www.business-standard.com/article/pti-stories/smartphone-like-device-that-maps-surroundings-to-aid-blind-113072800411_1.html