[Ohio-Talk] Fwd: [ossb] Visually impaired, autonomous vehicles and an assistance app.

President Capital chapter president.capital.nfboh at gmail.com
Sun Feb 7 21:23:33 UTC 2021


Hello,

I saw this on another chat list, and thought it might be of interest to everybody. Sounds pretty cool.

Annette

Annette Lutz
President
Capital Chapter of the National Federation of the Blind of Ohio
614-288-4323
President.capital.nfboh at gmail.com

The National Federation of the Blind knows that blindness is not the characteristic that defines you or your future. Every day we raise the expectations of blind people, because we know that low expectations create obstacles between blind people and our dreams. You can live the life you want; blindness is not what holds you back.

Begin forwarded message:

From: Tracy Duffy <tlduffy1962 at gmail.com>
Date: February 7, 2021 at 4:13:39 PM EST
Subject: [ossb] Visually impaired, autonomous vehicles and an assistance app.
Reply-To: ossb at groups.io


Visually impaired accessible technology - BingNews - Friday, January 29, 2021 at 4:18 PM

App will help visually impaired, seniors enjoy ride-sharing with self-driving cars



Self-driving cars promise to bring a suite of modern conveniences to ride-sharing and ride-hailing. However, many people with visual impairments who use these services currently rely on a human driver to help them locate their vehicle safely.

A research group led by the Virtual Environments and Multimodal Interaction Laboratory (VEMI Lab) at the University of Maine is developing a smartphone app that provides the navigational assistance people with disabilities and seniors need to enjoy ride-sharing and ride-hailing, collectively termed mobility-as-a-service, with the latest in automotive technology. The app, known as the Autonomous Vehicle Assistant (AVA), can also be used with standard vehicles operated by human drivers, and is designed to benefit everyone.

AVA will help users request, find and enter a vehicle using a multisensory interface that provides guidance through audio and haptic feedback and high-contrast visual cues. The Autonomous Vehicle Research Group (AVRG), a cross-institutional collective led by VEMI Lab with researchers from Northeastern University and Colby College, will leverage GPS technology, real-time computer vision via the smartphone camera and artificial intelligence to support the functions offered through the app.
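
The release stays at this high level, so purely as an illustration of the idea, here is a minimal Python sketch of how coarse GPS bearings and a camera-based detection flag might be fused into combined audio, haptic and visual cues. Every name, field and threshold below is an assumption made for illustration; none of it comes from the actual AVA design.

import math
from dataclasses import dataclass

@dataclass
class Cue:
    # One guidance cue delivered over several channels at once (hypothetical structure).
    spoken: str        # text handed to a text-to-speech engine
    vibration_ms: int  # haptic pulse length in milliseconds
    overlay: str       # label for a high-contrast visual overlay

def bearing_to_vehicle(user_lat, user_lon, veh_lat, veh_lon):
    # Approximate compass bearing (degrees) from the user to the vehicle.
    d_lon = math.radians(veh_lon - user_lon)
    lat1, lat2 = math.radians(user_lat), math.radians(veh_lat)
    x = math.sin(d_lon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(d_lon)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

def guidance_cue(user_pos, vehicle_pos, vehicle_in_camera_view):
    # Blend coarse GPS guidance with confirmation from the smartphone camera.
    bearing = bearing_to_vehicle(*user_pos, *vehicle_pos)
    compass = ["north", "northeast", "east", "southeast",
               "south", "southwest", "west", "northwest"][round(bearing / 45) % 8]
    if vehicle_in_camera_view:
        # The camera has located the vehicle: give a short, confident cue.
        return Cue("Your vehicle is directly ahead.", 200, "vehicle outline")
    # Otherwise fall back to GPS-only guidance.
    return Cue(f"Head {compass} toward your vehicle.", 50, "path line")

print(guidance_cue((44.9012, -68.6672), (44.9015, -68.6665), False).spoken)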

The U.S. Department of Transportation awarded $300,000 to AVRG for the AVA project through its Inclusive Design Challenge. The initiative sought proposals for design solutions that would help people with disabilities use autonomous vehicles for employment and essential services. AVRG was one of the semifinalists. 

"This design challenge was exciting to us as it falls so squarely in our wheelhouse" says Nicholas Giudice, a professor of spatial Computing at UMaine. "We have worked in the areas of multimodal information access and navigation for visually impaired people and older adults for years, and have recently started a research program investigating human-vehicle collaborations for increasing the trustworthiness and accessibility of autonomous vehicles. This development project connects the dots by allowing us to bridge several areas of expertise to ensure that the technology of the future is 'accessible for all.'"

Users will create a profile in AVA that reflects their needs and existing methods of navigation. The app will use the information in their profiles to identify a suitable vehicle for transport and determine whether one is available.
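
As another hedged illustration, a profile-to-vehicle matching step might look roughly like the Python below. The release does not describe AVA's actual profile schema or fleet data, so every field and name here is invented.

from dataclasses import dataclass

@dataclass
class RiderProfile:
    # Hypothetical preference fields; AVA's real profile contents are not published.
    name: str
    needs_audio_guidance: bool = True
    needs_wheelchair_ramp: bool = False

@dataclass
class Vehicle:
    vehicle_id: str
    has_audio_prompts: bool = False
    has_wheelchair_ramp: bool = False
    available: bool = False

def find_suitable_vehicle(profile, fleet):
    # Return the first available vehicle that satisfies the rider's stated needs.
    for vehicle in fleet:
        if not vehicle.available:
            continue
        if profile.needs_audio_guidance and not vehicle.has_audio_prompts:
            continue
        if profile.needs_wheelchair_ramp and not vehicle.has_wheelchair_ramp:
            continue
        return vehicle
    return None  # nothing suitable is currently available

fleet = [Vehicle("AV-1", has_audio_prompts=True, available=False),
         Vehicle("AV-2", has_audio_prompts=True, available=True)]
match = find_suitable_vehicle(RiderProfile("example rider"), fleet)
print(match.vehicle_id if match else "No vehicle available")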

When the vehicle arrives, AVA will guide the user to it using the smartphone camera and augmented reality (AR): the app superimposes high-contrast lines over the live camera image to highlight the path, and supplements the overlay with verbal guidance such as compass directions, street names, addresses and nearby landmarks. The app will also pinpoint environmental hazards, such as low-contrast curbs, by emphasizing them with contrasting lines and vibrating as users approach them. It will then help users find the door handle of the waiting vehicle.
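
The hazard alerts described above could, in a very rough sketch, map the estimated distance to a detected obstacle onto vibration strength and a spoken cue. The thresholds, wording and field names in this Python snippet are invented for illustration only and are not part of the AVA project.

def hazard_alert(distance_m, alert_radius_m=3.0):
    # Map the distance to a detected hazard (e.g., a low-contrast curb) onto
    # haptic and spoken cues. All values here are assumptions.
    if distance_m > alert_radius_m:
        return None  # hazard is too far away to warrant an alert
    # Vibrate more strongly the closer the user gets to the hazard.
    intensity = 1.0 - (distance_m / alert_radius_m)  # 0.0 (far) .. 1.0 (at the hazard)
    spoken = ("Curb directly ahead." if distance_m < 1
              else f"Curb ahead, about {distance_m:.0f} meters.")
    return {"vibration_intensity": round(intensity, 2),
            "spoken": spoken,
            "overlay": "high-contrast hazard outline"}

for d in (5.0, 2.5, 0.5):
    print(d, hazard_alert(d))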

"This is the first project of its kind in the country, and in combination with our other work in this area, we are addressing an end-to-end solution for AVs (autonomous vehicles) that will improve their accessibility for all," says Giudice, chief research scientist at VEMI Lab and lead on the AVA project. "Most work in this area only deals with sighted passengers, yet the under-represented driving populations we are supporting stand to benefit most from this technology and are one of the fastest growing demographics in the country." 

AVRG studies how autonomous vehicles can meet various accessibility needs. VEMI lab itself has explored tactics for improving consumer trust in this emerging technology.

AVA advances both groups' endeavors: it not only gives people with visual impairments, other disabilities and seniors another means of accessing self-driving vehicles, but also increases their trust in them. The project builds on a seed grant-funded joint effort between UMaine and Northeastern University to improve accessibility, safety and situational awareness within the self-driving vehicle. Researchers from both universities aim to develop a new model of human-AI vehicle interaction that ensures people with visual impairments and seniors understand what the autonomous vehicle is doing and that the vehicle can sense, interpret and communicate with the passenger.

The app will offer modules that train users to order and locate rides, particularly through mock pickup scenarios. According to the researchers, this hands-on learning gives users confidence both in themselves and in the technology. It also gathers data that AVRG can use in the iterative, ongoing development of AVA and its integration into autonomous vehicles.

"We are very excited about this opportunity to create accessible technology which will help the transition to fully autonomous vehicles for all. The freedom and independence of all travelers is imperative as we move forward," says VEMI lab director Richard Corey. 

###

VEMI Lab, co-founded by Corey and Giudice in 2008, explores technological solutions to unmet challenges. Its prime areas of research and development are self-driving vehicles, the design of bio-inspired tools to improve human-machine interaction and functionality, and new technology to improve environmental awareness, spatial learning and navigational wayfinding.



http://www.bing.com/news/apiclick.aspx?ref=FexRss&aid=&tid=F9B1D36A3B7243C399B4247FC133B5E8&url=http%3a%2f%2fwww.eurekalert.org%2fpub_releases%2f2021-01%2fuom-awh012921.php&c=291640283433761756&mkt=en-us





