[AutonomousVehicles] Waymo's Self-Driving Car Crashed Because its Human Driver Fell Asleep at the Wheel

Cheryl Orgas & William Meeker meekerorgas at ameritech.net
Thu Oct 4 18:20:35 UTC 2018


Colleagues,

In addition to blind pedestrians becoming Autonomous Vehicle Crash Victims,
I fear the liability, negative publicity, and stereotyping that blind
drivers, and all blind people, will face when a vehicle they are driving
becomes involved in any type of collision, regardless of cause or fault.
Fortunately, all crashes to date have involved sighted drivers, safety
drivers, and victims, which carries no stigma for the blind or the sighted.
As the article below states, "One thing everyone working on driverless cars
agrees on is that humans are bad drivers."

But at the first crash involving a blind driver, the above statement will be
forgotten and the thought will be "Blind humans are worse drivers."

I believe blind individuals and constituency organizations must be prepared
for the blowback the first time a blind driver's vehicle crashes into
anything at any time, under any conditions.  Have we discussed this issue
yet?

Donning my "Future Autonomous Vehicle Crash Victims of America" T-shirt with
hope,

Bill Meeker

Waymo's Self-Driving Car Crashed Because its Human Driver Fell Asleep at the
Wheel

By Alison Griswold, Quartz <http://qz.com/> | October 3, 2018

The dozing driver didn't respond to any of the vehicle's warnings.

In June, one of Waymo's self-driving Chrysler Pacifica minivans crashed on
the freeway outside of the company's office in Mountain View, California,
after its lone safety driver fell asleep at the wheel.

Tech news site The Information, which first reported the crash
<https://www.theinformation.com/articles/waymo-collision-shows-flaws-in-self-driving-car-tests?shared=82d7c8e62c2bdd94>,
said the human driver manning the vehicle "appeared to doze off" after
about an hour on the road, according to two people familiar with the
matter. The safety driver unwittingly turned off the car's self-driving
software by touching the gas pedal. He failed to assume control of the
steering wheel, and the Pacifica crashed into the highway median.

The dozing driver didn't respond to any of the vehicle's warnings,
including a bell signaling that the car was in manual mode and another
audio alert, The Information reported. He regained alertness once the car
crashed, then turned around and headed back to the Mountain View office. He
no longer works for Waymo.

Waymo got lucky with the accident. The safety driver wasn't hurt and no
other vehicles were involved. Waymo reported the vehicle sustained "moderate
damage to its tire and bumper." The company told The Information in a
statement that it is "constantly improving our best practices, including
those for driver attentiveness, because the safe and responsible testing of
our technology is integral to everything we do."

Improvements in this case meant altering night-shift protocol to have two
safety drivers instead of one, to guard against someone nodding off at the
wheel. At a company meeting to discuss the incident, one attendee reportedly
asked whether safety drivers were on the road too long, and was told that
drivers can take a break whenever they need to.

Waymo is pursuing fully self-driving software that wouldn't require any
intervention from humans, in contrast to automakers like Tesla and General
Motors, which have started with selectively automated features to assist
human drivers. As Waymo has gotten closer to true autonomy, it has also
tried to reduce its reliance on human safety drivers by, for example,
cutting the number of safety drivers in a test vehicle from two to one.
Waymo plans to launch a commercial ride-hail service with driverless cars
<https://qz.com/1208897/alphabets-waymo-googl-is-readying-a-ride-hailing-service-in-arizona-that-could-directly-compete-with-uber/>
in the Phoenix area this year.

After a self-driving Uber struck and killed a pedestrian in Tempe, Arizona,
in March, one point of focus was Uber's safety-driver policies. Jalopnik
pointed out
<https://jalopnik.com/just-about-everyone-uses-two-safety-drivers-when-testin-1823984330>
that "almost everyone" (Toyota, Nissan, Ford's Argo AI) uses two people to
test self-driving cars. In the Uber Volvo that crashed, by contrast,
Rafaela Vasquez was working as a lone safety driver at night. Police later
found that she was streaming
<https://www.reuters.com/article/us-uber-selfdriving-crash/uber-cars-safety-driver-streamed-tv-show-before-fatal-crash-police-idUSKBN1JI0LB>
The Voice on her phone at the time of impact.

One thing everyone working on driverless cars agrees on is that humans are
bad drivers. People from Waymo CEO John Krafcik to disgraced former Uber
engineer Anthony Levandowski (try finding a more diametrically opposed
pair) like to talk about how driverless cars will save lives by eliminating
thousands of preventable highway fatalities
<https://www.consumerreports.org/autonomous-driving/faster-rollout-self-driving-cars-would-save-lives/>
a year.

It is baffling, then, that these companies trust the very humans they seek
to unseat to watch over their adolescent technology, alone and for hours on
end. One safety driver for an autonomous-vehicle company once described to
me working 10- to 11-hour shifts unaccompanied, including nights that began
in the early evening and ended well past midnight. Drivers could take
breaks whenever they wanted, this person said, but it was still a challenge
to stay focused for that long without anyone to talk to, or much to do
beyond watching the road.

A few months after the Tempe accident, Uber laid off most of its
self-driving car operators
<https://qz.com/1326155/uber-has-terminated-its-self-driving-car-operators-in-pittsburgh/>
in Pittsburgh and San Francisco. Uber said it would replace these people
with "mission specialists" trained to monitor its cars on roads and on
specialized test tracks. These mission specialists are supposed to be more
involved in the actual development of the cars, tasked with tracking,
documenting, and triaging any issues that might crop up. Per a current job
listing
<https://www.uber.com/careers/list/34593/?iis=uber.com/careers&iisp=he-6594079>,
they should have "the ability to operate independently with little or no
supervision."

There is a great essay by reporter Tim Harford
<https://www.theguardian.com/technology/2016/oct/11/crash-how-computers-are-setting-us-up-disaster>
about how our quest to automate all things may be setting us up for
disaster. The more we let computers fly planes, drive cars, operate
machinery, and so on, the less time the people we've put in place as backup
(pilots, safety drivers, and other operators) have to practice their
skills, and the greater the odds they'll be unprepared in a true emergency.
This problem is known as the paradox of automation, and it applies to
benign problems as well, like how we struggle to remember phone numbers
that are stored in our mobile devices, or to do mental math that we could
punch into a calculator. Like any skill, these need to be practiced to be
maintained, and they become rusty with disuse. Instead of designing
technology for humans to babysit, Harford wonders, why aren't we making
technology that babysits humans?
