<html xmlns:v="urn:schemas-microsoft-com:vml" xmlns:o="urn:schemas-microsoft-com:office:office" xmlns:w="urn:schemas-microsoft-com:office:word" xmlns:m="http://schemas.microsoft.com/office/2004/12/omml" xmlns="http://www.w3.org/TR/REC-html40"><head><meta http-equiv=Content-Type content="text/html; charset=us-ascii"><meta name=Generator content="Microsoft Word 14 (filtered medium)"><style><!--
/* Font Definitions */
@font-face
{font-family:Calibri;
panose-1:2 15 5 2 2 2 4 3 2 4;}
@font-face
{font-family:Roboto;}
/* Style Definitions */
p.MsoNormal, li.MsoNormal, div.MsoNormal
{margin:0in;
margin-bottom:.0001pt;
font-size:11.0pt;
font-family:"Calibri","sans-serif";}
a:link, span.MsoHyperlink
{mso-style-priority:99;
color:blue;
text-decoration:underline;}
a:visited, span.MsoHyperlinkFollowed
{mso-style-priority:99;
color:purple;
text-decoration:underline;}
p
{mso-style-priority:99;
margin-top:0in;
margin-right:0in;
mso-margin-bottom-alt:auto;
margin-left:0in;
font-size:12.0pt;
font-family:"Times New Roman","serif";}
span.EmailStyle17
{mso-style-type:personal-compose;
font-family:"Calibri","sans-serif";
color:windowtext;}
p.content-byline, li.content-byline, div.content-byline
{mso-style-name:content-byline;
mso-style-priority:99;
margin:0in;
margin-bottom:.0001pt;
font-size:12.0pt;
font-family:"Times New Roman","serif";}
span.authors-multiple2
{mso-style-name:authors-multiple2;
color:#71BF44;
text-transform:uppercase;
border:none windowtext 1.0pt;
padding:0in;
text-decoration:none none;}
span.author-organization3
{mso-style-name:author-organization3;
color:black;
font-weight:normal;}
span.author-organization-sep2
{mso-style-name:author-organization-sep2;
display:none;}
span.content-byline-date-sep2
{mso-style-name:content-byline-date-sep2;
display:none;}
.MsoChpDefault
{mso-style-type:export-only;
font-family:"Calibri","sans-serif";}
@page WordSection1
{size:8.5in 11.0in;
margin:1.0in 1.0in 1.0in 1.0in;}
div.WordSection1
{page:WordSection1;}
--></style><!--[if gte mso 9]><xml>
<o:shapedefaults v:ext="edit" spidmax="1026" />
</xml><![endif]--><!--[if gte mso 9]><xml>
<o:shapelayout v:ext="edit">
<o:idmap v:ext="edit" data="1" />
</o:shapelayout></xml><![endif]--></head><body lang=EN-US link=blue vlink=purple><div class=WordSection1><p class=MsoNormal>Fellow Committee Members,<o:p></o:p></p><p class=MsoNormal><o:p> </o:p></p><p class=MsoNormal>FYI about a work in progress. Because a sighted &#8220;driver&#8221; was in the vehicle at the time of the fatal crash, the consequences are less catastrophic than they would have been with a blind &#8220;driver.&#8221;<o:p></o:p></p><p class=MsoNormal><o:p> </o:p></p><p class=MsoNormal>In my opinion, it is incumbent upon us blind pedestrians to listen defensively.<o:p></o:p></p><p class=MsoNormal><o:p> </o:p></p><p class=MsoNormal>William Meeker<o:p></o:p></p><p class=MsoNormal><o:p> </o:p></p><p class=MsoNormal><o:p> </o:p></p><p class=MsoNormal><b><span lang=EN style='font-size:15.0pt;color:black'>Uber&#8217;s Self-Driving Car Didn&#8217;t Malfunction, It Was Just Bad</span></b><o:p></o:p></p><p class=content-byline><b><span lang=EN style='font-size:7.5pt;font-family:Roboto;color:#999999'>By </span></b><span class=authors-multiple2><b><span lang=EN style='font-size:7.5pt;font-family:Roboto'><a href="https://www.nextgov.com/voices/alexis-madrigal/6700/?oref=ng-post-author?oref=rf-post-author"><span style='color:#71BF44'>Alexis Madrigal</span></a></span></b></span><span class=author-organization-sep2><span lang=EN style='font-size:7.5pt;font-family:Roboto'>,</span></span><span lang=EN style='font-size:7.5pt;font-family:Roboto;color:black'><br><span class=author-organization3><a href="http://www.theatlantic.com/" target="_blank"><span style='color:black'>The Atlantic</span></a></span></span><b><span lang=EN style='font-size:7.5pt;font-family:Roboto;color:#999999'> <o:p></o:p></span></b></p><p class=MsoNormal><span class=content-byline-date-sep2><b><span lang=EN style='font-size:7.5pt;font-family:Roboto;color:#999999'>|</span></b></span><b><span lang=EN style='font-size:7.5pt;font-family:Roboto;color:#999999'> May 29, 2018 01:00 PM ET</span></b><o:p></o:p></p><p class=MsoNormal><o:p> </o:p></p><p 
class=MsoNormal><span lang=EN style='font-family:Roboto;color:black'>No software glitches or sensor breakdowns led to the fatal crash, just poor object recognition, emergency planning, system design, testing methodology, and human operation.</span><o:p></o:p></p><p class=MsoNormal><o:p> </o:p></p><p><span lang=EN style='font-size:7.5pt;font-family:Roboto;color:#292B2C'>On March 18, at 9:58 p.m., a self-driving Uber car <a href="https://www.theatlantic.com/technology/archive/2018/03/can-you-sue-a-robocar/556007/" target="_blank">killed</a> Elaine Herzberg. The vehicle was driving itself down an uncomplicated road in suburban Tempe, Arizona, when it hit her. Herzberg, who was walking across the mostly empty street, was the first pedestrian killed by an autonomous vehicle.<o:p></o:p></span></p><p><span lang=EN style='font-size:7.5pt;font-family:Roboto;color:#292B2C'>The <a href="https://www.ntsb.gov/investigations/AccidentReports/Reports/HWY18MH010-prelim.pdf" target="_blank">preliminary National Transportation Safety Board</a> report on the incident, released on Thursday, shows that Herzberg died because of a cascading series of errors, human and machine, which presents a damning portrait of Uber&#8217;s self-driving testing practices at the time.<o:p></o:p></span></p><p><span lang=EN style='font-size:7.5pt;font-family:Roboto;color:#292B2C'>Perhaps the worst part of the report is that Uber&#8217;s system functioned as designed. There were no software glitches or sensor malfunctions. 
It just didn&#8217;t work very well.<o:p></o:p></span></p><p><span lang=EN style='font-size:7.5pt;font-family:Roboto;color:#292B2C'>According to the report, the object-detection system misclassified Herzberg when its sensors first detected her &#8220;as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path.&#8221; That led the planning software to make poor predictions for her speed and direction, as well as its own speed and direction.<o:p></o:p></span></p><p><span lang=EN style='font-size:7.5pt;font-family:Roboto;color:#292B2C'>1.3 seconds before the impact, the self-driving computer realized that it needed to make an emergency-braking maneuver to avoid a collision. But it did not. Why? Uber&#8217;s software prevented its system from hitting the brakes if that action was expected to cause a deceleration of more than 6.5 meters per second squared. That is to say, in an emergency, the computer could not brake.<o:p></o:p></span></p><p><span lang=EN style='font-size:7.5pt;font-family:Roboto;color:#292B2C'>&#8220;According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior,&#8221; the report says.<o:p></o:p></span></p><p><span lang=EN style='font-size:7.5pt;font-family:Roboto;color:#292B2C'>Instead, the system relied on the driver to take control in an emergency, but &#8220;the system is not designed to alert the operator.&#8221;<o:p></o:p></span></p><p><span lang=EN style='font-size:7.5pt;font-family:Roboto;color:#292B2C'>The driver, for her part, took control of the car less than 1 second before the crash by grabbing the steering wheel. It wasn&#8217;t until after impact that she began braking.<o:p></o:p></span></p><p><span lang=EN style='font-size:7.5pt;font-family:Roboto;color:#292B2C'>In video footage of the interior of the car leading up to the crash, the driver is repeatedly seen looking down toward the center console of the car. 
Many commentators assumed that she was looking at a phone, but she told the NTSB investigators that &#8220;she had been monitoring the self-driving system interface.&#8221; In fact, the testing method requires operators to be &#8220;monitoring diagnostic messages that appear on an interface in the center stack of the vehicle dash and tagging events of interest for subsequent review.&#8221;<o:p></o:p></span></p><p><span lang=EN style='font-size:7.5pt;font-family:Roboto;color:#292B2C'>Other self-driving companies&#8217; testing protocols involve two people: one to drive and the other to monitor the system&#8217;s outputs and do the computer work. Uber itself did this too until late 2017, when the company decided that the second operator&#8217;s job could be done by looking at logs back at the office. &#8220;We decided to make this transition because after testing, we felt we could accomplish the task of the second person&#8212;annotating each intervention with information about what was happening around the car&#8212;by looking at our logs after the vehicle had returned to base, rather than in real time,&#8221; <a href="https://www.citylab.com/transportation/2018/03/former-uber-backup-driver-we-saw-this-coming/556427/" target="_blank">an Uber spokeswoman told CityLab</a> earlier this year.<o:p></o:p></span></p><p><span lang=EN style='font-size:7.5pt;font-family:Roboto;color:#292B2C'>It&#8217;s unclear what penalties Uber could face for this failure. The company has <a href="https://www.ft.com/content/1d7f174a-3362-11e8-b5bf-23cb17fd1498" target="_blank">already settled a court case</a> with Herzberg&#8217;s family. It has also scaled back its autonomous-testing efforts.<o:p></o:p></span></p><p><span lang=EN style='font-size:7.5pt;font-family:Roboto;color:#292B2C'>&#8220;Over the course of the last two months, we&#8217;ve worked closely with the NTSB. As their investigation continues, we&#8217;ve initiated our own safety review of our self-driving vehicles program,&#8221; Uber said in an emailed statement. 
“We’ve also brought on former NTSB Chair Christopher Hart to advise us on our overall safety culture, and we look forward to sharing more on the changes we’ll make in the coming weeks.” <o:p></o:p></span></p><p class=MsoNormal><o:p> </o:p></p><p class=MsoNormal><o:p> </o:p></p><p class=MsoNormal><o:p> </o:p></p><p class=MsoNormal><o:p> </o:p></p><p class=MsoNormal><o:p> </o:p></p><p class=MsoNormal><o:p> </o:p></p><p class=MsoNormal><o:p> </o:p></p></div></body></html>