[AutonomousVehicles] Why the Feds Are Investigating Tesla's Autopilot and What that Means for the Future of Self-Driving Cars

Bill Meeker and Cheryl Orgas meekerorgas at ameritech.net
Tue Aug 24 14:24:55 UTC 2021


Colleagues,

 

Another article about Tesla.  They seem a long way from being ready for
blind-driver prime time, unless that means putting my corpse in the
vehicle's driver's seat and setting its controls for the family plot.

 

Bill Meeker

 

 

Why the Feds Are Investigating Tesla's Autopilot and What that Means for the
Future of Self-Driving Cars

Hayder Radha,
Professor of Electrical and Computer Engineering, Michigan State University,
 <https://theconversation.com/us> The Conversation

AUGUST 23, 2021 02:00 PM ET

 


Tesla's Autopilot enables hands-free driving, but it's not meant to allow
drivers to take their eyes off the road.


 

But in at least 11 cases in the past three and a half years, drivers using
Tesla's Autopilot advanced driver-assistance system did just that. This led to
<https://static.nhtsa.gov/odi/inv/2021/INOA-PE21020-1893.PDF> 11 accidents
in which Teslas crashed into emergency vehicles or other vehicles at those
scenes, resulting in 17 injuries and one death.

The National Highway Traffic Safety Administration has
<https://www.caranddriver.com/news/a37320725/nhtsa-investigating-tesla-autopilot-crashes-fatalities/>
launched an investigation into Tesla's Autopilot system in response to the
crashes. The incidents took place between January 2018 and July 2021 in
Arizona, California, Connecticut,
<https://www.nytimes.com/2021/08/17/business/tesla-autopilot-accident.html>
Florida, Indiana, Massachusetts, Michigan, North Carolina and Texas. The probe
<https://www.motortrend.com/news/nhtsa-tesla-autopilot-fsd-crash-investigation/>
covers 765,000 Tesla cars - that's virtually every car the company has
made in the last seven years. It's also
<https://www.theverge.com/2020/2/25/21152984/tesla-autopilot-safety-recommendations-ignored-ntsb-crash-hearing>
not the first time the federal government has investigated Tesla's Autopilot.

As a  <https://scholar.google.com/citations?user=GJaAw1EAAAAJ&hl=en>
researcher who studies autonomous vehicles, I believe the investigation will
put pressure on Tesla to reevaluate the technologies the company uses in
Autopilot and could influence the future of driver-assistance systems and
autonomous vehicles.

How Tesla's Autopilot works

 <https://www.tesla.com/support/autopilot> Tesla's Autopilot uses cameras,
radar and ultrasonic sensors to support two major features: Traffic-Aware
Cruise Control and Autosteer.

Traffic-Aware Cruise Control, also known as adaptive cruise control,
maintains a safe distance between the car and other vehicles that are
driving ahead of it. This technology primarily uses cameras in conjunction
with artificial intelligence algorithms to detect surrounding objects such
as vehicles, pedestrians and cyclists, and estimate their distances.
Autosteer uses cameras to detect clearly marked lines on the road to keep
the vehicle within its lane.
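The general idea behind adaptive cruise control can be sketched in a few lines of code. This is not Tesla's implementation (which is proprietary); the function name, gains and constant-time-gap policy below are illustrative assumptions only:

```python
# Toy sketch of adaptive cruise control: hold the driver's set speed when the
# road ahead is clear, otherwise keep a speed-dependent following distance.
# Names and gains are hypothetical, not Tesla's actual control law.

def acc_command(ego_speed, lead_distance, set_speed,
                time_gap=2.0, k_gap=0.5, k_speed=0.8):
    """Return a commanded acceleration in m/s^2.

    ego_speed:     our car's current speed (m/s)
    lead_distance: measured distance to the vehicle ahead (m), or None
    set_speed:     the driver's chosen cruise speed (m/s)
    """
    if lead_distance is None:
        # No vehicle ahead: plain cruise control toward the set speed.
        return k_speed * (set_speed - ego_speed)

    # Desired gap grows with speed (constant time gap, e.g. 2 seconds).
    desired_gap = time_gap * ego_speed
    gap_error = lead_distance - desired_gap

    # Close the gap error, but never accelerate past the set speed.
    accel = k_gap * gap_error
    if ego_speed >= set_speed and accel > 0:
        accel = 0.0
    return accel

# Following 40 m behind a car at 30 m/s (a 60 m gap is desired) -> brake.
print(acc_command(ego_speed=30.0, lead_distance=40.0, set_speed=30.0))
```

The hard part in a real system is not this control law but the perception feeding it: `lead_distance` must come from cameras (in Tesla's case) or radar/lidar, which is where the sensing choices discussed below matter.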

In addition to its Autopilot capabilities, Tesla has been offering what it
calls "full self-driving" features that include
<https://www.youtube.com/watch?v=KeQm0L5UicM> autopark and
<https://www.youtube.com/watch?v=m0hfOZqf-PA> auto lane change. Since its
first offering of the Autopilot system and other self-driving features,
Tesla has consistently warned users that these technologies require active
driver supervision and that these features do not make the vehicle
autonomous.

 
<https://images.theconversation.com/files/417113/original/file-20210819-27-13xp50i.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip>
Tesla's Autopilot display shows the driver where the car thinks it is in
relation to the road and other vehicles.
<https://flickr.com/photos/rosenfeldmedia/50511890906/> Rosenfeld
Media/Flickr,  <http://creativecommons.org/licenses/by/4.0/> CC BY

Tesla is beefing up the AI technology that underpins Autopilot. The company
announced on Aug. 19, 2021, that it is
<https://www.cnbc.com/2021/08/19/tesla-unveils-dojo-d1-chip-at-ai-day.html>
building a supercomputer using custom chips. The supercomputer will help
train Tesla's AI system to recognize objects seen in video feeds collected
by cameras in the company's cars.

 

Autopilot does not equal autonomous

 

Advanced driver-assistance systems have been supported on a wide range of
vehicles for many decades. The Society of Automotive Engineers divides the
degree of a vehicle's automation into
<https://www.sae.org/standards/content/j3016_201806/> six levels, starting
from Level 0, with no automated driving features, to Level 5, which
represents fully autonomous driving with no need for human intervention.

Within these six levels of autonomy, there is a clear and vivid divide
between Level 2 and Level 3. In principle, at Levels 0, 1 and 2, the vehicle
should be primarily controlled by a human driver, with some assistance from
driver-assistance systems. At Levels 3, 4 and 5, the vehicle's AI components
and related driver-assistance technologies are the primary controller of the
vehicle. For example, Waymo's
<https://theconversation.com/robot-take-the-wheel-waymo-has-launched-a-self-driving-taxi-service-147908>
self-driving taxis, which operate in the Phoenix area, are Level 4, which
means they operate without human drivers but only under certain weather and
traffic conditions.
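The taxonomy above, and the Level 2/Level 3 divide it hinges on, can be captured in a small table. The one-line descriptions are paraphrases of SAE J3016, not its exact wording:

```python
# SAE J3016 driving-automation levels, paraphrased (see sae.org for the
# authoritative definitions).
SAE_LEVELS = {
    0: "No automation: the human driver does everything",
    1: "Driver assistance: steering OR speed support, not both",
    2: "Partial automation: steering AND speed support; human must supervise",
    3: "Conditional automation: system drives, human takes over on request",
    4: "High automation: no human needed, but only in limited conditions",
    5: "Full automation: no human needed, anywhere",
}

def primary_controller(level):
    """The divide described above: human through Level 2, system from Level 3."""
    return "human driver" if level <= 2 else "automated system"

print(primary_controller(2))  # Tesla Autopilot is Level 2 -> human driver
print(primary_controller(4))  # Waymo's Phoenix taxis -> automated system
```

Framed this way, the discrepancy the investigation is probing is visible in one line: at Level 2 the function returns "human driver", no matter how autonomous the system feels to the person behind the wheel.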

 

Tesla Autopilot is considered a Level 2 system, and hence the primary
controller of the vehicle should be a human driver. This provides a partial
explanation for the incidents cited by the federal investigation. Though
Tesla says it expects drivers to be alert at all times when using the
Autopilot features, some drivers treat the Autopilot as having autonomous
driving capability with little or no need for human monitoring or
intervention. This discrepancy between Tesla's instructions and
<https://doi.org/10.1145/3409120.3410644> driver behavior seems to be a
factor in the incidents under investigation.

Another possible factor is how Tesla ensures that drivers are paying
attention. Earlier versions of Tesla's Autopilot
<https://www.wsj.com/articles/tesla-considered-adding-eye-tracking-and-steering-wheel-sensors-to-autopilot-system-1526302921?mod=e2tw>
were ineffective at monitoring driver attention and engagement while the
system was on. The company instead relied on requiring drivers to
periodically move the steering wheel, which can be done without watching the
road. Tesla recently announced that it has begun using
<https://www.cnbc.com/2021/05/28/tesla-starts-using-cabin-cameras-for-driver-monitoring.html>
internal cameras to monitor drivers' attention and alert drivers when they
are inattentive.

 

Another equally important factor contributing to Tesla's vehicle crashes is
the company's choice of sensor technologies. Tesla has consistently
<https://venturebeat.com/2021/07/03/tesla-ai-chief-explains-why-self-driving-cars-dont-need-lidar/>
avoided the use of lidar. In simple terms,
<https://www.autoweek.com/news/a36190274/what-lidar-is/> lidar is like radar
but with lasers instead of radio waves. It's capable of precisely detecting
objects and estimating their distances. Virtually all major companies
working on autonomous vehicles, including Waymo, Cruise, Volvo, Mercedes,
Ford and GM, use lidar as an essential technology for enabling automated
vehicles to perceive their environments.

By relying on cameras, Tesla's Autopilot is prone to potential failures
caused by challenging lighting conditions, such as glare and darkness. In
its announcement of the Tesla investigation, the NHTSA reported that most
incidents occurred after dark where there were flashing emergency vehicle
lights, flares or other lights. Lidar, in contrast, can operate under any
lighting conditions and can "see" in the dark.
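Lidar can "see" in the dark because its distance estimate doesn't depend on ambient light at all: it comes from the time of flight of the sensor's own laser pulse. A toy calculation shows the principle:

```python
# Lidar ranging in a nutshell: a laser pulse travels to the object and back,
# so distance = (speed of light * round-trip time) / 2.

C = 299_792_458.0  # speed of light in m/s

def lidar_distance(round_trip_seconds):
    """Distance to an object given the pulse's round-trip time."""
    return C * round_trip_seconds / 2.0

# A pulse returning after ~200 nanoseconds means the object is ~30 m away.
print(round(lidar_distance(200e-9), 1))
```

Because the measurement is an active pulse rather than a passive image, glare, darkness and flashing lights don't degrade it the way they can degrade a camera-based distance estimate.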

Fallout from the investigation

The preliminary evaluation will determine whether the NHTSA should proceed
with an engineering analysis, which could lead to a recall. The
investigation could eventually lead to changes in future versions of Tesla's
Autopilot and its other self-driving systems. The investigation might also
indirectly have a broader impact on the deployment of future autonomous
vehicles; in particular, it may reinforce the need for lidar.

Although reports in May 2021 indicated that
<https://www.bloomberg.com/news/articles/2021-05-24/tesla-testing-luminar-laser-sensor-musk-called-fool-s-errand>
Tesla was testing lidar sensors, it's not clear whether the company was
quietly considering the technology or using it to validate its existing
sensor systems. Tesla CEO Elon Musk called lidar "
<https://techcrunch.com/2019/04/22/anyone-relying-on-lidar-is-doomed-elon-musk-says/>
a fool's errand" in 2019, saying it's expensive and unnecessary.

However, just as Tesla is revisiting systems that monitor driver attention,
the NHTSA investigation could push the company to consider adding lidar or
similar technologies to future vehicles.

Hayder Radha is a professor of electrical and computer engineering at
Michigan State University.

This article is republished from
<https://theconversation.com/> The Conversation under a Creative Commons
license. Read the
<https://theconversation.com/why-the-feds-are-investigating-teslas-autopilot-and-what-that-means-for-the-future-of-self-driving-cars-166307>
original article.

 

 

 

 

 
