[AutonomousVehicles] Crash of Tesla using new Full Self Driving Mode Beta

Cornelius Butler corn at butlernewmedia.com
Tue Nov 16 09:09:05 UTC 2021


Hi Everyone,
There has been a crash involving a Tesla using the new Full Self Driving
Mode Beta. More info below with the article link and text.

Article Link:
https://www.theverge.com/2021/11/12/22778135/tesla-full-self-driving-beta-crash-fsd-california

Article Text:
Tesla vehicle in ‘Full Self-Driving’ beta mode ‘severely damaged’ after
crash in California

By Andrew J. Hawkins (@andyjayhawk), Nov 12, 2021, 10:09am EST

A Tesla Model Y in “Full Self-Driving” (FSD) beta mode allegedly crashed on
November 3rd in Brea, a city southeast of Los Angeles, marking what is
likely to be the first incident involving the company’s controversial
driver assist feature. No one was injured in the crash, but the vehicle was
reportedly “severely damaged.”

The crash was reported to the National Highway Traffic Safety
Administration, which has multiple, overlapping investigations into Tesla’s
Autopilot system. The incident report appears to have been made by the
owner of the Model Y. A spokesperson for NHTSA did not immediately respond
to a request for comment.

According to the report:

The Vehicle was in FSD Beta mode and while taking a left turn the car went
into the wrong lane and I was hit by another driver in the lane next to my
lane. the car gave an alert 1/2 way through the turn so I tried to turn the
wheel to avoid it from going into the wrong lane but the car by itself took
control and forced itself into the incorrect lane creating an unsafe
maneuver putting everyone involved at risk. car is severely damaged on the
driver side.

Tesla’s decision to test its “Full Self-Driving” driver assistance software
with untrained vehicle owners on public roads has attracted a massive
amount of scrutiny and criticism. Throughout the beta program, the company
has rolled out, and at times retracted, several software updates meant to
improve the system and address bugs.

There have been many video clips uploaded online showing Tesla owners using
FSD beta, with varying degrees of success. Some clips show the driver
assist system confidently handling complex driving scenarios, while others
depict the car drifting into the wrong lane or making other serious
mistakes.

Despite its name, FSD is not an autonomous driving system. Drivers are
required to stay vigilant, keeping their eyes on the road and their hands
on the steering wheel. Vehicles with partially automated driving systems
that still require human supervision are classified as Level 2 under the
Society of Automotive Engineers’ taxonomy. (Level 5 describes a vehicle
that can drive anywhere, under any conditions, without any human
supervision.)

The US government has taken a renewed interest in Tesla, recently
announcing an investigation into incidents in which Tesla vehicles
operating on Autopilot crashed into parked emergency vehicles.

NHTSA is also seeking more information from Tesla about the growing public
beta test of FSD, the recently launched “Safety Score” evaluation process
for entering the program, and the nondisclosure agreements that, until
recently, Tesla required participants to sign.

A spokesperson for Tesla did not respond to a request for comment, nor is a
response likely, since the company disbanded its press department in 2020.