The investigation covers 765,000 vehicles, almost everything
that Tesla has sold in the US since the start of the 2014 model year. In the
crashes identified by the National Highway Traffic Safety Administration as
part of the probe, 17 people were injured and one was killed.
NHTSA says it has identified 11 crashes since 2018 in which
Teslas on Autopilot or Traffic-Aware Cruise Control have hit vehicles at scenes
where first responders have used flashing lights, flares, an illuminated arrow
board or cones warning of hazards. The agency announced the action Monday in a
posting on its website.
The probe is another sign that NHTSA under President Joe
Biden is taking a tougher stance on automated vehicle safety than under
previous administrations. The agency had previously been reluctant to regulate
the new technology for fear of hampering adoption of the potentially
life-saving systems.
The investigation covers Tesla's entire current model
lineup: the Models Y, X, S and 3 from the 2014 through 2021 model years.
The National Transportation Safety Board, which also has investigated some of the Tesla crashes dating to 2016, has recommended that NHTSA and Tesla limit Autopilot's use to areas where it can safely operate. The NTSB also recommended that NHTSA require Tesla to have a better system to make sure drivers are paying attention. NHTSA has not taken action on any of the recommendations. The NTSB has no enforcement powers and can only make recommendations to other federal agencies.
“Today's action by NHTSA is a positive step forward for safety,” NTSB Chair Jennifer L. Homendy said in a statement Monday. “As we navigate the emerging world of advanced driving assistance systems, it's important that NHTSA has insight into what these vehicles can, and cannot, do.”
Last year the NTSB blamed Tesla, drivers and lax regulation
by NHTSA for two collisions in which Teslas crashed beneath crossing
tractor-trailers. The NTSB took the unusual step of accusing NHTSA of
contributing to the crashes by failing to make sure automakers put safeguards
in place to limit use of electronic driving systems.
The agency made the determinations after investigating a
2019 crash in Delray Beach, Florida, in which the 50-year-old driver of a Tesla
Model 3 was killed. The car was driving on Autopilot when neither the driver
nor the Autopilot system braked or tried to avoid a tractor-trailer crossing in
its path.
“We are glad to see NHTSA finally acknowledge our
longstanding call to investigate Tesla for putting technology on the road that will
be foreseeably misused in a way that is leading to crashes, injuries, and
deaths,” said Jason Levine, executive director of the nonprofit Center for Auto
Safety, an advocacy group. “If anything, this probe needs to go far beyond
crashes involving first responder vehicles because the danger is to all
drivers, passengers, and pedestrians when Autopilot is engaged.”
Autopilot has frequently been misused by Tesla drivers, who
have been caught driving drunk or even riding in the back seat while a car
rolled down a California highway.
A message was left seeking comment from Tesla, which has
disbanded its media relations office. Shares of Tesla, based in Palo Alto,
California, fell 4.3 percent Monday.
NHTSA has sent investigative teams to 31 crashes involving
partially automated driver-assist systems since June 2016. Such systems can
keep a vehicle centered in its lane and a safe distance from vehicles in front
of it. Of those crashes, 25 involved Tesla's Autopilot, and 10 deaths were
reported in them, according to data released by the agency.
Tesla and other manufacturers warn that drivers using the
systems must be ready to intervene at all times. In addition to crossing semis,
Teslas using Autopilot have crashed into stopped emergency vehicles and a
roadway barrier.
The probe by NHTSA is long overdue, said Raj Rajkumar, an
electrical and computer engineering professor at Carnegie Mellon University who
studies automated vehicles.
Tesla's failure to effectively monitor drivers to make sure
they're paying attention should be the top priority in the probe, Rajkumar
said. Teslas detect pressure on the steering wheel to make sure drivers are
engaged, but drivers often fool the system.
“It's very easy to bypass the steering pressure thing,”
Rajkumar said. “It's been going on since 2014. We have been discussing this for
a long time now.”
The crashes into emergency vehicles cited by NHTSA began on
January 22, 2018, in Culver City, California, near Los Angeles, when a Tesla
using Autopilot struck a parked firetruck that was partially in the travel
lanes with its lights flashing. Crews were handling another crash at the time.
Since then, the agency said there were crashes in Laguna
Beach, California; Norwalk, Connecticut; Cloverdale, Indiana; West Bridgewater,
Massachusetts; Cochise County, Arizona; Charlotte, North Carolina; Montgomery
County, Texas; Lansing, Michigan; and Miami, Florida.
“The investigation will assess the technologies and methods
used to monitor, assist and enforce the driver's engagement with the dynamic
driving task during Autopilot operation,” NHTSA said in its investigation
documents.
In addition, the probe will cover object and event detection
by the system, as well as where it is allowed to operate. NHTSA says it will
examine “contributing circumstances” to the crashes, as well as similar
crashes.
An investigation could lead to a recall or other enforcement
action by NHTSA.
“NHTSA reminds the public that no commercially available
motor vehicles today are capable of driving themselves,” the agency said in a
statement. “Every available vehicle requires a human driver to be in control at
all times, and all state laws hold human drivers responsible for operation of
their vehicles.”
The agency said it has “robust enforcement tools” to protect
the public and investigate potential safety issues, and it will act when it
finds evidence “of noncompliance or an unreasonable risk to safety.”
In June, NHTSA ordered all automakers to report any crashes
involving fully autonomous vehicles or partially automated driver-assist
systems.
Tesla uses a camera-based system, a lot of computing power,
and sometimes radar to spot obstacles, determine what they are, and then decide
what the vehicle should do. But Carnegie Mellon's Rajkumar said the company's
radar was plagued by “false positive” signals and would stop cars after
determining overpasses were obstacles.
Now Tesla has eliminated radar in favor of cameras and a
neural network trained on thousands of images to determine whether objects are
in the way. The system, he said, does a very good job on most
objects that would be seen in the real world. But it has had trouble with
parked emergency vehicles and perpendicular trucks in its path.
“It can only find patterns that it has been quote-unquote
trained on,” Rajkumar said. “Clearly the inputs that the neural network was
trained on just do not contain enough images. They're only as good as the
inputs and training. Almost by definition, the training will never be good
enough.”
Tesla also is allowing selected owners to test what it calls
a “full self-driving” system. Rajkumar said that should be investigated as
well.