Report: Tesla’s new Autopilot feature is dumber than human drivers
Judged against the skills of a typical human driver, a new version of
Tesla’s Navigate on Autopilot self-driving system is a big step back. That’s
the opinion of Consumer Reports, which road-tested a new Autopilot feature
allowing Tesla vehicles to make lane changes automatically, without the
participation of the driver.
Tesla says the feature makes for “a more seamless active guidance experience.”
Consumer Reports begs to differ. In a review posted Wednesday, the magazine
said it “observed the opposite” in its own tests of the feature, finding that
it “doesn’t work very well and could create potential safety risks for drivers.”
The feature aims to give drivers
one more bit of respite from managing their cars’ operations by allowing the
cars to initiate a lane change without requiring driver confirmation. But in
CR’s experience, “the feature cut off cars without leaving enough space and
even passed other cars in ways that violate state laws.… As a result, the
driver often had to prevent the system from making poor decisions.”
We’ve asked Tesla to comment on the
report, and will update this post if the company does so. Tesla did, however,
point CR to its April 3 blog post announcing the new feature, in which it
stated, “more than 9 million suggested lane changes have been successfully
executed with the feature in use.” Tesla drivers, the post said, have traveled
more than 66 million miles using the Navigate on Autopilot system, which aims
to guide the vehicle in highway driving “from on-ramp to off-ramp.” Tesla argues
that so much experience tends to “support the validation of Navigate on
Autopilot.”
Statistically speaking, however,
that’s a flawed argument. Tesla’s 66 million miles of experience, much less its
9 million lane changes, aren’t remotely enough to validate safety claims
measured against the record of human drivers. As the
Rand Corp. observed in 2016, American motorists collectively drive nearly
3 trillion miles per year, and fatalities are relatively rare: the 32,800
deaths annually on U.S. roads amount to only 1.09 per 100 million miles.
Establishing to a statistical
near-certainty that driverless cars would reduce vehicular fatalities by even
20% would require 5 billion miles of road testing — a record that would take a
fleet of 100 test vehicles 225 years to complete if they operated at an average
of 25 miles per hour, 24 hours a day and 365 days a year. In other words, a
sample of 66 million miles proves nothing.
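For readers who want to check the arithmetic, here is a rough sketch using the round
figures cited above (3 trillion miles, 32,800 deaths, a 100-car fleet at 25 mph, 5 billion
test miles). It is back-of-the-envelope only, not Rand’s statistical model, and it lands
within rounding distance of the 225-year figure.

import math  # not strictly needed; kept for clarity if you extend the sketch

ANNUAL_US_MILES = 3e12           # ~3 trillion vehicle miles traveled per year
ANNUAL_US_FATALITIES = 32_800    # annual U.S. road deaths cited above

# Fatality rate per 100 million miles driven
rate_per_100m_miles = ANNUAL_US_FATALITIES / (ANNUAL_US_MILES / 1e8)
print(f"Fatality rate: {rate_per_100m_miles:.2f} per 100 million miles")  # ~1.09

# Time for a 100-car fleet at 25 mph, around the clock, to log 5 billion miles
FLEET_SIZE = 100
AVG_SPEED_MPH = 25
fleet_miles_per_year = FLEET_SIZE * AVG_SPEED_MPH * 24 * 365  # ~21.9 million miles/year
years_needed = 5e9 / fleet_miles_per_year
print(f"Years to log 5 billion test miles: {years_needed:.0f}")  # ~228, near the 225 cited

# Tesla's reported 66 million Navigate on Autopilot miles, for scale
print(f"Tesla's miles as a share of 5 billion: {66e6 / 5e9:.1%}")  # ~1.3%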
CR’s experience is another data
point suggesting that the gap between today’s driver-assistance features, such
as cruise control with automatic braking and assisted lane changing, and fully
automated driving without driver participation may be much greater than
promoters of autonomous vehicles estimate.
There are signs that some Autopilot
users may vest the system with more control over their vehicles than is wise:
Autopilot was engaged during three fatal Tesla crashes, including on March 1, when
a driver was killed in a collision with a semitrailer. There are no indications
that either the driver or the Autopilot system took action to avoid the
trailer, according to the National Transportation Safety Board.
Tesla cautions that “drivers should
always be attentive when using Autopilot”; its standard system requires
drivers to confirm lane changes via the turn-signal stalk and, in some cases,
to keep their hands on the steering wheel.
Yet the company’s own terminology hints at a rather more
freewheeling approach. Its automated lane-change function can be set to four
levels — Disabled, Mild, Average or “Mad Max.”
The Mild setting permits lane changes only when the car is traveling
significantly slower than the cruise control setting; “Mad Max” allows them
when the car is just a bit slower than the set speed. That’s not necessarily
“Mad Max” conjures up the wild-eyed dune-buggy-riding speed demon of the
eponymous movie series — not a character one would wish to share a highway
with.
CR’s findings indicate that giving
Teslas the authority to make lane changes without driver participation — that
is, without confirmation via the turn-signal stalk — may be premature. Its
testers reported that their vehicles “often changed lanes in ways that a safe
human driver would not — cutting too closely in front of other cars, and
passing on the right.”
The magazine specifically
challenged Tesla’s assertion that its vehicles’ three rear-facing cameras could
detect fast-approaching objects from the rear better than the average driver.
In practice, the system had trouble detecting vehicles approaching from behind
at high speed: “Because of this, the system will often cut off a vehicle that
is going a much faster speed since it doesn’t seem to sense the oncoming car
until it’s relatively close.”
In several cases, CR says, the
Teslas passed cars on the right on a two-lane divided highway. In Connecticut,
where the testing took place, that’s illegal and would get the driver ticketed.
In short, while Tesla says its
Navigate system aims to make highway driving “more relaxing, enjoyable and
fun,” CR found that the lane-change function made the driving experience more
stressful.
“Tesla is showing what not to do on
the path toward self-driving cars,” said David Friedman, CR’s vice president of
advocacy: “release increasingly automated driving systems that aren’t vetted
properly.”