Therese Poletti’s Tech Tales: It’s time for Elon Musk to start telling the truth about autonomous driving


An earlier version of this article referred to the National Transportation Safety Board when the source quoted meant to reference the National Highway Traffic Safety Administration. The article has been updated.

Tesla Chief Executive Officer Elon Musk has shown that he has an influential platform by roiling the cryptocurrency market with his tweets. It is time for him to use it for a more important purpose: telling the truth about autonomous driving.

Tesla Inc. TSLA has been charging customers up to $10,000 for “full self-driving” technology for nearly five years. The problem is that such technology does not exist. The company’s cars are equipped with an advanced driver-assistance system, or ADAS, known as Autopilot, which is free of charge. It’s similar to systems offered by other manufacturers, such as General Motors Co.’s GM Super Cruise.

In that time, Musk has provided overly optimistic timelines for full autonomy and exaggerated the current and near-term capabilities of Autopilot. Tesla’s ADAS lacks technology that most of the industry considers critical to safety, namely a driver-monitoring system. Musk predicted that a Tesla would be able to drive autonomously from Los Angeles to New York in 2018, a trip that still has not happened, and in 2019 he said all Teslas would be fully functioning robotaxis in the near future, which is unlikely, if not impossible.

Musk’s hyperbole is nothing new, and is not unique to him, but when it comes to autonomous driving, the consequences can be dire. Tesla fans have latched on to Musk’s words instead of the warnings in their owner’s manuals, and publicly performed dangerous stunts like sitting in the back seat of their cars as they operate on Autopilot. Recently, a Tesla owner who had posted videos of himself using Autopilot in an unsafe manner died in an automobile accident in California that is being investigated.

Tesla says its $10,000 computer for “self-driving capabilities” improves with software updates. The fine print on Tesla’s website, however, says its cars come with “features [that] require active driver supervision and do not make the vehicle autonomous.” The disclaimer says further that “[t]he activation and use of these features are dependent on achieving reliability far in excess of human drivers as demonstrated by billions of miles of experience, as well as regulatory approval, which may take longer in some jurisdictions.”

State and federal reviews

The pattern of driver misuse, collisions in which Autopilot may have been involved, and Musk’s blasé approach to autonomy has led to renewed scrutiny. Last week, the Los Angeles Times reported that California’s Department of Motor Vehicles is reviewing Tesla to determine whether the company misleads consumers by advertising that its cars have full self-driving capability. While playing up its potential for Level 5, or full autonomy, Tesla has told the California DMV privately that its system is only Level 2, or ADAS. Tesla has already been called out for the practice in Germany, where a court ruled that the company had misled consumers about autonomy and banned it from using certain language.

A DMV spokeswoman confirmed that the matter is under review in California. While she said the DMV does not comment on items under review, she did provide a general statement: “The regulation prohibits a company from advertising vehicles for sale or lease as autonomous unless the vehicle meets the statutory and regulatory definition of an autonomous vehicle and the company holds a deployment permit.”

There are also federal investigations. The National Highway Traffic Safety Administration, or NHTSA, has launched 34 investigations related to advanced driver assistance systems, and 28 of those involve Tesla vehicles, according to an NHTSA spokeswoman. Recent Tesla collisions in Texas, California and China have only increased the scrutiny.

Tesla has even drawn attention in the halls of Congress, with three senators recently introducing legislation to require driver-monitoring systems in cars with ADAS. These systems ensure that the driver is in the driver’s seat and paying attention to the road, and they are a feature of most cars with robust ADAS — but not Teslas.

While fans post videos of dangerous maneuvers on social media — such as a man in California’s Bay Area who was arrested for riding in the back seat, then got out of jail, purchased a new Tesla and did it again for a news crew — Tesla is also now a party to at least 16 lawsuits filed across the U.S. in the past two years in which plaintiffs allege malfunctions or problems with Tesla Autopilot, according to data gathered by PlainSite.

None of the cases seek class-action status; all have been filed by individuals who contend the software malfunctioned dangerously. Tesla has not outlined the suits in its regulatory filings.

Vicki Bryan, CEO of the Bond Angle, says she expects heightened regulatory scrutiny of Tesla and its claims under the new Biden administration. She noted that under the Trump administration, the NHTSA was practically gutted, resulting in toothless enforcement.

“We have a new sheriff in town,” Bryan said. “California is one thing, and California will be working in concert with enhanced federal scrutiny.”

The ‘Autonowashing’ problem

Despite the growing concern, Musk continues to oversell the capabilities and near-term potential of Autopilot and “full self-driving.” Earlier this year he said his company’s outsize valuation — it’s by far the most valuable auto maker in the world — would be validated by the technology. The rest of the industry used to make similar claims about the path to Level 5 autonomy — the point at which cars will be able to drive anywhere by themselves, with nobody in the driver’s seat — but other car executives have shifted from sci-fi visions of passenger cars driving themselves to pick up riders to talking about what the technology can actually do today.

For example, Nvidia Corp. NVDA CEO Jensen Huang predicted in early 2017 that his company and Audi would be selling Level 4 cars by 2020, which has not happened. Two years later at the same trade show, Nvidia showed how much had changed by focusing instead on ADAS, part of a noticeable trend in those years of tech and auto executives moving from talk of robotaxis and full autonomy in passenger cars to Level 5 autonomy for dedicated robotaxi services and ADAS in cars sold to consumers.

For more: Tech finds a middle lane for autonomous cars

That was just part of a reckoning for overselling — and overbuying, in some cases — the path to autonomous driving. The autonomous-vehicle industry itself has been consolidating, with both Lyft Inc. LYFT and Uber Technologies Inc. UBER unloading their self-driving businesses for a fraction of what they had spent to develop them. In April, John Krafcik, the CEO of Waymo, Alphabet Inc.’s GOOGL GOOG self-driving business, stepped down. Waymo has long been seen as the leader in self-driving, with the most hours logged by its research vehicles, but it is still far from becoming a commercial business.

Meanwhile, Musk, who recently and inexplicably changed his title at Tesla to “Technoking,” has continued to make outlandish claims about the self-driving capabilities of Tesla’s electric vehicles, insisting that full autonomy is just around the corner. In January, Musk was asked on the company’s earnings call why he was confident that Teslas would achieve full Level 5 autonomy in 2021, after he claimed at the company’s 2019 “Autonomy Day” that Tesla cars would be fully functioning robotaxis by the end of 2020.

See also: Autonomy Day shows that Elon Musk is just another car salesman

“We need to probably do a little bit more work to prove that Tesla Autopilot is capable of full self-driving, which I think will become obvious later this year,” he said.

Musk has said this year that “it wouldn’t be very difficult” to turn Tesla cars into an autonomous ride-hailing network to rival Uber and Lyft.

“I mean, if you’re an Uber or Lyft driver you can be managing a fleet of 10 cars. It sort of seems like a shepherd tending the flock type of thing,” Musk said in last October’s earnings call, when asked about giving customers the option of making money from their Teslas. “So I think that sort of, we could do that. It wouldn’t be very difficult, but we’re going to just be focused on just having an autonomous network that has sort of elements of Uber, Lyft and Airbnb.”

Liza Dixon, a doctoral candidate who is researching human-machine interaction in automated driving in Germany, has coined a term for this type of talk about autonomous capabilities: Autonowashing, the practice of making something appear more autonomous than it actually is.

“Claims overstating the capabilities of Autopilot/FSD have been an ongoing problem for some time now,” Dixon said in an email. She noted that there is a bias toward reporting Tesla crashes, because they drive so much web traffic and social-media discussion, but “this is not without reason, however, as Tesla continues autonowashing their driver-assistance systems.”

Dixon added that Tesla Autopilot and “full self-driving” are both classified as Level 2, partial automation, according to the Society of Automotive Engineers, because they require constant human supervision and may demand immediate intervention without warning.

“They are a suite of comfort features — which assist and support — but in no way replace the driver,” Dixon said. “Therefore, to say that any Level 2 system is ‘a better driver than a human’ or that it is ‘autonomous’ or ‘self-driving’ is simply false.”

How Musk and Tesla can improve

It is not too late for Musk to put the autonowashing in reverse and start acting responsibly, and Tesla may already be doing so: Bloomberg News reported Monday that the company is testing LIDAR in its cars, after Musk previously said the sensor was unnecessary for self-driving even though most of the industry considers it a necessary safety measure for sensing activity on the road. Musk’s bullhorn of a Twitter account would be a good place to start, if he can stop using it to make the prices of bitcoin BTCUSD and dogecoin DOGEUSD yo-yo.

Here are the specific things Musk and Tesla could do:

  • Call out misuse of the system when it occurs. Videos of people misusing Tesla’s Autopilot and FSD systems tend to go viral, and the practice could be curbed if Musk publicly called out the people doing it and asked them to stop, or if Tesla permanently revoked their access to Autopilot and FSD features.

  • Add a driver-monitoring system. In-cabin cameras will be a safety-critical component of all semiautonomous cars, because such systems require that drivers stay alert and remain ready to take over operation of the vehicle. Tesla has avoided installing this hardware to cut costs, even though governments world-wide may soon require it, which could lead to costly retrofits.

  • Stop calling the system “full self-driving” while it is not. It was bad enough that the ADAS system was labeled “Autopilot,” which gives the impression that drivers can turn it on and stop paying attention. But calling extra ADAS features “full self-driving” is needlessly confusing for a system that, as Tesla has admitted to the California DMV, offers only Level 2 autonomy out of five levels.

  • Stop proclaiming that Level 5 is around the corner and acknowledge publicly that Tesla is not close to full Level 5 autonomy and may never get there.

Tesla has disbanded its public-relations team, leaving these decisions to Musk and Musk alone. An email sent to Tesla seeking comment for this column did not receive a response.

“Tesla is its own worst enemy,” said Bryan, the bond analyst who focuses on high-yield distressed companies. “It should be a national treasure. … But half the battle is because it is badly managed. We have a bad autocratic manager, who nobody says no to.”