It’s time to notice Tesla’s Autopilot death toll

Tesla is proving something other automakers dare not attempt: New technology + foolish drivers = death.

Tesla (TSLA) and its audacious CEO, Elon Musk, deserve credit for revolutionizing electric vehicles and changing the paradigm of the stodgy auto industry. But Musk has sped ahead recklessly on another technology: the self-driving feature known as Autopilot, which has alarmed safety experts and contributed to an unprecedented pileup of deadly crashes.

In the latest sensational Tesla crash, a Model S sedan flew off a road near Houston on April 17, hit a tree, exploded and burned for hours. Rescue crews found two bodies inside. One was in the passenger seat and one was in the back seat. “There was no one in the driver’s seat,” a police official told news outlets.

With nobody at the wheel, the owner may have been showing off by letting Autopilot maneuver the car. Musk said two days after the crash that data recovered "so far" shows Autopilot was not engaged. But police say there's no way a driver could have moved from the front seat to the back after the crash occurred. Barring a suicide mission, Autopilot seems to be the only logical explanation.


Since Tesla introduced Autopilot in 2015, at least 11 people have died in 9 U.S. crashes involving the system. Internationally, there have been at least 9 more deaths in 7 additional crashes. Virtually all automakers are developing technology similar to Autopilot, but there are no known deaths involving self-driving technology in any other make available to consumers. There was one death in a 2018 Arizona accident involving a Volvo vehicle that was part of an Uber test program.

Tesla head Elon Musk arrives to have a look at the construction site of the new Tesla Gigafactory near Berlin on September 03, 2020 near Gruenheide, Germany. (Photo by Maja Hitij/Getty Images)

The National Transportation Safety Board has implicated Autopilot in several deadly crashes and in others where nobody died. The NTSB reports sound a common theme: Drivers over-relied on a self-driving system that in some cases was flawed and in others was simply not as capable as the driver thought. Examples from some of those reports:

A 2016 crash in Williston, Fla., that killed the driver, who drove his Tesla Model S under a tractor-trailer: “The Tesla’s automated vehicle control system was not designed to, and did not, identify the truck crossing the car’s path or recognize the impending crash.”

A 2018 crash in Mountain View, Calif., in which a Model X hit a highway divider, killing the driver: “The probable cause of the crash was the Tesla Autopilot system steering the sport utility vehicle into a highway gore area due to system limitations, and the driver’s lack of response due to distraction likely from a cell phone game application and overreliance on the Autopilot partial driving automation system.”

A non-fatal 2018 crash in Culver City, Calif., in which a Model S rear-ended a firetruck parked in an HOV lane to tend to a collision on the other side of the freeway: “The probable cause was the Tesla driver’s lack of response to the stationary fire truck in his travel lane, due to inattention and overreliance on the vehicle’s advanced driver assistance system.”

A 2019 crash in Delray Beach, Fla., in which a Model 3 drove under a tractor-trailer, killing the driver: “The Autopilot system did not send a visual or audible warning to the driver to put his hands back on the steering wheel. The collision avoidance systems did not warn or initiate [auto-emergency braking] due to the system’s design limitations. The environment was outside the [operational design domain] of the Autopilot system, and Tesla does not limit Autopilot operation to the conditions for which it is designed.”

This photo provided by the Laguna Beach Police Department shows a Tesla sedan, left, in autopilot mode that crashed into a parked police cruiser Tuesday, May 29, 2018, in Laguna Beach, Calif. Police Sgt. Jim Cota says the officer was not in the cruiser at the time of the crash and that the Tesla driver suffered minor injuries. (Laguna Beach Police Department via AP)

On February 1, Robert Sumwalt, chairman of the NTSB, wrote to the Department of Transportation criticizing lax safety standards for self-driving systems, citing Tesla as a particular concern. The 2019 Florida fatality, Sumwalt said, might not have happened if the department had required Tesla to improve Autopilot’s safety protocols following the 2016 Florida fatality. “Tesla is testing on public roads a highly automated [autonomous vehicle] technology but with limited oversight or reporting requirements,” Sumwalt wrote. He warned this “poses a potential risk to motorists and other road users.” His memo mentioned the Uber incident in Arizona but didn’t refer to incidents involving any other automaker.

Tesla promotes Autopilot as if it’s the world’s most advanced self-driving system. It’s not. There are six levels of autonomous driving, from 0 to 5. Autopilot is a Level 2 system, which the Department of Transportation defines as “partial automation” that requires the driver to “remain engaged with the driving task and monitor the environment at all times.” This year, Tesla is rolling out what it calls a “full self-driving” subscription, available as a $10,000 software upgrade on most models. But the technology falls well short of Level 5 autonomy, or “full automation.”
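
For readers who want the taxonomy in concrete form, here is a minimal illustrative sketch of the six levels as a Python enum. The level names and one-line descriptions paraphrase the SAE J3016 standard; nothing here is an official API.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """The six SAE J3016 driving-automation levels (descriptions paraphrased)."""
    NO_AUTOMATION = 0           # human does all of the driving
    DRIVER_ASSISTANCE = 1       # one assist at a time, e.g. adaptive cruise control
    PARTIAL_AUTOMATION = 2      # steering + speed combined; driver must monitor at all times
    CONDITIONAL_AUTOMATION = 3  # system drives in limited conditions; driver takes over on request
    HIGH_AUTOMATION = 4         # no driver needed inside a defined operating domain
    FULL_AUTOMATION = 5         # no driver needed anywhere

# Per the Department of Transportation definition quoted above, Autopilot is
# a Level 2 system: the human remains responsible for the driving task.
AUTOPILOT = SAELevel.PARTIAL_AUTOMATION
assert AUTOPILOT < SAELevel.FULL_AUTOMATION
```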

Jason Levine, executive director of the nonprofit Center for Auto Safety, calls Autopilot “an intentionally deceptive name being used for a set of features that are essentially an advanced cruise control system.” He told Yahoo Finance that “Tesla’s marketing is leading consumers to foreseeably misuse the technology in a dangerous way. Users are led to believe the vehicle can navigate any roadway. The rising body count suggests Autopilot is not in fact a replacement for a driver.”

Interior of a Tesla Model X, a full-electric luxury crossover SUV with a large touch screen and carbon-look dashboard, on display at Brussels Expo on January 9, 2020 in Brussels, Belgium. (Photo by Sjoerd van der Wal/Getty Images)

On the day of the April 17 crash, Musk tweeted a link to Tesla safety data claiming that cars driven with Autopilot are safer than those driven without it. What Musk can’t demonstrate, however, is how much safer Autopilot would be if its safety protocols were stricter and the company promoted it less aggressively. General Motors’ Super Cruise system, for instance, uses a camera to ensure the driver’s eyes stay focused on the road while the system is on. An alert sounds if the driver looks down at a smartphone, falls asleep or gazes out the window, and the system will gradually disable itself and pull the car over if the driver doesn’t respond. Unlike Autopilot, which operates on any road, Super Cruise works only on roads GM has determined fit the system’s safety parameters. Most other automakers are building similar safeguards into their driver-assistance systems.
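
The escalation logic described above — watch the driver, warn on inattention, then disengage and stop the car if the warning is ignored — is essentially a small state machine. The sketch below is a generic, hypothetical illustration in Python, not GM’s implementation; the thresholds, function names and sensor interface are all assumptions.

```python
import time

# Hypothetical thresholds; a production system would tune and validate these carefully.
ALERT_AFTER_S = 4       # seconds of eyes-off-road before an audible alert
DISENGAGE_AFTER_S = 10  # continued inattention before the system gives up

def monitor_driver(eyes_on_road, alert, disengage_and_pull_over):
    """Generic attention-escalation loop (illustrative only).

    eyes_on_road: callable returning True while the driver-facing camera
        sees the driver watching the road (assumed sensor interface).
    alert, disengage_and_pull_over: callables for each escalation step.
    """
    inattentive_since = None
    while True:
        if eyes_on_road():
            inattentive_since = None              # attention restored; reset the clock
        else:
            now = time.monotonic()
            inattentive_since = inattentive_since or now
            elapsed = now - inattentive_since
            if elapsed >= DISENGAGE_AFTER_S:
                disengage_and_pull_over()         # last resort: stop the car safely
                return
            if elapsed >= ALERT_AFTER_S:
                alert()                           # first resort: warn the driver
        time.sleep(0.1)                           # poll the camera at roughly 10 Hz
```

The design point is the conservative fallback: when the system loses confidence in the human, it takes itself and the car out of the traffic stream rather than driving on.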

Nearly 40,000 people die in car crashes in the United States every year, so a few attributed to the overaggressive marketing department of one automaker may seem statistically insignificant. Musk is a stubborn libertarian who violated government shutdown orders during the 2020 coronavirus pandemic to keep Tesla’s Fremont, Calif., factory open, risking arrest. He may feel Tesla owners have the right to take the bait and abuse Autopilot and risk death, and there’s probably some fine-print legalese providing Tesla a measure of liability protection when they do.

Other automakers have learned that a cavalier Muskian attitude toward customer safety can wreck the company’s image and cause untold regulatory and legal problems, whether it’s from unintended acceleration, exploding airbags, faulty ignition switches or separating tires. Musk and Tesla have built the aura of a company that skirts conventional rules and gets away with it. So maybe there will be no class-action lawsuits or congressional hearings or "60 Minutes" exposés on unnecessary Autopilot deaths. Unless something changes, however, there will probably be more dead Tesla drivers misusing technology they don't understand.

Editor's note: This story was updated on April 20 to include new information about the April 17 crash in Texas.

Rick Newman is the author of four books, including "Rebounders: How Winners Pivot from Setback to Success.” Follow him on Twitter: @rickjnewman. You can also send confidential tips, and click here to get Rick’s stories by email.
