Tesla engineer testifies that 2016 video promoting self-driving was faked

Tesla faked a 2016 video promoting its self-driving technology, according to testimony by a senior engineer reviewed by Reuters.

The video, which shows a Tesla Model X driving on urban, suburban and highway streets; stopping itself at a red light; and accelerating at a green light is still on Tesla's website and carries the tagline: "The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself."

CEO Elon Musk used the video as evidence that Tesla "drives itself" by relying on its many built-in sensors and self-driving software. Yet according to Ashok Elluswamy, director of Autopilot software at Tesla, the video was staged using 3D mapping on a predetermined route, a feature that is not available to consumers.

In his July deposition, which was taken as evidence in a lawsuit against Tesla for a fatal 2018 crash involving former Apple engineer Walter Huang, Elluswamy said Musk wanted the Autopilot team to record "a demonstration of the system's capabilities."

Elluswamy's statement confirms and adds detail to what anonymous former employees told the New York Times in 2021. While there appeared to be no legal ramifications for Tesla following the NYT's investigation, on-the-record testimony from a current employee could cause trouble for the automaker, which is already beleaguered by lawsuits and investigations surrounding its Autopilot and Full Self-Driving (FSD) systems. (To be clear, neither system is actually self-driving. They are advanced driver-assistance systems that automate certain driving tasks, but as Tesla has made clear on its website, drivers should stay alert and keep their hands on the steering wheel when the systems are engaged.)

When electric truck maker Nikola was accused of, and eventually admitted to, faking a video of its fuel cell–powered Nikola One semitruck prototype -- Nikola had actually placed the truck atop a small hill, letting gravity, not the motor, do the work -- state and federal investigations were launched into both Nikola and its chairman and founder, Trevor Milton. Milton was found guilty of securities fraud in October.

Tesla's fake video was created using 3D mapping on a predetermined route from a house in Menlo Park, California, to Tesla's office in Palo Alto, according to Elluswamy. Drivers had to intervene to take control during test runs, and the scenes that were left on the cutting room floor included the test car crashing into a fence in Tesla's parking lot when trying to park itself without a driver.

“The intent of the video was not to accurately portray what was available for customers in 2016. It was to portray what was possible to build into the system,” Elluswamy said, according to a transcript of his testimony seen by Reuters.

Musk promoted the video at the time, tweeting that Tesla vehicles require "no human input at all" to drive through urban streets onto highways and eventually find a parking spot.

Neither Musk nor Tesla, which has disbanded its press office, responded in time to TechCrunch's request for comment.

The revelation comes at a time when Tesla is facing litigation for multiple fatal crashes involving its Autopilot system, as well as a criminal investigation from the U.S. Department of Justice for claims Tesla made about Autopilot. Just this week, a Tesla that had FSD engaged suddenly accelerated and crashed into a BC Ferries ramp in Canada, totaling the vehicle.

Regarding the 2018 crash that killed Huang, the National Transportation Safety Board concluded in 2020 that Tesla's "ineffective monitoring of driver engagement" had contributed to the crash, which the board said was likely caused by Huang's distraction and the limitations of the system.

While Tesla does tell its drivers to pay attention to the road, there are ways drivers can fool the system into believing they are paying attention, said Elluswamy. Some drivers even go so far as to buy Tesla counterweights on websites like Alibaba, which can be placed on the steering wheel to mimic the weight of human hands while the car is in motion.

Even amid regulatory scrutiny and reports of crashes, Tesla recently extended access to its FSD software to customers across North America.