Tesla knew Autopilot weakness killed a driver – and didn't fix it, engineers claim

Software's alleged inability to handle cross traffic central to court battle after two road deaths

Tesla's Autopilot engineers have claimed the automaker's leadership not only knew the software was unable to detect and respond to cross traffic, but also did nothing to fix it.

These allegations came to light this week in a civil lawsuit brought against Tesla over a crash that killed 50-year-old father Jeremy Banner in 2019. He died when his Tesla Model 3 smashed into a tractor-trailer in cross traffic; Banner had activated Autopilot 10 seconds before the collision. Neither the driving-assistance software nor Banner, it would seem, saw and reacted to the other vehicle in time.

If his fate sounds familiar, that's because it bears remarkable similarity to a 2016 accident that killed Joshua Brown, whose Tesla Model S, with Autopilot activated, failed to notice an 18-wheeler tractor-trailer crossing a highway. As happened years later in Banner's case, Brown's Tesla passed under the trailer, ripping the top off the vehicle and killing the 40-year-old.

Banner's family, who sued Tesla shortly after his death, alleged in a court filing [PDF] last week that the automaker knew about Autopilot's inability to handle cross traffic after Brown's death, and failed to do anything to fix it, leading to Banner's fatal crash. It was argued Tesla should have learned from that 2016 tragedy, and either improved Autopilot to safely handle cross traffic or made it disengage in those situations, which might have saved Banner's life.

Specifically, Tesla was aware Autopilot "was not fully tested for safety and was not designed to be used on roadways with cross-traffic or intersecting vehicles ... Nevertheless, Tesla programmed Autopilot to allow it to be used on roadways that Tesla knew were not suitable for its use and knew would result in fatal accidents," the Banner family alleged.

The engineers speak - under oath

Testimony from two Autopilot engineers - Chris Payne and Nicklas Gustafsson - is critical to the Banner family's case. Both were deposed in 2021, and their statements were included in a motion by the family to amend their earlier complaint to add a claim for punitive damages.

According to Gustafsson's statements in the above filing, Autopilot was released without the ability to detect cross traffic - an omission he had difficulty justifying when asked why Tesla had decided to leave it out.

Payne also said, according to the filing, Autopilot was only designed to be used on highways with center dividers because "it was technically a 'very hard thing' for the hardware and software to account for cross traffic." 

We're told that Payne explained Autopilot is designed to detect the presence of a central divider, and is capable of deactivating itself if no center divider is detected. Nonetheless, "you can engage and operate Autopilot if there is not a center divider and it will continue to operate," Payne said in his deposition.

Gustafsson said much the same as Payne during his deposition, namely that Autopilot wasn't designed to respond to cross traffic. He also said he investigated Brown's 2016 death as part of his job, and alleged that, despite Tesla therefore knowing about the issue, "no changes were made to Autopilot's systems to account for cross traffic."

Tesla didn't respond to questions from The Register.

More court cases, more problems

At the heart of the Banner case is the allegation that Tesla misrepresented the capabilities of Autopilot, which it "knew to be defective based on a prior fatal accident," and still made an "intentional decision to continue profiting billions from the sales of their defective vehicles." 

That line could end up being a headache for Tesla, which is being investigated by the US National Highway Traffic Safety Administration (NHTSA) over various accidents the regulator suspects may have been caused by drivers being misled by Tesla's marketing hype. That is to say, drivers may have been given the impression Autopilot – which is a super-cruise-control feature rather than a fully autonomous driving system – was more capable than it actually was, leading motorists to put themselves in dangerous situations.

Additionally, the US Department of Justice is probing the gap between Autopilot's hype and its safety record.

The NHTSA has already made Tesla issue one Autopilot recall and software patch, and allegations Tesla was aware of Autopilot's shortcomings but did nothing - from the biz's own techies, no less - will likely be of interest to investigators. In a separate case earlier this year, it was claimed Tesla staged a self-driving demo.

Whether lawyers for Banner's family will be more successful in their fight than another Tesla owner who sued the company over a non-fatal Autopilot accident will start to become clear in October, when the case is scheduled to go to trial before a jury. ®
