Jury finds Tesla partially liable in deadly Autopilot crash


A federal jury in Florida found Tesla partially liable for a 2019 crash involving its Autopilot system, delivering a major legal setback for the company. On Friday, Aug. 1, jurors ordered Tesla to pay part of the $329 million in damages to the family of a woman who died in the crash and to another person who survived.

The verdict marks one of the first times a jury has held Tesla legally responsible for a crash involving its Autopilot system.

Six years ago in Florida, a Tesla Model S driver using Autopilot crashed into a parked SUV. Investigators said the driver looked down to pick up his phone, took his eyes off the road and hit the vehicle, striking two people who were standing nearby. Naibel Benavides Leon, 22, was killed, and her boyfriend, Dillon Angulo, was injured. 

Claims against Tesla and the driver

Angulo and Benavides Leon’s family sued both the driver and Tesla, arguing the company allowed drivers to over-rely on its Autopilot technology. After a three-week trial in Miami, a federal jury found Tesla partially responsible for the 2019 crash, concluding that neither the driver nor the Autopilot software responded and braked in time. The driver was assigned two-thirds of the blame, while Tesla was held one-third liable.

Following the verdict, Tesla pushed back, calling the jury’s decision “wrong” and warning that it could hinder progress on vehicle safety and the development of life-saving technology. The company said it plans to appeal, citing “substantial errors of law and irregularities at trial.” 

Despite the jury assigning most of the blame to the driver, Tesla maintains he was “solely at fault” for the 2019 crash because he was speeding, had his foot on the accelerator (which overrides Autopilot), and was distracted while reaching for his phone.

“This was never about Autopilot,” Tesla said, calling the case “a fiction concocted by plaintiffs’ lawyers blaming the car when the driver — from day one — admitted and accepted responsibility.”

Because Tesla was found legally liable for its share of fault, the company is responsible for paying $243 million in damages. That amount includes both compensatory and punitive damages awarded to the victims and their families.

What Autopilot does — and what it doesn’t

During the trial, the Tesla driver described Autopilot as a “copilot” and said he relied on it to step in if he made a mistake. He testified that the system failed to alert him to the parked SUV or the people nearby and did not brake before the crash.

Tesla’s attorneys argued that the crash was caused entirely by the driver, not the vehicle or its software. In opening statements, attorney Joel Smith said the case wasn’t about Autopilot, but about a distracted and aggressive driver who was looking for his phone and failed to stop before the collision.

Tesla describes Autopilot on its website as a driver-assistance system meant to make driving safer and less stressful, but not fully autonomous. It includes features like Traffic-Aware Cruise Control, which keeps a set speed and distance from the car ahead, and Autosteer, which helps the vehicle stay in its lane.

While Autopilot comes standard on new Teslas, the company says drivers must keep their hands on the wheel and be ready to take control at any moment. Tesla warns that failing to stay alert can lead to serious consequences.

Federal investigators saw a pattern of misuse

The National Highway Traffic Safety Administration said Tesla’s Autopilot combines two features: adaptive cruise control, which keeps a set speed and distance from other cars, and Autosteer, which helps the vehicle stay in its lane. But while Autopilot can steer, brake and accelerate, it’s still a driver-assist tool rather than a self-driving system.

Federal investigators looked at 956 crashes involving Teslas through mid-2023. In about half of those, Autopilot either wasn’t in use or wasn’t a factor. However, patterns emerged in hundreds of other cases, like Teslas hitting objects or veering off roads when drivers were distracted or when Autopilot struggled on slick surfaces.

NHTSA found Tesla’s system gave drivers a false sense of confidence, without enough safeguards to make sure they stayed alert. That mismatch between what the system can actually do and what drivers think it can do led to crashes, including at least 13 fatal ones.

In late 2023, Tesla issued a recall for all vehicles equipped with Autopilot, acknowledging that its warnings and driver-monitoring controls weren’t strong enough to prevent misuse.

