Tesla notched a significant win on Tuesday in the first U.S. trial over claims that its Autopilot driver-assistance system caused a death. The outcome matters because the carmaker is facing numerous other lawsuits and federal probes involving the same technology. It is Tesla's second major victory this year in which a jury has declined to find that the company's software was defective.

The verdict also bears on the more sophisticated Full Self-Driving (FSD) technology, which Tesla CEO Elon Musk has called essential to the company's future but which has drawn regulatory and legal scrutiny. The outcome of the civil case lends weight to Tesla's argument that responsibility ultimately rests with the driver when something goes wrong on the road.

The civil lawsuit, filed in Riverside County Superior Court, alleged that Autopilot caused owner Micah Lee's Model 3 to suddenly veer off a highway east of Los Angeles at 65 miles per hour (105 km per hour), strike a palm tree and burst into flames, all in the span of seconds. The 2019 crash killed Lee and seriously injured his two passengers, including a then-8-year-old boy who was disemboweled, court documents show. The trial involved gruesome testimony about the passengers' injuries, and the plaintiffs asked the jury for $400 million plus punitive damages.

Tesla denied liability, saying Lee had consumed alcohol before getting behind the wheel. The electric-vehicle maker also argued it was unclear whether Autopilot was engaged at the time of the crash.

The 12-member jury found that the vehicle did not have a manufacturing defect. The verdict came on the fourth day of deliberations, and the vote was 9-3. Jonathan Michaels, an attorney for the plaintiffs, expressed disappointment in the verdict but said in a statement that Tesla was "pushed to its limits" during the trial.
"The jury's prolonged deliberation suggests that the verdict still casts a shadow of uncertainty," he said.

Tesla said its cars are well designed and make the roads safer. "The jury's conclusion was the right one," the company said in a statement.

Tesla won an earlier trial in Los Angeles in April with a strategy of arguing that it tells drivers its technology requires human monitoring, despite the "Autopilot" and "Full Self-Driving" names. That case concerned an accident in which a Model S swerved into a curb and injured its driver; jurors told Reuters after the verdict that they believed Tesla had warned drivers about its system and that driver distraction was to blame.

Bryant Walker Smith, a University of South Carolina law professor, said the outcome in both cases shows "our juries are still really focused on the idea of a human in the driver's seat being where the buck stops."

At the same time, the Riverside case turned on unique steering issues, said Matthew Wansley, a former general counsel of automated-driving startup nuTonomy and an associate professor at Cardozo School of Law. In other lawsuits, plaintiffs have alleged that Autopilot is defectively designed, leading drivers to misuse the system. The jury in Riverside, however, was asked only to evaluate whether a manufacturing defect affected the steering. "If I were a juror, I would find this confusing," Wansley said.

Tesla shares closed up 1.76% after rising more than 2%.

During the Riverside trial, an attorney for the plaintiffs showed jurors a 2017 internal Tesla safety analysis that identified "incorrect steering command" as a defect involving an "excessive" steering wheel angle. A Tesla lawyer said the safety analysis did not identify a defect but was intended to help the company address any issue that could theoretically arise with the vehicle. The automaker subsequently engineered a system that prevents Autopilot from executing the kind of turn that caused the crash.
On the stand, Tesla engineer Eloy Rubio Blanco rejected a plaintiff lawyer's suggestion that the company named its driver-assistance feature "Full Self-Driving" because it wanted people to believe its systems were more capable than they really were. "Do I think our drivers think that our vehicles are autonomous? No," Rubio said, according to a trial transcript seen by Reuters.

Tesla is facing a criminal probe by the U.S. Department of Justice over claims that its vehicles can drive themselves. In addition, the National Highway Traffic Safety Administration has been investigating Autopilot's performance after identifying more than a dozen crashes in which Tesla vehicles hit stationary emergency vehicles.

Guidehouse Insights analyst Sam Abuelsamid said Tesla's disclaimers give the company powerful defenses in a civil case. "I think that anyone is going to have a hard time beating Tesla in court on a liability claim," he said. "This is something that needs to be addressed by regulators."

(With inputs from agency)
What's the point in buying self-driving Tesla? Musk's EV giant pins brunt of crash on driver
FP STAFF
• November 1, 2023, 1:43 PM IST
Written by Abhishek Awasthi