
Widow sues Tesla over crash death ‘linked’ to Autopilot system

The widow of a man who died after his Tesla veered off the road and crashed into a tree while he was using its partially automated driving system is suing the carmaker, claiming its marketing of the technology is dangerously misleading.

Representative image: The interior of a Tesla Model S is shown in Autopilot mode in San Francisco, California, U.S., April 7, 2016. (REUTERS)

The Autopilot system prevented Hans Von Ohain from keeping his Tesla Model 3 on a Colorado road in 2022, according to the lawsuit filed by Nora Bass in state court on May 3. Von Ohain died after the car hit a tree and burst into flames, but a passenger was able to escape, the suit says.


Von Ohain was intoxicated at the time of the crash, according to a Colorado State Patrol report.

The Associated Press sent an email to Tesla’s communications department seeking comment Friday.

Tesla offers two partially automated systems, Autopilot and a more sophisticated “Full Self Driving,” but the company says neither can drive itself, despite their names.


The lawsuit, which was also filed on behalf of the only child of Von Ohain and Bass, alleges that Tesla, facing financial pressures, released its Autopilot system before it was ready to be used in the real world. It also claims the company has had a “reckless disregard for consumer safety and truth,” citing a 2016 promotional video.

“By showcasing a Tesla vehicle navigating traffic without any hands on the steering wheel, Tesla irresponsibly misled consumers into believing that their vehicles possessed capabilities far beyond reality,” it said of the video.

Last month, Tesla paid an undisclosed amount of money to settle a separate lawsuit that made similar claims, brought by the family of a Silicon Valley engineer who died in a 2018 crash while using Autopilot. Walter Huang’s Model X veered out of its lane and began to accelerate before barreling into a concrete barrier located at an intersection on a busy highway in Mountain View, California.


Evidence indicated that Huang was playing a video game on his iPhone when he crashed into the barrier on March 23, 2018. But his family claimed Autopilot was promoted in a way that caused vehicle owners to believe they didn’t have to remain vigilant while they were behind the wheel.

U.S. auto safety regulators pressured Tesla into recalling more than 2 million vehicles in December to fix a defective system that’s supposed to make sure drivers pay attention when using Autopilot.


In a letter to Tesla posted on the agency’s website this week, U.S. National Highway Traffic Safety Administration investigators wrote that they could not find any difference between the warning software issued after the recall and the software that existed before it. The agency says Tesla has reported 20 more crashes involving Autopilot since the recall.
