Tesla just threw its heaviest legal punch yet in the fight over autonomous driving liability. The company's lawyers filed an emergency motion Friday demanding a federal court throw out the $243 million wrongful death verdict that stunned the auto industry earlier this month, according to court documents obtained by The Verge.

The unprecedented judgment found Tesla's Autopilot software partially responsible for the death of Naibel Benavides, who was killed when a Model S driver plowed into her and her boyfriend, Dillon Angulo, in 2019. It marked the first time a jury held Tesla liable for an Autopilot-related fatality, sending shockwaves through an industry that's spent years deflecting responsibility for autonomous driving failures.

Now Tesla is firing back with everything it has, claiming the verdict "flies in the face of basic Florida tort law, the Due Process Clause, and common sense." The company's Gibson Dunn legal team argues that driver Stephen McGee bore sole responsibility for the crash, noting he pressed the accelerator to override Autopilot in the seconds before impact. "Auto manufacturers do not insure the world against harms caused by reckless drivers," the 43-page motion states.

The legal strategy reveals Tesla's deepest fears about autonomous driving liability. The company spent years successfully arguing that drivers remain responsible when crashes occur, even with Autopilot engaged. This verdict threatens to shatter that legal shield across the industry.

Tesla's counterattack targets two key evidence points that swayed the jury. First, the company claims CEO Elon Musk's public statements about vehicle autonomy should never have been admitted as evidence. Musk has repeatedly promised full self-driving capabilities that Tesla vehicles don't actually possess, creating what legal experts call a "marketing versus reality" gap that proved damaging in court.
The motion also disputes allegations that Tesla covered up crash data, calling claims about withholding camera footage from police investigators "false" and designed to "inflame" the jury. These data transparency issues have become a recurring theme in crash investigations, with regulators and families demanding fuller disclosure of Autopilot's decision-making process.

The stakes extend far beyond Tesla's balance sheet. Every major automaker is watching this case as it rolls out its own semi-autonomous systems. GM's Super Cruise, Ford's BlueCruise, and emerging players all face similar liability questions if courts start holding manufacturers responsible for AI-driven decisions.

Legal experts say the verdict could reshape how autonomous driving companies market their technology. "This fundamentally changes the calculus," says Professor Ryan Calo, who studies robotics law at the University of Washington. "Companies can no longer assume they'll be shielded from liability as long as a human is theoretically in control."

The financial implications are staggering. Tesla faces hundreds of similar Autopilot-related lawsuits, and a $243 million baseline could expose the company to billions in potential damages. The stock market has largely shrugged off the news so far, but legal observers expect that to change if appeals fail.

Tesla's motion requests either complete dismissal of the verdict or a new trial with stricter evidence rules. The company argues the original trial was tainted by inflammatory evidence and jury instructions that improperly assigned blame to the manufacturer rather than the driver.

This legal battle arrives at a crucial moment for Tesla's autonomous ambitions. The company is preparing to launch its robotaxi service and has promised fully autonomous vehicles by 2026. But regulatory scrutiny is intensifying, with NHTSA investigating dozens of Autopilot crashes and considering new safety requirements.