Tesla just launched a new safety reporting hub for its Full Self-Driving technology, claiming FSD drivers go 5.1 million miles between major crashes versus 699,000 miles for average drivers. But safety experts are calling the data misleading, saying Tesla's history of deceptive reporting makes these numbers hard to trust despite the company's attempt at transparency.
Tesla thinks new data will solve its credibility problem. The company just dropped a dedicated safety hub for Full Self-Driving technology, complete with a live mile counter showing 6.47 billion FSD miles driven and impressive crash statistics that put human drivers to shame.
The numbers look compelling on paper. Tesla claims FSD drivers travel 5.1 million miles before a major collision and 1.5 million miles before minor ones - dramatically better than average US drivers who crash every 699,000 and 229,000 miles respectively. It's the kind of data that should silence critics and boost confidence in autonomous driving.
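For a sense of scale, here's a quick back-of-the-envelope calculation using only the figures Tesla cites; the mileage numbers come from the hub itself, and the rest is simple arithmetic showing how large the claimed gap is.

```python
# Tesla's claimed miles between collisions for FSD drivers (from the safety hub)
fsd_major, fsd_minor = 5_100_000, 1_500_000
# Tesla's cited averages for US drivers
us_major, us_minor = 699_000, 229_000

print(f"Major collisions: {fsd_major / us_major:.1f}x more miles per crash")
print(f"Minor collisions: {fsd_minor / us_minor:.1f}x more miles per crash")
# Roughly 7.3x and 6.6x -- the gap that researchers say the methodology inflates
```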
But safety researchers aren't buying it. "Yeah on the surface it looks like FSD is performing fairly well," Noah Goodall, a civil engineer who's published peer-reviewed Tesla studies, told The Verge. "But I put very little faith in these numbers because of Tesla's past deceptions."
The skepticism runs deeper than typical academic caution. Tesla's quarterly safety reports have been criticized for years for cherry-picking data and ignoring basic traffic statistics. The company used to lump together Autopilot highway miles - where crashes are naturally less common - with overall accident rates, creating misleading comparisons.
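To see why mixing road types skews a per-mile comparison, here is a minimal sketch with invented numbers (the crash rates below are hypothetical, chosen only to illustrate the effect researchers have flagged): if crashes are several times rarer per highway mile, a system driven mostly on highways will look safer than the all-roads average even if it adds nothing.

```python
# Hypothetical per-mile crash rates (illustrative only, not real data)
highway_rate = 1 / 1_500_000   # crashes per highway mile
city_rate    = 1 / 400_000     # crashes per city mile

# A driver-assist system used only on highways inherits the low highway rate
assist_rate = highway_rate

# The "average driver" baseline blends both road types (assume a 50/50 mile split)
avg_rate = 0.5 * highway_rate + 0.5 * city_rate

print(f"Assist miles per crash:  {1 / assist_rate:,.0f}")
print(f"Average miles per crash: {1 / avg_rate:,.0f}")
# The assist system looks ~2.4x safer purely because of where it's driven
```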
This new hub does address some concerns. For the first time, Tesla separates highway from non-highway miles, acknowledging that city driving carries higher crash risks. Philip Koopman, an autonomous vehicle expert at Carnegie Mellon, calls it "a good start" but tears apart the methodology on his Substack.
Koopman's critique gets to the heart of statistical manipulation. Tesla essentially compares brand-new vehicles packed with safety tech to the entire US fleet, including aging cars without modern features. It's like claiming a private school produces superior athletes by comparing its students to the general population, including elderly and disabled people.
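The same kind of sketch works for the fleet-age confound Koopman describes. The baseline for a comparable late-model car below is hypothetical, included only to show the mechanism: if newer vehicles with modern safety features already crash less, part of Tesla's claimed gap is explained by the cars, not the software.

```python
# Baseline miles per major crash -- the new-car figure is a made-up illustration
new_car_baseline = 1_200_000   # hypothetical late-model car with modern safety tech
fleet_average    = 699_000     # the US-fleet average Tesla cites

# Tesla's reported figure for FSD-equipped vehicles
fsd_reported = 5_100_000

naive_improvement    = fsd_reported / fleet_average      # what the hub implies
adjusted_improvement = fsd_reported / new_car_baseline   # against comparable cars

print(f"vs. entire US fleet:     {naive_improvement:.1f}x")
print(f"vs. comparable new car:  {adjusted_improvement:.1f}x")
# The apparent advantage shrinks once the comparison group matches the vehicles
```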
The data gaps remain glaring. The hub excludes injury and fatality information, with Tesla claiming privacy laws and inconsistent reporting make those numbers impossible to track. But Koopman points out the company could easily count incoming wrongful death lawsuits to estimate fatality rates.












