Humans, by nature, are flawed. Put them in a two-ton vehicle, and all sorts of problems can occur. They crash into fixed objects while driving straight. They back into fixed objects when putting the car in reverse. They hit trees and poles, buses and trucks. Simply put, humans lack the quick reaction times and fine motor skills needed to operate a car safely.
Except there’s one problem: all of the above happened to Tesla’s autopilot robotaxis. All in one city, and all in the span of less than a month. And according to Tesla’s own published figures, the company’s autopilot is four times worse than a human driver.
A recent “vehicle safety report” from Tesla shows its autonomous robotaxis got into five additional crashes in Austin, including one with a bus while the Tesla was stationary. Teslas backed into fixed objects twice, once at 2 mph and once at 1 mph. Another collided with a heavy truck at 4 mph, but perhaps most damning was a crash into a fixed object at 17 mph while driving straight.
This is most of what we know. Unlike its autonomous vehicle competitors Waymo and Zoox, and every other company in the market, Tesla fully redacts the details of its crashes from the public, thanks to a confidentiality provision under the NHTSA. Tesla even updated a report for a July 2025 crash, a 2 mph right turn into an SUV, to note that a victim was hospitalized, Electrek reported. The report initially listed “property damage only.”
Still, the disclosures about the crashes undercut what Tesla has put forward for years regarding its autopilot program: that it is safer than a human. In fact, after this month’s events, humans are four times safer, according to Tesla’s own metrics.
If we use the National Highway Traffic Safety Administration’s benchmarks instead, Tesla’s autopilot looks even worse. Here’s what else we know about Tesla’s latest autopilot safety record, or lack thereof.
The Model Y crash record
The five crashes occurred between Dec. 2025 and Jan. 2026, and all involved Model Y vehicles with autonomous driving systems engaged. That comprises more than a third of all Tesla robotaxi crashes that have occurred in Austin since the company expanded service to the city last June.
The fleet reached about 700,000 paid miles through November and is estimated to have surpassed 800,000 by mid-January. With 14 crashes since the service launched, that works out to about one crash every 57,000 miles.
Tesla’s Vehicle Safety Report states that the average American driver (of a Tesla) is involved in a minor collision every 229,000 miles, meaning that over the estimated 800,000 miles Tesla’s vehicles drove in Austin, the average human would have been involved in roughly four crashes, compared with the 14 involving Tesla robotaxis.
The NHTSA’s guidelines estimate that the average American crashes their car every 500,000 miles, or 1.6 times over the 800,000 miles driven by the company’s robotaxis in Austin. At 14 crashes, Tesla’s fleet crashed at roughly eight times the human rate.
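The rate comparisons above come down to simple division. As a minimal sketch, using only the mileage and crash figures reported here (the exact mileage is an estimate, so the ratios are approximate):

```python
# Figures reported in this article (estimates through mid-January).
robotaxi_miles = 800_000   # estimated paid miles driven in Austin
robotaxi_crashes = 14      # crashes since the Austin service launched

# One crash roughly every 57,000 miles.
miles_per_crash = robotaxi_miles / robotaxi_crashes
print(f"Robotaxi: one crash every {miles_per_crash:,.0f} miles")

# Tesla's own Vehicle Safety Report benchmark for human drivers.
tesla_human_miles_per_crash = 229_000
expected_human_crashes = robotaxi_miles / tesla_human_miles_per_crash  # ~3.5
ratio_tesla = robotaxi_crashes / expected_human_crashes                # ~4x
print(f"vs. Tesla's human benchmark: {ratio_tesla:.1f}x the crash rate")

# NHTSA's benchmark for the average American driver.
nhtsa_miles_per_crash = 500_000
expected_nhtsa_crashes = robotaxi_miles / nhtsa_miles_per_crash        # 1.6
ratio_nhtsa = robotaxi_crashes / expected_nhtsa_crashes                # ~8.8x
print(f"vs. the NHTSA benchmark: {ratio_nhtsa:.1f}x the crash rate")
```

Dividing 14 crashes by the 1.6 a typical driver would log over the same mileage gives 8.75, which the article rounds to eight.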
Tesla’s Car Troubles
Dan O’Dowd, founder of The Dawn Project, a software watchdog group, slammed Tesla for hiding the crash details, saying the company was “terrified of the public learning how defective its software is.” Companies are required to report all crashes involving their autonomous and advanced driver-assistance systems to the NHTSA within five days. But under the NHTSA’s confidentiality provisions, Tesla can merely report that the crashes occurred and withhold any narrative details, instead labeling them “confidential business information.”
Tesla has not responded to requests for comment.
In contrast, Waymo’s record, spanning more than 127 million miles of autonomous driving, tells a different story, one that does fit the narrative that autonomous driving is safer than human driving. The fleet reduced injury-causing crashes by 80% and serious-injury crashes by 91%. The company’s autonomous vehicles logged 6.34 million of those miles in Austin through September 2025.
Tesla’s overall driving record is much better. According to its Q3 2025 report, released in October, the company recorded one crash for every 6.36 million miles in cars with the autopilot technology fully engaged and a human riding shotgun. The company compared this with NHTSA statistics on cars that are not fully self-driving. The second quarter of last year was its best to date, with one crash for every 6.69 million miles for drivers using the autopilot technology. Drivers of Teslas without autopilot still crashed at a lower rate than the NHTSA figures, at about one crash for every 963,000 miles.
The car manufacturer has also come under fire for safety concerns surrounding its Cybertruck model, which has brought a slew of design flaws and safety malfunctions to the surface. A YouTuber, looking to prove how unsafe the Cybertruck’s front trunk (dubbed a frunk) is, stuck his finger in the gap and came away with a nasty bruise on his digit and a dent in the trunk. He also inserted a carrot and a banana, both of which the frunk sliced cleanly.
The Cybertruck boasts a $61,000 “armor glass” window that was quickly smashed by the company’s chief designer. The parents of a 19-year-old are suing the car manufacturer after their daughter died in a Cybertruck, alleging the car’s door handles made it impossible for her to escape alive. The NHTSA is also examining the “hidden” door handles on Tesla’s Model 3 sedans as a major safety concern. The Model Y is under investigation for its door handles as well, after the NHTSA received reports of people being trapped in cars due to low battery voltage. Although the vehicles have manual release handles inside, children trapped inside may not be able to reach them, a growing concern after several people reported being trapped in their cars as the vehicles were engulfed in flames. And just last year, a jury found Tesla partly responsible for a crash involving its Autopilot technology and ordered the company to pay more than $240 million in damages.
Elon Musk doesn’t keep his thoughts to himself when Tesla’s design flaws land in court. After the NHTSA announced a recall of the automaker’s autonomous vehicles in 2023, the billionaire founder took to X to complain that the word “recall” was “anachronistic and just flat wrong!”
https://fortune.com/2026/02/26/tesla-robotaxis-4x-8x-worse-than-humans-at-driving-safety-record-crashes/
Catherina Gioino