Group Calls for Tesla Full Self-Driving Software Ban After Failed Safety Tests

The Dawn Project said the latest version of the software did not prevent a Tesla from striking a child-sized mannequin while illegally passing a stopped school bus.


Public safety advocacy group The Dawn Project is urging lawmakers to ban Tesla’s Full Self-Driving (FSD) software from public roads after a series of safety tests showed the technology failed to stop for school buses and struck child-sized mannequins in every trial.

The report, shared with members of Congress and federal regulators including the National Highway Traffic Safety Administration (NHTSA), highlights what The Dawn Project describes as “critical safety defects” in Tesla’s autonomous driving systems. The tests come amid Tesla’s rollout of its Robotaxi fleet in Austin, TX, an initiative that has already drawn scrutiny for incidents of vehicles driving on the wrong side of the road, blocking intersections and nearly colliding with other vehicles.

According to the report, The Dawn Project ran a live demonstration in Austin to replicate a 2022 North Carolina crash in which a child was struck by a Tesla operating in self-driving mode after it passed a stopped school bus. In all eight runs of the recreated test, a Tesla operating on the latest public version of FSD struck a child-sized mannequin while illegally passing a stopped school bus. The vehicle failed to disengage or alert the driver following the collisions.

“Self-driving software that illegally blows past stopped school buses and runs down children crossing the road must be banned immediately,” said Dan O’Dowd, founder of The Dawn Project. “It is only a matter of time before a child is killed while getting off a school bus because of Elon Musk and Tesla’s utter negligence and contempt for public safety.”

O’Dowd also criticized NHTSA for what he described as inaction on Tesla’s FSD issues. “The National Highway Traffic Safety Administration must step up and ban Tesla Full Self-Driving from public roads to protect children, pedestrians and other road users,” he said.

The Dawn Project maintains a public database of what it claims are thousands of safety-critical incidents involving Tesla’s autonomous software. The group’s new report includes footage and data from the Austin demonstration and urges immediate federal and legislative intervention.

NHTSA currently has multiple open investigations into Tesla’s self-driving features. As of the latest federal data cited in the report, there have been 2,185 crashes and 50 fatalities involving Tesla’s autonomous systems.

The Dawn Project’s report calls on lawmakers to halt the deployment of Tesla FSD until the company can prove the software is safe.

A full copy of the group’s report is available for public review.
