US automobile safety regulators have opened an investigation into Tesla vehicles equipped with its full self-driving technology over traffic-safety violations after a series of crashes.
The National Highway Traffic Safety Administration (NHTSA) said the electric carmaker’s self-driving assistance system, which requires drivers to pay attention and intervene if needed, had “induced vehicle behaviour that violated traffic safety laws”.
The preliminary evaluation by the NHTSA is the first step before potentially seeking a recall of the vehicles if it believes they pose a risk to safety.
The agency said the investigation covers 2.88m Tesla vehicles, after it received reports of cars driving through red traffic lights and travelling against the proper direction of travel during a lane change while using the system.
The NHTSA said it had six reports in which a Tesla vehicle, operating with full self-driving (FSD) engaged, “approached an intersection with a red traffic signal, continued to travel into the intersection against the red light and was subsequently involved in a crash with other motor vehicles in the intersection”.
The agency said four crashes had resulted in one or more injuries. Tesla did not immediately respond to a Reuters request for comment.
The NHTSA said it had identified 18 complaints and one media report alleging that Tesla vehicles, operating at an intersection with FSD engaged, “failed to remain stopped for the duration of a red traffic signal, failed to stop fully, or failed to accurately detect and display the correct traffic signal state in the vehicle interface”.
Some complainants also said FSD “did not provide warnings of the system’s intended behaviour as the vehicle was approaching a red traffic signal”.
Tesla’s FSD, which is more advanced than its Autopilot system, has been under investigation by the NHTSA for a year.
In October 2024, the agency began an inquiry into 2.4m Tesla vehicles equipped with FSD after four reported collisions in conditions of reduced visibility, such as sun glare, fog or airborne dust. One of these collisions, in 2023, was fatal.
Tesla’s website says FSD is “intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous.”
Reuters contributed to this report