The US traffic safety authority has launched a new investigation into Tesla's “Autopilot” driver assistance system. The probe examines whether the “Autopilot” update rolled out in December is sufficient to address the agency's safety concerns. In a multi-year investigation, the National Highway Traffic Safety Administration (NHTSA) concluded that Autopilot made it too easy for drivers to hand control over to the system, even though they are required to monitor traffic at all times.
956 accidents from 2018 to 2023
NHTSA analyzed a total of 956 crashes from January 2018 to August 2023, 29 of which resulted in fatalities. In many cases, the accidents could have been avoided if drivers had been paying attention, the agency stressed in its report published on Friday. In 59 of the 109 crashes with sufficient data for such an analysis, the obstacle was visible at least five seconds before impact. As an example, NHTSA cited an accident in March 2023 in which a minor exiting a school bus was struck by a Model Y and seriously injured.
With the over-the-air update, carried out as an official recall campaign, Tesla introduced, among other things, additional warnings for drivers. The electric car maker points out that “Autopilot” does not turn a Tesla into an autonomous car and that the person behind the wheel must be prepared to take control at any time. The US accident investigation agency NTSB has warned that drivers rely too heavily on the technology.
Gaps in Tesla's vehicle data collection
NHTSA also noted in its report that there are gaps in Tesla's vehicle data collection that make it difficult to determine the actual number of Autopilot accidents. In most cases, the car manufacturer only receives accident data when the airbags or seat belt tensioners deploy. According to general accident statistics for 2021, this only occurs in 18 percent of all collisions reported to the police.
Furthermore, data transmission to Tesla requires that a mobile network is available and that the car's antenna is still functional after the accident. In many cases, electric cars burn out after a crash because their batteries catch fire.
Confusion over the word “autopilot”
NHTSA also criticized the system's name. The term “autopilot” could lead drivers to overestimate the software's capabilities and place too much trust in it. American drivers can currently try out an advanced version of “Autopilot” called “Full Self-Driving” (FSD). However, even FSD does not officially turn the car into an autonomous vehicle and requires constant human attention. Tesla recently added the word “supervised” in parentheses to the name. Company boss Elon Musk again promised self-driving Tesla cars this week; he plans to unveil a robotaxi in early August.
The standard “Autopilot” system can maintain speed, keep distance to the vehicle in front, and hold the lane. The FSD version is additionally meant to handle traffic lights, stop signs, and right-of-way rules at intersections, among other things.
According to the report, U.S. Senators Edward Markey and Richard Blumenthal have asked NHTSA to restrict the use of Autopilot to the roads for which the system was designed.