The National Highway Traffic Safety Administration (NHTSA) has produced its first annual report on crashes that occurred while driver-assistance software was engaged on vehicles from manufacturers such as Tesla, Honda and Toyota. Based on reports submitted by the manufacturers, the agency concluded that vehicles from the electric-car company founded and run by Elon Musk, with Autopilot active, were involved in 273 of the 392 accidents linked to advanced driver-assistance systems since last July.

Tesla cars running Level 2 ADAS (Advanced Driver Assistance Systems) therefore account for about 70% of all such accidents. The agency also said that five of the six deaths associated with these accidents – some of which occurred as early as 2019 but were only recently reported – involved Tesla vehicles. According to the data, Elon Musk’s models account for 60% of the accidents in which one or more people were seriously injured, and approximately 85% of those in which someone died.

This data calls into question the Level 2 ADAS that Tesla includes in its vehicles. However, while the figures may seem disturbing, they are not conclusive. The NHTSA report is based on information provided by the manufacturers themselves, and there are a number of factors to consider that may indicate Tesla Autopilot is in fact as safe as systems from other manufacturers.

The NHTSA report does not prove that Tesla’s Autopilot is worse than the ADAS of other vehicles.

For one thing, the NHTSA report does not indicate whether the Tesla crashes involved the standard Autopilot or FSD, a more advanced system currently in beta testing. Keep in mind that both systems are classified as Level 2 under the regulation, so it is likely that FSD and standard Autopilot crashes are combined in the report.

And why is this relevant? Accidents involving finished software available in every Tesla (such as the standard Autopilot option) are not the same as accidents involving Full Self-Driving, which is available only after prior approval by the brand and is still under development. The first is a finished system that should work as expected; the second is a feature in development that is not yet open to the general public.

Tesla is, in fact, the only manufacturer with such a system in public beta. Drivers must agree to a number of terms and conditions when enrolling in the program in order to access the beta, and they must keep their eyes on the road at all times in case the software makes a wrong decision.

It is also unclear whether the accidents were caused by the driving system itself making a wrong decision, or by an external factor – for example, something that happened on the road. The report also does not specify whether the driver could have avoided the crash.

What does all of this mean? Concluding that Tesla’s autonomous driving system is worse or more dangerous than others based solely on the total number of accidents it has been involved in is a mistake. There are many nuances and variables to take into account before making such a judgment, and the US agency’s report, unfortunately, does not provide a sufficient level of detail for such an analysis.

Source: Hiper Textual
