Tesla’s Full Self-Driving (FSD) is back under scrutiny. In November 2022, a traffic accident in San Francisco was caused by the unexpected behavior of a Tesla Model 3. Although only the drivers’ testimony was initially available, a recording of the event has recently surfaced, and it makes clear that the electric vehicle maker’s car was primarily at fault.

According to the Tesla Model 3’s driver, Full Self-Driving was activated at the time of the accident. The problem arose when the car came to a complete stop despite there being no obstacles ahead. Because this happened on an expressway, the vehicles behind could not stop in time, resulting in a pile-up that damaged eight vehicles, according to authorities. Fortunately, there were no fatalities, although several people were injured.

California authorities report that Tesla’s Full Self-Driving was running beta version 11, which operates at Level 2 autonomy. What does that mean? It is only partial automation: the driver could have intervened manually by pressing the accelerator, but did not.

Naturally, the driver’s claim that he was using Full Self-Driving drew plenty of criticism toward the company led by Elon Musk. Some local media outlets, of course, covered the accident and seized the opportunity to run alarmist headlines against Tesla’s FSD. Are they really justified?

Tesla comes under fire again

While the driver’s claim has not yet been fully verified, it cannot be ignored that Tesla’s Full Self-Driving has been controversial lately. The issue is not only that it has remained in beta for so many years, but also that anomalous behavior has been reported that endangers passengers.

The problem of braking for no reason is not new. Electrek reported that this phenomenon, commonly known as phantom braking, has affected some owners of Tesla cars with Autopilot (even those not testing FSD). Basically, Autopilot mistakenly detects an obstacle ahead and, anticipating a collision, activates emergency braking to avoid a “crash.”

At the end of 2021, reports of phantom braking in Tesla vehicles increased significantly, especially in the US. While some customers approached the automaker for solutions, the company limited itself to saying that these were software failures and could therefore be fixed through an update.

However, the video of the San Francisco crash shows that the problem has not been resolved, at least not entirely. It is also clear that when phantom braking occurs on a busy road, the risk of causing an accident is enormous.

California bans the ‘Full Self-Driving’ name


In December 2022, the State of California passed a law preventing Tesla, or any other automaker offering driver-assistance features, from using names like “Full Self-Driving” because they can be misleading. The reason is simple: no software today, not even Tesla’s, is reliable enough to merit that label. Any company that violates this rule will be engaging in “misleading advertising.”

Undoubtedly, the recording of the accident, combined with previous reports of phantom braking and the new California law, is adding to the pressure on Tesla. The company led by Elon Musk should step on the accelerator to improve FSD’s capabilities, starting with the most basic ones.

Source: Hiper Textual
