Sunday, December 22, 2024

When you see the numbers, you’ll be shocked. Tesla’s Autopilot software has been involved in many more crashes than you might initially realize.

Tesla went all in on self-driving technology when it equipped the Model S with a full suite of features designed to operate without human intervention. Although Autopilot has always been advertised as a semi-autonomous system, it has been riddled with flaws. One of the most serious problems involves how the system handles stopped emergency vehicles. Unlike most other semi-autonomous systems, Tesla's relies on its cameras alone to perceive the road.

It might be time for Tesla to rethink its application of the Autopilot software.

How many crashes has the Autopilot system been involved in?

We’ve all seen the images and videos of Tesla electric vehicles burned to a crisp on the roadside after a crash. Collisions can be especially dangerous for electric vehicles, whose battery packs are prone to catching fire when damaged. Beyond the fire risk, the Autopilot software has been involved in more deaths and injuries than you might expect: since 2019, the NHTSA has linked the system to 736 crashes and 17 fatalities.

Is the system poorly named?

It’s hard to deny that the Tesla Autopilot system is improperly named. The name suggests you can set it and forget it, much like pilots do once they reach cruising altitude. Unfortunately, there’s far more traffic on the road than in the air, which makes the name misleading. Officially, this is an SAE Level 2 driver-assistance system, which means it’s semi-autonomous even when the full package of features is installed in the vehicle. A name we associate with the hands-off flying of airplanes, attached to a system that is only partially automated, is a poor fit.

Tesla creates a yo-yo effect with this technology

The number of crashes is staggering, but looking at when the fatalities took place reveals something interesting. Although the Full Self-Driving version of Autopilot was first offered in 2017, Tesla later pulled back and removed some of the functionality via over-the-air updates in response to the rising number of crashes at the time. From 2019 until June 2022, only three of the 17 deaths were linked to the technology; the remaining 14 occurred after that.

Why did these additional deaths take place? Tesla reintroduced the Full Self-Driving version of Autopilot in June 2022, and the number of vehicles equipped with the full software package rose from around 12,000 to nearly 400,000 in only a year. Drivers were more than happy to pay extra for the full suite of self-driving features, even though Tesla has never advertised the package as a Level 5 autonomous system. Once again, Tesla vehicles carry a poorly named package of features.

Are any other automakers involved in driver-assistance technology crashes?

The NHTSA has been investigating these crashes since 2021. Tesla has offered the most aggressive driver-assistance package for far longer than any other automaker, so it’s no surprise that Tesla vehicles running Autopilot account for more crashes than any other brand. Of the 807 crashes reported since 2021, second-place Subaru accounted for only 23, compared with Tesla’s extremely high total. Among the Tesla crashes, four of the fatalities involved a motorcycle and one involved an emergency vehicle.

Tesla is under investigation, but should they be?

Nearly 830,000 Tesla vehicles are part of the current NHTSA investigation. The Autopilot and Full Self-Driving packages are both under scrutiny, but at no point since they were offered has Tesla advertised them as fully autonomous features. In fact, like so many other automakers, Tesla has always told drivers to remain alert, stay aware, and keep their hands near the steering wheel so they can take over if the system fails. Does that put the responsibility for these crashes on the drivers behind the wheel, or does some culpability remain with an automaker that continues to use misleading names for its technology?

State laws universally require human drivers to remain responsible for the operation of their vehicles at all times. That settles the responsibility question, and it shows that even the advanced systems in Tesla’s Autopilot aren’t ready to deliver a fully autonomous driving experience.

