The Tesla Project: Driver-Assist Software for Autonomous Vehicles and a Safety Concern about the Science of Autonomy
The current system of self-driving requires a human driver to be prepared to take control at any moment because it makes judgment errors. According to the National Highway Traffic Safety Administration, the risk to motor vehicle safety results from insufficient adherence to traffic safety laws, and the agency warned that FSD could violate traffic laws at some intersections “before some drivers may intervene.”
Two lanes of traffic on I-80 were shut down for 90 minutes after a crash snarled traffic on Thanksgiving as many people traveled to holiday events. Four ambulances were called to the scene.
But other safety experts have questioned the validity of Tesla’s safety claims. There have been high-profile accidents involving cars using FSD or the more basic Autopilot, and some of those accidents were fatal.
Tesla confirmed in a public filing released Monday that the US Department of Justice has requested documents concerning the company’s controversial driver-assist software systems, which Tesla calls Autopilot and “Full Self-Driving.”
NHTSA is investigating Autopilot as well. That technology combines lane-keeping assist with adaptive cruise control to keep a car in its lane on a highway, as opposed to the promise of self-driving cars, which Musk wants to one day be able to operate a vehicle on city streets.
It has been controversial. The National Transportation Safety Board previously found that the technology was partially to blame in a fatal crash.
Tesla claims that Autopilot is safer than ordinary driving, but autonomous vehicle experts say the data chosen by Tesla to support its safety claims compares apples and oranges, and isn’t the best measure of the safety of the systems.
Despite the name, “Full Self-Driving” vehicles are not capable of driving themselves, and a driver needs to remain present and attentive at all times.
Waymo, the self-driving subsidiary of Google’s parent company, stopped using the term “self-driving” in January 2021 because it said the phrase was being used inaccurately, giving the public a false impression of what driver-assist systems are capable of.
The Spring, Texas, Tesla crash: a case study of excessive speed, alcohol intoxication, and steroid use in electric vehicle crashes
The National Transportation Safety Board concluded that the probable cause of the Spring, Texas, electric vehicle crash was excessive speed combined with impairment from alcohol and the effects of two steroids, resulting in a roadway departure and tree impact.
Investigators from the National Transportation Safety Board determined that there was a person in the driver’s seat before the crash and that Autopilot wasn’t in use. Their findings included security footage showing the two men entering the 2019 Tesla Model S P100D and sitting in the front seats of the vehicle before it drove away. Data retrieved by the carmaker showed that both seatbelts were buckled while the car was being driven.
The Model S had more information stored in its event data recorder, which was used in the NTSB report. That data showed the car accelerated from 39 mph to 67 mph in just two seconds and did not stop before hitting the tree at 57 mph. It also showed that the seatbelt pretensioners activated and the airbags deployed. The fire was caused by damage to the front of the battery module.
Some of the situations listed in the documents as being of concern to NHTSA include navigating an intersection during a stale yellow light, how long cars remain stopped at a stop sign, and whether they come to a complete stop at a stop sign.
Using the word “recall” for a software update is anachronistic and incorrect, according to Musk, who has not otherwise commented on the nature or scope of the problem.
All four Tesla models running the current version of the software have the problems, according to the notice.
The fact that drivers pay a premium for the features is one of the reasons the company considers FSD key to its basic business plan. Musk and his company have claimed for months that cars using FSD are safer than cars driven solely by humans. He told investors last month that Tesla has collected data from about 100 million miles of drivers using FSD outside of highways.
“Mere failure to realize a long-term, aspirational goal is not fraud,” Tesla’s lawyers wrote in a November 28 court filing, asking that the suit be dismissed.
Autonomous Driving in the Light of Musk, Autopilot, FSD, and Tesla: An NHTSA Message to Drivers with the Full Self-Driving Beta
When CNN Business spoke to 13 Tesla owners whose cars have the Full Self-Driving beta, the majority said it wasn’t worth $15,000. And the feature has been the subject of controversy for years, including a recent ad that played during the Super Bowl in a few markets.
NHTSA requested additional information after Musk indicated that some drivers might be allowed to use the feature without keeping their hands on the steering wheel.
Update February 16th, 2:44PM ET: Updated to include more details about Autopilot, FSD, and Tesla’s over-the-air software updates, as well as a message from Musk.
According to the agency’s filing, those include driving through a yellow light that is on the verge of turning red; not coming to a complete stop at a stop sign; speeding, either because the system fails to detect a road sign or because the driver has set the car to default to a faster speed; and making unexpected lane changes to move out of turn-only lanes when going straight through an intersection. Drivers will still be able to use the feature once a patch for the defects is released.
Humans do not work that way, says Philip Koopman, who studies self-driving car safety as an associate professor at Carnegie Mellon University. He believes the technology has a fundamental issue: it trains people to believe the car is doing the right thing, then gives them only a short reaction time when the car buzzes to indicate that the driver needs to take over.
The documents said the system may not adequately respond to changes in speed limits or may not account for the driver’s adjustments in speed.
Model Y, Model S, and Model 3 Seat Back Frames Are Not Guaranteed to Stay Secured in a Collision
The recall covers Model S and Model X vehicles, as well as Model 3 and Model Y vehicles, that have the software installed or have an installation pending.
The National Highway Traffic Safety Administration said that loose bolts could potentially cause the seat belts to not work properly in a crash.
On Model Y vehicles, the second-row driver- and passenger-side seat back frames are secured with four bolts per seat back. But during production for certain Model Y cars, one or more of the bolts securing the seat back frames to the lower seat frame “may not have been torqued to specifications.”