The NHTSA Investigation of Autopilot in Teslas: Progress on a System That Can Be Fooled More Easily Than You Think

The sweeping recall is intended to make Autopilot harder to misuse.

The recall is part of a larger NHTSA investigation into multiple crashes in which Teslas operating on Autopilot struck parked emergency vehicles. The agency has grown more aggressive in pursuing safety problems with Teslas over the past year, announcing multiple recalls and investigations, including a recall of the Full Self-Driving software.

In its statement, the agency said the risk of a collision could increase because the prominence and scope of the feature's controls may not be sufficient to prevent driver misuse.

Autopilot can steer, accelerate and brake automatically within its lane, but despite its name it is a driver-assist system that cannot drive the car by itself. Independent tests have found that the monitoring system is easy to fool, so much so that drivers have been caught driving drunk or even riding in the back seat.

The software update includes additional controls and alerts “to further encourage the driver to adhere to their continuous driving responsibility,” the documents said.

Autopilot, which comes standard on all Tesla vehicles, bundles several features, including Traffic-Aware Cruise Control and Autosteer. Autosteer is intended for use only on limited-access freeways when it is not operating in tandem with the more sophisticated Autosteer on City Streets.

For years, safety advocates have called for stronger regulation of the driver monitoring system, which mainly detects whether a driver's hands are on the steering wheel.

On its website, the electric-vehicle maker states that Autopilot is designed to assist drivers, who must be ready to intervene at all times, and that it cannot drive the car autonomously. Tesla owners are testing the Full Self-Driving software on public roads.

NHTSA said its investigation remains open and that it will continue to work with the company to ensure vehicles on the road are as safe as possible.

“It’s progress,” said Mary “Missy” Cummings, a robotics expert who wrote a 2020 paper evaluating the risks of Tesla’s Autopilot system, “but minimal progress.”

Cummings, a former senior safety advisor at NHTSA, is not convinced the update will be enough to prevent future crashes, saying the remedy is very vague.

Abuelsamid said the compromise would allow NHTSA to get the fix out quickly and avoid another year of negotiations, even though the remedy will not be as strong as the agency would like.

Tesla’s driver monitoring system, which uses torque sensors in the steering wheel to detect hand placement and an in-cabin camera to track head movement, is inadequate and can be easily fooled, he said. Drivers can trick the torque sensors by hanging a weight on the steering wheel, and the sensors can also fail to register a driver’s hand when it is holding the wheel steady without applying torque.