Elon Musk’s Tesla almost ran a red light during a livestreamed demo of Full Self-Driving beta
Elon Musk’s Tesla almost ran a red light while he livestreamed a demo of Tesla’s Full Self-Driving (FSD) beta software. Musk filmed the livestream from the driver’s seat, violating Tesla’s rules for its advanced driver-assist technology. He also comes close to doxing Mark Zuckerberg along the way.
The roughly 45-minute video was meant to demonstrate the prowess of v12 of Tesla’s advanced driver-assist technology, which has yet to be released to customers. And while the vehicle appears to be operating safely for the majority of the trip, it still ends up being a bizarre experience — which is typical of all things Musk.
Musk has said that FSD carries the beta label to emphasize that drivers need to pay attention to the road while using the driver-assist feature. The label could also help Tesla evade legal liability in the event of a crash.
The moment when Musk had to intervene to stop the car at the traffic light was quickly seized upon by critics, who called Tesla’s approach to autonomy insufficient and reckless.
Tesla’s own guidance for the feature is blunt: “Full Self-Driving (Beta) is a hands-on feature. Keep your hands on the steering yoke (or steering wheel) at all times, be mindful of road conditions and surrounding traffic, and always be prepared to take immediate action. Failure to follow these instructions could cause damage, serious injury or death.”
Zuckerberg’s address, a phone in the driver’s hand, and a question for Palo Alto police
The video quality is often poor, flipping between vertical and horizontal framing, and Musk frequently comments that he hopes someone can edit the footage to make it more interesting.
At around the 27-minute mark, Musk claims he is going to drive to Meta CEO Mark Zuckerberg’s house, something he has previously threatened to do as part of their much-publicized (and probably never happening) fight.
Musk Googles Zuckerberg’s address and then displays it prominently on-screen. (Remember, Musk has banned the @ElonJet account that tracks his private jet from X/Twitter, claiming it was a “direct personal safety risk” to him.)
“Had an officer observed the driver with the phone in their hand, they could have issued the driver an infraction ticket for violating California’s hands-free law,” writes Palo Alto PD Captain James Reifschneider.
There’s no question that Musk was in control of the vehicle: he was forced to stop his “Full Self-Driving” system from running a red light partway through the livestream, and he reveals that he’s in the driver’s seat by turning the camera on himself near the 30-minute mark.
Let me be clear: I’m pretty sure Palo Alto Police have better things to do than chase down the world’s richest man for a $20 fine. (That’s the only punishment for a first offense — you can get a point against your driving record for a second offense, but only if it happens within three years of the first violation.)
But Musk has been known to repeatedly flout the law (see my linkbox), and some are beginning to question his power. Reporting by Pulitzer Prize-winning journalist Ronan Farrow describes how the US government has had to treat Musk with deference after his Starlink satellites became so important to the war in Ukraine that he could have changed the course of the conflict.
NHTSA is also scrutinizing Tesla’s hands-free Autopilot mode and driver monitoring
Reifschneider, the police captain, says that there are practical reasons why the department doesn’t ticket without personally observing a driver: officers need to be able to tell a judge what they saw, verify the driver’s identity and driver’s license, and collect a license plate or VIN for the vehicle to support the citation.
“The officer needs to be prepared to testify in court about what they personally observed (namely, that they saw the phone in the driver’s hand),” he writes.
The National Highway Traffic Safety Administration has also taken an interest in a version of Autopilot that allows drivers to use the advanced driver-assist feature without applying torque to the steering wheel, effectively hands-free.
The agency’s letter comes amid NHTSA’s ongoing investigation into more than a dozen incidents in which Teslas operating on Autopilot crashed into stationary emergency vehicles; that investigation is expected to wrap up relatively soon. Regulators are also investigating issues with Tesla’s seat belts, steering wheels, and “phantom braking” triggered by the driver-assist system.
Now that the existence of this feature is publicly known, NHTSA is concerned that more drivers will try to use it. The relaxation of controls designed to ensure the driver is engaged in the driving task, the agency warned, could result in more driver inattention and failure to properly supervise Autopilot.
Hands-free systems from other automakers rely on robust driver monitoring, including cameras and other sensors, to ensure drivers keep their eyes on the road, and they can only be used on certain roads, such as divided highways. Tesla, by contrast, allows customers to use Full Self-Driving on local roads, though it still requires their hands to be on the steering wheel.