
  • Former Uber self-driving chief crashes his Tesla on FSD

    This is actually a really good article about Tesla, “full self-driving” (FSD), supervision, automation, risk and liability:

    Tesla is asking humans to supervise a system that is specifically designed to make supervision feel pointless. As Krikorian puts it, an unreliable machine keeps you alert and a perfect machine needs no oversight, but one that works almost perfectly creates a trap: drivers trust it just enough to stop paying attention.

    The research backs this up. Psychologists call it the “vigilance decrement”: monitoring a nearly perfect system is boring, boredom leads to mind-wandering, and drivers need 5 to 8 seconds to mentally reengage after an automated system hands control back. But emergencies unfold faster than that.

    Krikorian cites an Insurance Institute for Highway Safety study showing that after just one month of using adaptive cruise control, drivers were more than six times as likely to look at their phones. Tesla’s own website warns FSD users not to become complacent, but the system’s smooth performance actively trains that complacency.

    He points to two well-known crashes to illustrate the impossible math. In the 2018 Mountain View accident that killed Apple engineer Walter Huang, the driver had six seconds before his Tesla steered into a concrete median. He never touched the wheel. In the 2018 Uber crash in Tempe, Arizona, sensors detected a pedestrian with 5.6 seconds of warning, but the safety driver looked up with less than a second remaining.
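
    To put rough numbers on that “impossible math”, here’s a quick back-of-the-envelope sketch in Python. The warning windows (6 seconds, 5.6 seconds, under a second) come from the crashes above; the speeds are my own assumptions for illustration, not figures from the crash reports:

        # Convert each warning window into distance traveled at a constant speed.
        # Speeds are illustrative assumptions, not figures from the crash reports.

        def distance_m(speed_mph: float, seconds: float) -> float:
            """Meters covered in `seconds` at a constant speed in mph."""
            return speed_mph * 1609.344 / 3600 * seconds

        # Mountain View: assume roughly highway speed (70 mph), 6 s of warning.
        print(f"6.0 s at 70 mph ≈ {distance_m(70, 6.0):.0f} m")  # ≈ 188 m

        # Tempe: assume roughly 40 mph; 5.6 s of sensor detection
        # versus less than 1 s of driver attention.
        print(f"5.6 s at 40 mph ≈ {distance_m(40, 5.6):.0f} m")  # ≈ 100 m
        print(f"1.0 s at 40 mph ≈ {distance_m(40, 1.0):.0f} m")  # ≈ 18 m

    Set against the 5 to 8 seconds the research says drivers need to mentally reengage, a warning that arrives with under a second remaining leaves effectively no margin at all.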

    In Krikorian’s own case, he did take action, but he was asked to snap from passenger back to pilot in a fraction of a second, overriding months of conditioning. The logs show he turned the wheel. They don’t show the impossible math of that transition.

    The pattern Krikorian describes should sound familiar to anyone who has followed Tesla’s FSD controversies: condition the driver to rely on the system, erode their vigilance through months of smooth performance, then point to the terms of service and blame them when something breaks. When FSD works, Tesla gets credit. When it doesn’t, the driver gets blamed.

    Tags: fsd tesla risk attention supervision liability driving safety vigilance automation