Tesla always has an answer ready when one of its self-driving vehicles crashes.
In May of 2016, the Tesla Model S luxury sedan in which Joshua Brown was traveling was speeding when it drove into the side of a slow-moving tractor-trailer in broad daylight. We hesitate to call Mr. Brown the driver because, at the time of the fatal accident, Mr. Brown wasn’t driving. The car was. His Model S, which he had nicknamed Tessy, was operating autonomously using Tesla’s Autopilot system. The truck up ahead had begun its slow turn at a legal intersection well before there was any danger. However, the combination of the Model S’s speed and the fact that nobody (and no thing) in the Tesla was looking at the road ahead resulted in tragedy. The Tesla never even applied the brakes. Amazingly, Tesla advocates, including Mr. Brown’s family, immediately pointed to the truck and its driver as the cause of the accident. Now that same sad scenario has happened again, but this time it is a Tesla Model X luxury SUV operating on Autopilot that struck a barrier on the highway. Tesla is blaming “the driver.”
Following the 2016 Autopilot crash, the NTSB completed both a preliminary and a follow-up investigation. Its conclusion contained this statement from NTSB Chairman Robert Sumwalt: “System safeguards, that should have prevented the Tesla’s driver from using the car’s automation system on certain roadways, were lacking and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened.” One of the lacking safeguards the NTSB pointed to: “The way in which the Tesla ‘Autopilot’ system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement.”
The 2016 Autopilot crash also brought to light one of the fallacies of systems like Autopilot. Tesla claims that a driver is supposed to remain attentive while using Autopilot mode and be ready to instantly retake control of the vehicle if anything dangerous suddenly happens. That claim is belied by the fact that many occupants of vehicles being controlled by Autopilot don’t remain attentive and, in fact, do things like film themselves inside the vehicle with their hands off the wheel while it operates. Joshua Brown was one of those people. But Joshua Brown was not a typical person by any means. He was a tech-savvy person who worked in high tech and also a former U.S. Special Forces member. If Joshua Brown couldn’t grab back control of a Tesla on Autopilot quickly enough to prevent an accident, who can?
The facts in the Model X crash that took place this past week in California are still developing. At first, nobody knew whether the Tesla was being driven or whether it had been operating in Autopilot mode. Tesla answered that question in a series of public statements over the past couple of days. Tesla says that the Model X was in Autopilot mode, that the driver had been warned multiple times to put his hands on the wheel, and that the driver had not touched the steering wheel in the six seconds prior to impact. Remember those “safeguards” that the NTSB mentioned were lacking in 2016?
Tesla expressed sadness over the death of the occupant in the latest crash. However, as in all previous Autopilot-related incidents, the company was quick to point blame at anything and anyone but Autopilot. Tesla said, “The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds before the collision.” Tesla’s spokesperson went on to add that the occupant had “about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.” Of course, the Autopilot system also had 150 meters in which to take action that would have prevented the crash. Vehicles costing a third as much as a Tesla equipped with Autopilot now come with distraction detection.
The NTSB’s spokesperson, Christopher O’Neil, expressed the agency’s displeasure that Tesla is trying to get out in front of the story, shape it, and assign blame. O’Neil said, “The NTSB is unhappy with the release of investigative information by Tesla. The NTSB is looking into all aspects of this crash including the driver’s previous concerns about the Autopilot.”
The “previous concerns” the NTSB is referring to stem from a report that the occupant in the Model X crash had told Tesla’s service center that his Model X was acting strangely before the crash and that Tesla had inspected the vehicle. Family members say the occupant told Tesla that during past commutes the vehicle would “swivel toward that exact barrier.”
Self-driving and autonomous vehicles are now under the microscope. With an Uber self-driving vehicle having killed a woman and a self-driving Chevy being ticketed for ignoring a pedestrian in a crosswalk this very week, a tipping point may be coming for this technology.
Image and Source Credits
Top of page image courtesy of Florida Highway Patrol and NTSB Report HWY16FH018
Second crash image source: ABC News YouTube post (embedded in the story)