Autonomous Vehicles Share One Thing With Humans: It’s Always the Other Guy’s Fault

Posted by John Goreham

As autonomous vehicle crashes mount, supporters are quick to jump to the defense of the self-driving car, regardless of the circumstances.

The latest autonomous vehicle crash came less than two hours after the vehicle's launch. A self-driving shuttle in Las Vegas, given the relatively easy assignment of completing a half-mile loop over and over again, found a way to collide with a slow-moving tractor-trailer. The truck was backing up very slowly, and the shuttle moved into its path. With disappointed supporters on hand, including more than one celebrity, a statement explaining the situation was needed fast, and a Las Vegas government official provided it, saying, “The shuttle did what it was supposed to do, in that its sensors registered the truck and the shuttle stopped to avoid the accident. Unfortunately, the delivery truck did not stop and grazed the front fender of the shuttle. Had the truck had the same sensing equipment that the shuttle has the accident would have been avoided.” Neither of the two accounts we have read indicated that the shuttle beeped to alert the truck's driver to the impending contact.

A police officer dutifully gave the truck's driver a ticket for improper backing. From this explanation, the truck's human driver was obviously at fault, and the robo-shuttle was absolved of its part in the fender-bender. That is, unless you hear a description of the events from a passenger who was actually in the shuttle at the time and saw it all unfold. The passenger said, “The shuttle just stayed still. And we were like, ‘it’s going to hit us, it’s going to hit us.’ And then it hit us. The shuttle didn’t have the ability to move back. The shuttle just stayed still.” Does it make you feel better knowing that when a crash is imminent and avoidable, your self-driving car will just freeze?

This is not the first time a self-driving vehicle has collided with a semi-trailer. In May 2016, a tragic collision occurred between a Tesla Model S operating on Autopilot and a tractor-trailer. The Model S, traveling straight on a highway, struck a tractor-trailer that had made a (legal) turn at an intersection ahead of it, as shown in the NTSB image below. A former U.S. Special Forces operator was at the wheel. Despite repeated warnings from the car to steer and remain involved in its operation, the driver was not actively engaged in driving as the crash unfolded. He never even braked. The NTSB investigation found that the car was on Autopilot and operating above the posted speed limit.

The Model S hit the truck broadside, and the driver was killed. Following the accident, Tesla supporters looked for any possible way the Autopilot system might be absolved of its responsibility not to drive into a slow-moving, huge object in its path. More significantly, a Tesla employee, speaking to the New York Times on condition of anonymity, defended the technology, saying, “With any driver assistance system, a lack of customer education is a very real risk.” The employee told the Times that drivers needed to be aware of road conditions and be able to take control of the car at a moment’s notice, even though, by his own account, Autopilot’s self-steering and speed controls could operate for up to three minutes without any driver involvement.

Let’s face it: if a Navy SEAL can’t grab control back quickly enough, none of us can. That defense of the technology is ridiculous. Although the truck driver was cited, the NTSB investigation placed much of the blame on Tesla’s Autopilot.

In another crash involving Autopilot, a driver in Montana was on the highway when his Tesla Model S veered onto the shoulder, hitting wooden stakes that had been planted there. Tesla blamed the driver, saying he wasn’t driving with his hands on the wheel and had chosen a poor road on which to use Autopilot.

Oddly enough, it seems that the hardest challenge for autonomous vehicles is large, slow-moving objects. In 2016, a vehicle operated by Google’s self-driving car program pulled from a stop into the path of a city bus. Even then, Google didn’t take full responsibility. Instead, the company issued this explanation: “This is a classic example of the negotiation that’s a normal part of driving — we’re all trying to predict each other’s movements. In this case, we clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision.” Watch the video of the incident and see for yourself how the bus could be blamed.

All of the crashes we have used as examples occurred in daylight, in dry conditions, and on paved, marked roads. How autonomous vehicles will navigate dirt roads at night in a snowstorm is hard to imagine.
