IIHS Voices Serious Concerns About Level 2 Autonomous Driving Technology Now Available

Posted by John Goreham

In a recent study, IIHS found many causes for concern with Level 2 “self-driving” automation from Tesla and other automakers.

 

The Insurance Institute for Highway Safety (IIHS) released four special issue stories on August 7, 2018, following a special report the group conducted on autonomous vehicle technology now on the market. The stories take the form of very detailed press releases intended to inform the public about the group’s findings. As close followers of IIHS, we can say with certainty that this flurry of activity is unusual for the group. The main theme of all four stories is that the “Level 2” automation now found in vehicles does not meet the group’s (relatively high) standards for safety. Level 2 automation includes systems that allow the vehicle itself to control braking, steering, and acceleration without driver input in certain situations and for limited amounts of time.


Tests Uncover Safety Issues And Wide Variations In Performance

IIHS has been following the advancement of self-driving technology closely. It has tested vehicles equipped with the technology and also reviews crash reports and owner safety reports about those vehicles. Jessica Jermakian, IIHS senior research engineer, explained what the group did next with this knowledge: “We zeroed in on situations our staff have identified as areas of concern during test drives with Level 2 systems, then used that feedback to develop road and track scenarios to compare vehicles.” IIHS says that its results from testing models by BMW, Mercedes-Benz, Tesla, and Volvo ranged from annoying to dangerous.

In one test, vehicles were driven at 31 mph toward a stationary vehicle target with adaptive cruise control (ACC) off and autobrake turned on to evaluate autobraking performance. The two Teslas tested hit the stationary target; the remaining three vehicles did not. In other tests, the Tesla models performed the best and most smoothly. In one test, a Mercedes-Benz E-Class was traveling at about 55 mph with ACC and active lane-keeping engaged but not following a lead vehicle. The E-Class system briefly detected a pickup truck stopped at a traffic light ahead but quickly lost sight of it and continued at full speed until the IIHS tester hit the brakes. “At IIHS we are coached to intervene without warning, but other drivers might not be as vigilant,” Jermakian says. “ACC systems require drivers to pay attention to what the vehicle is doing at all times and be ready to brake manually.”

IIHS also evaluated the auto-steering systems from the four brands. Auto-steering allows a vehicle to follow the road without help from the driver. Some models did well, including one Tesla. However, BMW’s system “…steered the vehicle toward or across the lane line regularly, requiring drivers to override the steering support to get it back on track. Sometimes the car disengaged steering assistance on its own. The car failed to stay in the lane on all 14 valid trials.”

Tesla Model X Fatality Highlights Risks And Lessons Unlearned

IIHS has tested Tesla’s Autopilot in the past, and the group says it found that “…Autopilot may be confused by lane markings and road seams where the highway splits.” Two Tesla crashes highlight this issue. The first was a September 2017 crash involving a Tesla Model S; luckily, the driver survived. In the second, a Tesla Model X crashed into a lane divider and the driver was tragically killed. This was not the first fatality involving Tesla’s Autopilot. That occurred in 2016, when a Tesla struck the side of a slow-moving tractor-trailer while Autopilot was engaged.

IIHS was not the first safety group to document Autopilot’s behavior and suggest safety changes. Following the fatal crash involving the truck, the National Transportation Safety Board (NTSB) issued a report that identified many areas for improvement in Tesla’s self-driving technology. One of the specific findings NTSB singled out in 2017 was that “The way in which the Tesla ‘Autopilot’ system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement.” Following the later fatal crash (in 2018), Tesla’s data logging system revealed that “The system gave Huang (the driver) two visual alerts and one auditory alert to place his hands on the wheel during this period. In the final 6 seconds before impact, his hands weren’t detected on the wheel, and the Tesla didn’t make any emergency braking or steering maneuvers to avert the crash.” It appears the NTSB identified a defect in the way Autopilot operates, or in the way experienced users interact with the system, and it was not corrected. A further tragedy resulted.

Since that first 2016 crash, in which a Tesla struck a truck broadside, Tesla vehicles with Autopilot active have slammed into the backs of two firetrucks and a full-size police SUV parked with emergency lights active. Tesla vehicles have also hit the highway barriers mentioned above, and just this week a driver reported that he thought Autopilot was engaged when his Tesla hit the back of a third parked firetruck. In May of this year, BestRide attended a symposium at MIT on the topic of autonomous vehicle technology. One of the panelists and keynote speakers was MIT Research Scientist Bryan Reimer, Ph.D. Dr. Reimer has been working with Tesla on a long-term “big data” study of Autopilot. We asked Dr. Reimer why so many crashes involving Teslas hitting big, obvious objects have happened. He answered, “We know there is an issue with the detection of static objects.” He went on to add that Tesla needs to “Stop this nonsense.” You can watch the exchange in the video above; it starts at about timestamp 1:24:15.

Poor Governmental Oversight Makes Autonomous Vehicles and Testing Unsafe

With NTSB, IIHS, and even researchers working on Tesla Autopilot knowing full well that there is a problem, why hasn’t it been fixed? In its most recent study, IIHS titles one of its focus points “Lax U.S. oversight of industry jeopardizes public safety.” The group says that oversight is currently attempted through a patchwork of state laws and voluntary federal guidelines. “We don’t want to hamstring the development of autonomous vehicles but do want to ensure that all motorists, bicyclists and pedestrians sharing the road are protected,” said David Harkey, IIHS president. IIHS feels that crash data should be made public. The group has also asked that legislation regulating the technology be slowed down until more is known about the fatal crashes that have already occurred.

You can read the full IIHS report at the Institute’s website. We have embedded the focus stories in the text of our article above.

Image of Tesla crashed into police vehicle courtesy of the Laguna Beach PD.

Top of page image of Tesla operating on Autopilot courtesy of YouTube and Eddie Daniels.
