Humans use gestures to signal to cars in traffic constantly. Will automated cars be able to interpret those signals?
Studies have suggested that as much as 55% of human communication is non-verbal. Autonomous cars will need to understand the gestures and body-language signals that humans use in traffic. Standard hand signals are not just legally allowed on American roads; they are required knowledge on many states’ driving tests. Cars will need to understand what a bent arm out a car window means, and what a motorcyclist’s lowered left arm means. Yet these codified signals are actually the easiest ones for automated vehicle programmers and sensor experts to tackle. Drivers and pedestrians also use head nods, waves, and other subtle motions to communicate with other drivers constantly.
At an automated vehicle symposium at the Massachusetts Institute of Technology attended by BestRide staff last year, John Leonard, MIT professor and Associate Department Head for Research in Mechanical Engineering, reviewed the tests he had been conducting in real-world Boston traffic. His research has moved beyond the nuts and bolts of vehicle automation to what he calls “the social ballet of driving.”
Dr. Leonard described one particular scenario he encountered that he felt was perhaps the biggest challenge to automated vehicle design teams. He asked the group to imagine trying to program an automated vehicle to understand “a police officer waving you through a red light or stopping you at a green light, two situations that are counter-intuitive.” He then asked the group to imagine this scenario with the added challenge of “blazing sunlight facing the vehicle, or that same scenario at night with the flashing lights of an emergency vehicle behind that officer.”
Automated cars will not only have to understand traffic signals and street signs; they will also have to understand human hand signals that contradict those devices, all while deciding which background warnings to ignore.
Traffic police and even the general public could be taught to use standard signals with automated vehicles to simplify things. However, all communication needs to be two-way. How would the automated vehicle signal back? Imagine that you are driving in slow traffic and see another driver trying to exit a side street. You may want to signal that driver to proceed by using a small wave. In the real world, we do this all the time.
Typically, we look at the driver’s eyes and reactions to see if they will, in fact, pull out. They may wave back and smile, or they may begin to turn the wheel and go. How would we know if a driverless car saw our wave or not? Ford and Virginia Tech are working on that response methodology right now.
Using a vehicle that appears driverless but is actually controlled by a hidden operator, researchers are experimenting with a light bar that signals back to pedestrians and to drivers in other cars, making the communication two-way. The upshot of our future with autonomous cars is that you won’t just see them in traffic or ride in them. You will “speak” to them, and they will reply.
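The details of the Ford and Virginia Tech light-bar protocol aren’t spelled out here, but the idea behind it can be sketched in a few lines. Purely as an illustration, and with hypothetical names and light patterns throughout, a vehicle-to-human signal can be modeled as a simple lookup from the car’s current driving intent to a light pattern displayed to people nearby:

```python
from enum import Enum

class Intent(Enum):
    """Hypothetical driving intents an autonomous car might signal."""
    YIELDING = "yielding"    # the car is letting a pedestrian or driver go first
    STARTING = "starting"    # the car is about to accelerate from a stop
    CRUISING = "cruising"    # the car is driving normally, no change intended

# Hypothetical mapping of intent to a light-bar pattern; a real system
# would standardize these patterns so everyone learns the same "vocabulary."
LIGHT_PATTERNS = {
    Intent.YIELDING: "slow side-to-side sweep",
    Intent.STARTING: "rapid blinking",
    Intent.CRUISING: "solid steady light",
}

def signal_for(intent: Intent) -> str:
    """Return the light-bar pattern the car should display for an intent."""
    return LIGHT_PATTERNS[intent]

# Example: a pedestrian waves the car through; the car acknowledges by yielding.
print(signal_for(Intent.YIELDING))
```

The point of such a fixed, public mapping is the same one the article makes: for the “wave and wave back” exchange to work, the reply has to be something pedestrians and other drivers can learn to read at a glance.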