Daniel McGehee is the director of the National Advanced Driving Simulator Laboratories (NADSL) at the UI, and an associate professor in the departments of Industrial and Systems Engineering, Emergency Medicine, Public Health, and Public Policy. He leads a team of faculty, staff, and graduate and undergraduate students in an interdisciplinary transportation research program that studies human factors, automotive safety, and injury. Previously, he directed the Human Factors and Vehicle Safety Research Division at the UI Public Policy Center; he has worked at UI since 1993.
McGehee’s UI3 AI Symposium presentation focused on the history of automation in the transportation industry since 1994, when the first study was launched. “Progress stalled in 2016, when Joshua Brown died in a partially automated Tesla crash near Williston, Florida,” he said. In the 41 minutes prior to the crash, Brown ignored seven warnings issued by the car’s control panel. He died instantly when his car drove under a semi-tractor trailer positioned across the road. “It was a ‘shearing’ event, unlike what would have happened if he had driven into a concrete barrier, a pedestrian, or another vehicle traveling in the same or opposite direction—hazards that the Tesla’s algorithms were trained to identify,” said McGehee. “The car didn’t even brake, and traveled for another quarter mile without its top,” he added.
Uber was testing autonomous vehicles, but it suspended the program in March 2018 after a pedestrian was killed in Tempe, Arizona. The modified Volvo’s dash-cam showed the safety driver was distracted by her mobile phone; she looked up in horror just as the car, without braking or warning, struck a woman walking a bicycle across the street. “While its algorithms were trained to identify pedestrians and bicycles as hazards, the car didn’t recognize the unique combination of a pedestrian pushing a bike loaded with shopping bags,” said McGehee.
Machine learning is the key to safety, and training the algorithms takes time and experience under every possible condition. Speed is sacrificed as complexity grows, and slowness frustrates drivers and pedestrians alike. “In New York, if you put an autonomous vehicle on the street, traffic would grind to a halt,” he said.
McGehee cited the case of autonomous buses at Disneyland-type resorts, which are programmed to expect dense pedestrian and vehicle traffic. In rural areas, autonomous vehicles should be able to travel faster, since there are fewer hazards. But some rural hazards—deer and farm equipment in the road, for example—might be difficult to account for.
Domino’s and Kroger are experimenting with autonomous delivery vehicles, but McGehee believes it’s disingenuous of them to claim autonomy. “There are always safety drivers on board who can override the AI when a hazard presents itself,” he said.
McGehee’s lab is studying partially automated vehicles with Iowa City/Cedar Rapids commuters, whose experience is captured with sophisticated cameras and vehicle instrumentation.
“Ethical, legal, and safety considerations will govern the progress of autonomous transportation,” said McGehee, who predicts that full automation is decades away from ubiquity. “By 2040, about 90 percent of cars on the road will have today’s production advanced driver assistance system technology. It is hard to predict when fully autonomous cars will be able to operate in all conditions,” he added.