Uber may have been seeking “to reduce the number of ‘false positives,’ where the computer potentially misclassifies a situation and the automatic emergency braking engages unnecessarily,” said Costa Samaras, an automated vehicle expert and assistant engineering professor at Pittsburgh’s Carnegie Mellon University. “False positives like that could also be dangerous, especially at higher speeds.”
Samaras said that in this case the problem appeared to be the opposite: a false negative.
“The car saw the pedestrian six seconds before impact but misclassified them until 1.3 seconds before impact,” he said. “Even at that point, the computer determined that emergency braking was needed, but the function was disabled and there is no mechanism to alert the driver.”
“We know that humans are a terrible backup system. We’re easily distracted, and we have slower reaction times,” Samaras added. “Alerting the driver to these types of situations before the crash seems like a no-brainer.”