But safety advocates argue such a framework is badly needed. “Most people, when they talk about safety, it’s ‘Try not to hit something,’” says Phil Koopman, who studies self-driving car safety as an associate professor at Carnegie Mellon University. “In the software safety world, that’s just basic functionality. Real safety is, ‘Does it really work?’ Safety is about the one kid the software might have missed, not about the 99 it didn’t.” For autonomous vehicles, simply being a robot that drives won’t be enough: they will have to prove they are better than human drivers, almost all of the time.
Koopman believes international standards are needed, the same kind that aviation software builders already have to meet. And he wishes federal regulators would demand more information from self-driving vehicle developers, as some states do now.