When new technology goes badly wrong, humans carry the can

This mindset prompts Dr Elish to coin the term “moral crumple zone” to describe the role assigned to humans who find themselves in the positions that the Three Mile Island operators, the Air France pilots – and the safety driver in the Uber car – occupied. It describes how responsibility for an action may be wrongly attributed to a human being who had limited control over the behaviour of an automated or autonomous system.

“While the crumple zone in a car is meant to protect the human driver,” she writes, “the moral crumple zone protects the integrity of the technological system, at the expense of the nearest human operator.”