George Clemons

Programmed to Kill or “Would You Rather”

We will all soon take part in a game of “Would You Rather.” According to the rules of the game, we are given a dilemma in the form of a question beginning with “Would You Rather.” The dilemma can be between two good options or two bad options, and we have to choose an answer. These dilemmas will come up daily, and each one will have to be resolved.

While driverless car technology is still in its infancy, there is considerable interest in advancing and improving it at a rapid rate. Some of the appeal and push for driverless cars lies in the fact that 95% of all traffic accidents are caused by human error. It is widely believed that many of those accidents can be eliminated when cars become driverless. However, unavoidable accidents will still occur. That’s when our game of “Would You Rather” comes into play.

Current software and sensors used by driverless cars can only make very basic distinctions between objects, such as the difference between a pedestrian and a bicyclist. But jump ahead a few years, when object recognition systems are expected to recognize and distinguish between objects like baby strollers, shopping carts, and boulders. This new technology will present moral dilemmas that we need to try to address and resolve in advance.

Would You Rather… hit a boulder that is falling into your lane, or hit a woman pushing a baby stroller into your lane? Striking the stroller would probably cause less damage to me in my driverless car, but would kill or severely injure the infant. If my car strikes the boulder instead, I am more likely to be injured.

This example illustrates the need for algorithms that weigh risk against the value of a human life. In many cases, the answer will be clear – the near-certain death of an infant outweighs any possibility of injury to me in my driverless car. But what if I had my infant son in the car? The answer may not be as clear in that case.
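To make the idea concrete, here is a minimal sketch of the kind of expected-harm comparison such an algorithm might perform. This is my own toy illustration, not any manufacturer's actual logic; the probabilities, severity weights, and option names are all hypothetical.

```python
# Toy sketch of an expected-harm comparison for an unavoidable collision.
# All probabilities, severity weights, and option names are hypothetical,
# invented purely to illustrate the kind of trade-off described above.

def expected_harm(probability_of_harm: float, severity: float) -> float:
    """Expected harm = likelihood of injury times how severe that injury is."""
    return probability_of_harm * severity

# Option A: strike the boulder -- some chance of injuring the car's occupant.
occupant_harm = expected_harm(probability_of_harm=0.4, severity=6.0)

# Option B: strike the stroller -- near-certain, catastrophic harm to the infant.
infant_harm = expected_harm(probability_of_harm=0.95, severity=10.0)

# Choose the option with the lower expected harm.
choice = "hit the boulder" if occupant_harm < infant_harm else "hit the stroller"
print(f"occupant harm={occupant_harm:.1f}, infant harm={infant_harm:.1f} -> {choice}")
```

Even this toy version shows why the dilemma gets harder when the inputs change: put my infant son in the car, and the numbers on both sides of the comparison shift.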

Another good example of a moral dilemma was presented by Jason Millar in his article titled “An ethical dilemma: When robot cars must kill, who should pick the victim?”: “You are travelling along a single lane mountain road in an autonomous car that is fast approaching a narrow tunnel. Just before entering the tunnel a child attempts to run across the road but trips in the center of the lane, effectively blocking the entrance to the tunnel. The car has but two options: hit and kill the child, or swerve into the wall on either side of the tunnel, thus killing you. How should the car react?”

Someone will have to answer these sticky questions. I believe there are too many moral dilemmas to resolve in advance, and we may not think of everything. So, as driverless cars become more common, and the situations they encounter become more diverse, we will learn, and our decision-making algorithms will change.

Because cars are the primary method of transportation for most Americans, the transition to driverless cars will have a huge impact on our society. There is certain to be litigation regarding the moral dilemmas presented. Consumers understandably want any software used by driverless cars to be as safe as possible, regardless of the financial cost.

And while there may be fewer lives lost, it may be a while before we are completely comfortable with the safety decisions made by driverless cars. The question we may end up asking ourselves is, “Would I Rather… walk to work today, or take a driverless car?”
