Driverless Cars May Let You Decide Who Survives a Car Crash

Are you comfortable with the idea of driverless cars? Imagine riding along while artificial intelligence makes every critical traffic decision, decisions that could jeopardize the safety of you and your family. This is something we will have to get used to if we are going to let machines drive our cars.

This brings us to the real question: would you be willing to ride in a vehicle that might decide to kill you? Consider the idea of your self-driving car having an “ethical knob” that gives it permission to decide whose life matters most. In other words, you could tell your car to sacrifice your own well-being for the good of others on the highway. Or, of course, you could tell your car to let others die before you do. Still feeling good about driverless cars?

Will Your Driverless Car Make the Proper Moral Decision?

The ethical conundrum of how these self-driving cars should weigh moral options poses a massive problem for manufacturers. When a human drives a car, instinct governs how they react to danger and risk, so when a horrible crash occurs, it is almost always clear who the responsible party is.

The issue is that if these cars drive themselves, they cannot rely on instinct; they can only follow their machine code. And, God forbid, when the very worst happens, do we blame the software engineers, the car manufacturer, or the owner of the car, who would typically be responsible?

A Controversial Question Has Created a Variety of Responses

One thing is for sure about this dilemma: the responses to it have been both plentiful and complicated. A 2015 study found that most people felt driverless vehicles ought to be utilitarian; in other words, these cars should always choose the option that minimizes the overall harm to human life, even if that means sacrificing their own passengers. The interesting thing was that even though most people made this choice, they claimed they would never get into a vehicle that might choose to kill them.

“We wanted to explore what would happen if the control and the responsibility for a car’s actions were given back to the driver,” says Giuseppe Contissa of Italy’s University of Bologna, who believes that he and his colleagues have come up with a solution.

Contissa’s team has designed a dial that offers settings ranging from “full egoist” to “full altruist”, with an impartial setting in the middle. They believe this ethical knob could work not just for self-driving vehicles but also in other industries that are rapidly becoming autonomous.

“The knob tells an autonomous car the value that the driver gives to his or her life relative to the lives of others,” says Contissa. “The car would use this information to calculate the actions it will execute, taking into account the probability that the passengers or other parties suffer harm as a consequence of the car’s decision.”
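To make the weighting Contissa describes concrete, here is a minimal, purely illustrative sketch in Python. The names (`Action`, `choose_action`), the harm probabilities, and the linear weighting scheme are all assumptions made for the sake of the example; they are not taken from the team’s actual design.

```python
from dataclasses import dataclass

# Hypothetical sketch of the "ethical knob" decision rule described above.
# knob in [0.0, 1.0]: 0.0 = "full altruist", 0.5 = impartial, 1.0 = "full egoist".

@dataclass
class Action:
    name: str
    p_harm_passengers: float  # estimated probability the car's passengers are harmed
    p_harm_others: float      # estimated probability other parties are harmed

def choose_action(actions: list[Action], knob: float) -> Action:
    """Pick the action with the lowest knob-weighted expected harm.

    With knob = 1.0 ("full egoist") only harm to the car's own passengers
    counts, so minimizing it protects them at any cost; knob = 0.0
    ("full altruist") counts only harm to others; 0.5 weighs both equally.
    """
    def weighted_harm(a: Action) -> float:
        return knob * a.p_harm_passengers + (1.0 - knob) * a.p_harm_others
    return min(actions, key=weighted_harm)

# Example: an altruist-leaning setting (knob = 0.2) accepts a risky swerve
# that endangers the passengers in order to spare other road users.
swerve = Action("swerve into barrier", p_harm_passengers=0.8, p_harm_others=0.1)
stay = Action("stay in lane", p_harm_passengers=0.1, p_harm_others=0.8)
print(choose_action([swerve, stay], knob=0.2).name)  # -> "swerve into barrier"
```

Whatever the real formula looks like, note that in this toy version a fully impartial setting (knob = 0.5) scores both actions identically, leaving the code with no principled way to break the tie, which foreshadows the concern below.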

One concern is that individuals might be unwilling to assume the moral responsibility this approach requires. If everyone simply chooses the impartial option, then the ethical knob is not going to help at all in resolving this dilemma.