Self-driving car proponent says vehicles need a conscience

The idea of cars that drive themselves was once limited to science fiction. But anyone who has been following the news in recent years knows that the day of the autonomous vehicle is nearly at hand. Indeed, we wrote about it in a post back in June.

In that post we noted that a semi-autonomous semitrailer truck is now being tested on Nevada roads. We also observed that some experts working to advance self-driving technology warn that turning too much control over to computers, and removing the human factor entirely, may be less than ideal.

Recently we learned that others in the field share that concern. They acknowledge that driving can present circumstances in which ethical questions arise and demand split-second answers, and they wonder whether computers can be programmed to meet such challenges.

In essence, the question is this: How should self-driving vehicles be programmed to make ethical choices? For example, what if a vehicle faces a situation in which an accident is unavoidable and someone, whether inside or outside the vehicle, will be injured? Should the vehicle be programmed to make minimizing the total loss of life its priority, even if that means sending its own occupants to their deaths?
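To make the dilemma concrete, here is a deliberately oversimplified sketch, in Python, of what a "minimize total harm" rule might look like if it were reduced to code. Every name in it is hypothetical and invented for this illustration; no real vehicle decides this way. The point is only to show that a programmer must commit to some priority in advance.

```python
# Toy illustration only: a hypothetical "minimize total harm" rule.
# All names here (Maneuver, choose_maneuver, the injury estimates)
# are invented for this sketch; real autonomous-driving systems do
# not reduce ethics to a one-line comparison like this.
from dataclasses import dataclass


@dataclass
class Maneuver:
    name: str
    occupant_injuries: float   # expected injuries inside the vehicle
    bystander_injuries: float  # expected injuries outside the vehicle


def choose_maneuver(options: list[Maneuver]) -> Maneuver:
    """Pick the option with the lowest total expected injuries,
    regardless of whether the harm falls on occupants or bystanders."""
    return min(options, key=lambda m: m.occupant_injuries + m.bystander_injuries)


if __name__ == "__main__":
    options = [
        Maneuver("brake straight", occupant_injuries=2.0, bystander_injuries=0.0),
        Maneuver("swerve left", occupant_injuries=0.0, bystander_injuries=3.0),
    ]
    best = choose_maneuver(options)
    print(f"Chosen maneuver: {best.name}")
    # Prints "brake straight": the rule accepts harm to the occupants
    # because total expected injuries (2.0) are lower than for
    # swerving (3.0). Whether that is the right priority is exactly
    # the ethical question the experts are raising.
```

Notice that the hard part is not the code, which is trivial, but the value judgment baked into the comparison: someone had to decide, in advance, that occupants and bystanders count equally.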

Like it or not, these are decisions that human drivers can be forced to make, and when accidents cause injury or death, legal remedies are available for seeking compensation and accountability.

The autonomous vehicle experts say that unless this question of conscience can be resolved satisfactorily, social acceptance of self-driving cars could be stymied, no matter how much safer they make driving.

What do you think?