Car Accidents Involving Autonomous Vehicles Are Not Far Off

Autonomous cars are not the stuff of science fiction anymore. As more car manufacturers jump on the bandwagon, the technology's future is becoming more secure, and there may soon come a time when self-driving cars outnumber conventional cars on the road. But like every major new invention, autonomous cars will have to overcome some ethical issues before the public accepts them.

A study published in Science asked 1,928 participants to consider the following hypothetical situation and answer a question:

“A self-driving car carrying a family of four on a rural two-lane highway spots a bouncing ball ahead. As the vehicle approaches, a child runs out to retrieve the ball. Should the car risk its passengers’ lives by swerving to the side – where the edge of the road meets a steep cliff? Or should the car continue on its path, ensuring its passengers’ safety at the child’s expense?”

Most of the research participants said that the autonomous vehicle should be programmed to crash into something rather than run over pedestrians, even if that meant endangering the lives of its passengers. But is that really the right thing to do? Is the life of one pedestrian worth more than the lives of the four people in the car?

This is the sort of moral dilemma that will continue to plague autonomous cars for a long time to come. No matter what the car is programmed to do, the choice will not be without controversy. There will be people on both sides of the moral fence, and the debate will continue to rage. As long as these ethical issues remain unresolved, autonomous cars may not find broad acceptance.
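Part of what makes the dilemma so uncomfortable is that "programming" an answer means writing the policy down explicitly. The sketch below is ours, not any manufacturer's; the function names and the two-option framing are purely illustrative. It contrasts a strictly utilitarian rule with the pedestrian-protecting rule most survey respondents endorsed:

```python
def utilitarian_policy(passengers: int, pedestrians: int) -> str:
    """Hypothetical rule: minimize total deaths by sacrificing the
    smaller group. Swerving is assumed to endanger the passengers."""
    return "swerve" if pedestrians > passengers else "stay"

def protect_pedestrians_policy(passengers: int, pedestrians: int) -> str:
    """Hypothetical rule matching what most respondents said: never
    strike a pedestrian, even at the passengers' expense."""
    return "swerve" if pedestrians > 0 else "stay"

# The Science scenario: one child in the road, four people in the car.
print(utilitarian_policy(4, 1))          # -> "stay"  (hit the child)
print(protect_pedestrians_policy(4, 1))  # -> "swerve" (risk the cliff)
```

A dozen lines, two defensible policies, two opposite outcomes: that is the moral fence in miniature.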

But many scientists and manufacturers do not seem to be too concerned about the ethical issues. “This question of ethics has become a popular topic with people who do not work on technology,” according to Raghunathan Rajkumar, a professor of electrical and computer engineering in Carnegie Mellon University’s CyLab. He was not involved in the study.

A veteran of the university’s efforts to develop autonomous cars, including Boss, the SUV that won the 2007 DARPA Urban Challenge, Rajkumar added that AI does not have the same cognitive capabilities that humans have. Autonomous vehicles make their decisions based on speed, road conditions, weather, distance and other data gathered by a variety of sensors, including radar, LiDAR and cameras. They decide their course of action based on their own speed as well as the speed of objects in their path.
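As a rough illustration of that kind of kinematic reasoning, here is a minimal Python sketch assuming a single object directly ahead and the standard stopping-distance formula. The class, the numbers and the three-way action split are our simplifications, not any real vehicle's control stack:

```python
from dataclasses import dataclass

# Hypothetical sensor snapshot. Real systems fuse radar, LiDAR and
# camera data into far richer state estimates than this.
@dataclass
class Perception:
    own_speed: float          # vehicle speed along the road, m/s
    obstacle_distance: float  # distance to the object in the path, m
    obstacle_speed: float     # object's speed along the road, m/s
    road_friction: float      # 0..1; lower on wet or icy pavement

def stopping_distance(speed: float, friction: float,
                      reaction_time: float = 0.1) -> float:
    """Reaction distance plus braking distance v^2 / (2 * mu * g)."""
    g = 9.81  # m/s^2
    return speed * reaction_time + speed ** 2 / (2 * friction * g)

def choose_action(p: Perception) -> str:
    """Toy decision rule: brake if braking alone avoids the object,
    otherwise brake and look for an evasive maneuver."""
    closing_speed = p.own_speed - p.obstacle_speed
    if closing_speed <= 0:
        return "continue"     # object is pulling away; no conflict
    if stopping_distance(closing_speed, p.road_friction) < p.obstacle_distance:
        return "brake"        # physics says the car can stop in time
    return "brake_and_evade"  # braking alone is not enough

# Example: a child 25 m ahead, car at 50 km/h (~13.9 m/s), dry road.
print(choose_action(Perception(13.9, 25.0, 0.0, 0.7)))  # -> "brake"
```

Even this toy version makes Rajkumar's point: the vehicle is solving a physics problem, not weighing lives.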

What would a human driver do in such a situation? Most would probably choose to run over the child rather than risk going over the cliff. It’s just the way our instincts operate: in dire situations, self-preservation takes priority over all other considerations. The driver might be charged with involuntary manslaughter, or might even be acquitted by a court.

Such ethical problems are endless, and for each one the most ethical course of action varies from one individual to another and from one society to another. When humans themselves cannot agree on the most ethical course of action, should autonomous cars be held to the same ethical standards as humans?

For now, at least according to the study, the answer seems to be in the affirmative. According to the researchers, manufacturers must embed moral principles for decisions in situations of unavoidable harm into the algorithms that control autonomous vehicles. And they must do it soon if autonomous cars are to become universally acceptable.

What should you do if you are injured by an autonomous vehicle? Talk to reputable car accident attorneys for professional legal guidance.