The National Highway Traffic Safety Administration (NHTSA) believes that the sooner driverless cars hit the roads, the sooner road fatalities will decline.
The agency has a point: more than 90 percent of crashes are blamed on driver error. But while automakers are currently wrestling with the technical challenges of driverless cars, someone will eventually have to confront the ethical challenges of fully autonomous vehicles.
Researchers are studying scenarios such as this one: a driverless car approaches a truck that is clearly out of control and must choose between hitting the oncoming truck and hitting a sidewalk full of pedestrians.
Should the car do everything possible to save its own occupants and therefore choose to hit the pedestrians instead of the other vehicle? And if the computer does make that choice, who will be held liable for the damage caused to the pedestrians?
Published studies indicate that a majority of consumers say driverless cars should take the most ethical route, minimizing overall injuries and deaths. In other words, most consumers believe the roads should be filled with driverless cars that make choices based on the common good of all humanity.
But when researchers asked those same consumers if they would buy a car like that, the majority said they would not because the car they buy must protect its own occupants by all means necessary.
The researchers discovered that people may say they want ethical cars, but when it comes time to shop for a driverless car, an ethical car is out of the question.
Humans are selfish, and when everything is on the line, a driver will want a self-driving car that does everything possible to protect the occupants in the car. If that means plowing through a sidewalk full of joggers instead of slamming into an out-of-control truck, so be it.
And even if the car chooses to protect its own occupants and hit pedestrians instead of an oncoming truck, should that be a bad thing if the technology is saving thousands of lives overall?
So who should make the ethical decisions? Two options seem likely. Automakers could be left with the task of writing the software that makes ethical choices, or the federal government could set ethical standards that all driverless cars must meet.
Both options are likely terrifying to consumers, most of whom don't believe automakers or the government are best suited to decide whom a car should save.
If an automaker builds an autonomous car that doesn't put its occupants' safety first, it may not sell many cars. But if it builds a driverless car known for protecting its occupants above all else, the buying public may perceive the automaker as unethical.