Friday, January 25, 2019
A Study on Driverless-Car Ethics Offers a Troubling Look Into Our Values; The New Yorker, January 24, 2019
Caroline Lester, The New Yorker
"The U.S. government has clear guidelines for autonomous weapons—they
can’t be programmed to make “kill decisions” on their own—but no formal
opinion on the ethics of driverless cars. Germany is the only country
that has devised such a framework; in 2017, a German government
commission—headed by Udo Di Fabio, a former judge on the country’s
highest constitutional court—released a report
that suggested a number of guidelines for driverless vehicles. Among
the report’s twenty propositions, one stands out: “In the event of
unavoidable accident situations, any distinction based on personal
features (age, gender, physical or mental constitution) is strictly
prohibited.” When I sent Di Fabio the Moral Machine data, he was
unsurprised by the respondents’ prejudices. Philosophers and lawyers, he
noted, often have very different understandings of ethical dilemmas
than ordinary people do. This difference may irritate the specialists,
he said, but “it should always make them think.” Still, Di Fabio
believes that we shouldn’t capitulate to human biases when it comes to
life-and-death decisions. “In Germany, people are very sensitive to such
discussions,” he told me, by e-mail. “This has to do with a dark past
that has divided people up and sorted them out.”
The decisions
made by Germany will reverberate beyond its borders. Volkswagen sells
more automobiles than any other company in the world. But that
manufacturing power comes with a complicated moral responsibility. What
should a company do if another country wants its vehicles to reflect
different moral calculations? Should a Western car de-prioritize the
young in an Eastern country? Shariff leans toward adjusting each model
for the country where it’s meant to operate. Car manufacturers, he
thinks, “should be sensitive to the cultural differences in the places
they’re instituting these ethical decisions.” Otherwise, the algorithms
they export might start looking like a form of moral colonialism. But Di
Fabio worries about letting autocratic governments tinker with the
code. He imagines a future in which China wants the cars to favor people
who rank higher in its new social-credit system, which scores citizens
based on their civic behavior."
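As a thought experiment only, it is worth asking what the commission's rule would look like if a manufacturer actually tried to encode it. The sketch below is hypothetical and not drawn from the article or from any real vehicle's software; the Outcome structure, the function names, and the "minimize the number of people at risk" tie-breaker are all assumptions made for illustration. It shows one way a decision routine could refuse, rather than quietly honor, inputs that distinguish people by age, gender, or constitution.

```python
# Hypothetical sketch: enforcing the German commission's prohibition on
# distinctions based on personal features in unavoidable-accident decisions.
# All names and structures here are invented for illustration.

from dataclasses import dataclass, field

# Personal features the guideline forbids the decision logic from considering.
PROHIBITED_FEATURES = {"age", "gender", "physical_constitution", "mental_constitution"}

@dataclass
class Outcome:
    """One possible maneuver in an unavoidable-accident scenario."""
    maneuver: str
    people_at_risk: int  # permissible input: a count, not identities
    attributes: dict = field(default_factory=dict)  # per-person data attached upstream

def choose_maneuver(outcomes: list[Outcome]) -> Outcome:
    """Pick a maneuver while refusing to discriminate on personal features."""
    for o in outcomes:
        used = PROHIBITED_FEATURES & set(o.attributes)
        if used:
            # Fail loudly rather than silently weighting by prohibited features.
            raise ValueError(f"Prohibited personal features in decision input: {used}")
    # Only non-personal criteria remain; here, simply minimize people at risk.
    return min(outcomes, key=lambda o: o.people_at_risk)

if __name__ == "__main__":
    options = [
        Outcome("swerve_left", people_at_risk=2),
        Outcome("brake_straight", people_at_risk=1),
    ]
    print(choose_maneuver(options).maneuver)  # -> brake_straight
```

Under Shariff's per-country approach, a manufacturer might instead swap in different criteria for each market; the point of the sketch is only that a guideline like Germany's can be expressed as a hard constraint on what the software is allowed to see, not merely as a weighting preference.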