"Anyone who’s followed the debates surrounding autonomous vehicles knows that moral quandaries inevitably arise. As Jesse Kirkpatrick has written in Slate, those questions most often come down to how the vehicles should perform when they’re about to crash. What do they do if they have to choose between killing a passenger and harming a pedestrian? How should they behave if they have to decide between slamming into a child or running over an elderly man? It’s hard to figure out how a car should make such decisions in part because it’s difficult to get humans to agree on how we should make them. By way of evidence, look to Moral Machine, a website created by a group of researchers at the MIT Media Lab. As the Verge’s Russell Brandon notes, the site effectively gameifies the classic trolley problem, folding in a variety of complicated variations along the way."
Ethically tangled aspects of 21st-century societies and cultures. In the vein of Charles Darwin’s 1859 “entangled bank” metaphor—a complex and evolving digital ecosystem of difference and dependence, where humans, technologies, ethics, law, policy, data, and information converge and diverge. Kip Currier, PhD, JD
Friday, August 12, 2016
Should a Self-Driving Car Kill Two Jaywalkers or One Law-Abiding Citizen?; Slate, 8/11/16
Jacob Brogan, Slate