"If you’ve ever suspected there was something baleful about our deep trust in data, but lacked the mathematical skills to figure out exactly what it was, this is the book for you: Cathy O’Neil’s “Weapons of Math Destruction” examines college admissions, criminal justice, hiring, getting credit, and other major categories. The book demonstrates how the biases written into algorithms distort society and people’s lives... Your book looks at how unjust this all is at the level of education, of voting, of finance, of housing. You conclude by saying that the data isn’t going away, and computers are not going to disappear either. There are not many examples of societies that unplugged or dialed back technologically. So what are you hoping can happen? What do we need to do as a society to, to make this more just, and less unfair and invisible? Great point, because we now have algorithms that can retroactively infer people’s sexual identity based on their Facebook likes from, you know, 2005. We didn’t have it in 2005. So imagine the kind of data exhaust that we’re generating now could likely display weird health risks. The technology might not be here now but it might be here in five years. The very first answer is that people need to stop trusting mathematics and they need to stop trusting black box algorithms. They need to start thinking to themselves. You know: Who owns this algorithm? What is their goal and is it aligned with mine? If they’re trying to profit off of me, probably the answer is no. And then they should be able to demand some kind of consumer, or whatever, Bill of Rights for algorithms. And that would be: Let me see my score, let me look at the data going into that score, let me contest incorrect data. Let me contest unfair data. You shouldn’t be able to use this data against me just because — going back to the criminal justice system — just because I was born in a high crime neighborhood doesn’t mean I should go to jail longer. We have examples of rules like this . . . anti-discrimination laws, to various kinds of data privacy laws. They were written, typically, in the ’70s. They need to be updated. And expanded for the age of big data. And then, so finally I want data scientists themselves to stop hiding behind this facade of objectivity. It’s just … it’s over. The game, the game is up."
Issues and developments related to ethics, information, and technologies, examined in the ethics and intellectual property graduate courses I teach at the University of Pittsburgh School of Computing and Information. My Bloomsbury book "Ethics, Information, and Technology" will be published in Summer 2025. Kip Currier, PhD, JD
Showing posts with label call for "Bill of Rights for algorithms". Show all posts
Showing posts with label call for "Bill of Rights for algorithms". Show all posts
Thursday, September 8, 2016
The case against big data: “It’s like you’re being put into a cult, but you don’t actually believe in it”; Salon, 9/8/16
Scott Timberg, Salon; The case against big data: “It’s like you’re being put into a cult, but you don’t actually believe in it” :