Showing posts with label Cathy O'Neil.

Sunday, March 12, 2017

That Health Tracker Could Cost You; Bloomberg, February 23, 2017

Cathy O'Neil, Bloomberg; That Health Tracker Could Cost You:

"Say, for example, left-handed people with vegetarian diets prove more likely to require expensive medical treatments. Insurance companies might then start charging higher premiums to people with similar profiles -- that is, to those the algorithm has tagged as potentially costly. Granted, the Affordable Care Act currently prohibits such discrimination. But that could change if Donald Trump fulfills his promise to repeal the law.

Think about what that means for insurance...

If we're not careful, pretty soon it’ll be almost like there's no insurance at all."

Thursday, September 8, 2016

The case against big data: “It’s like you’re being put into a cult, but you don’t actually believe in it”; Salon, 9/8/16

Scott Timberg, Salon; The case against big data: “It’s like you’re being put into a cult, but you don’t actually believe in it”:
"If you’ve ever suspected there was something baleful about our deep trust in data, but lacked the mathematical skills to figure out exactly what it was, this is the book for you: Cathy O’Neil’s “Weapons of Math Destruction” examines college admissions, criminal justice, hiring, getting credit, and other major categories. The book demonstrates how the biases written into algorithms distort society and people’s lives...
Your book looks at how unjust this all is at the level of education, of voting, of finance, of housing. You conclude by saying that the data isn’t going away, and computers are not going to disappear either. There are not many examples of societies that unplugged or dialed back technologically. So what are you hoping can happen? What do we need to do as a society to, to make this more just, and less unfair and invisible?
Great point, because we now have algorithms that can retroactively infer people’s sexual identity based on their Facebook likes from, you know, 2005. We didn’t have it in 2005. So imagine the kind of data exhaust that we’re generating now could likely display weird health risks. The technology might not be here now but it might be here in five years.
The very first answer is that people need to stop trusting mathematics and they need to stop trusting black box algorithms. They need to start thinking to themselves. You know: Who owns this algorithm? What is their goal and is it aligned with mine? If they’re trying to profit off of me, probably the answer is no.
And then they should be able to demand some kind of consumer, or whatever, Bill of Rights for algorithms.
And that would be: Let me see my score, let me look at the data going into that score, let me contest incorrect data. Let me contest unfair data. You shouldn’t be able to use this data against me just because — going back to the criminal justice system — just because I was born in a high crime neighborhood doesn’t mean I should go to jail longer.
We have examples of rules like this . . . from anti-discrimination laws to various kinds of data privacy laws. They were written, typically, in the ’70s. They need to be updated and expanded for the age of big data.
And then, so finally I want data scientists themselves to stop hiding behind this facade of objectivity. It’s just … it’s over. The game, the game is up."