Showing posts with label trade-offs. Show all posts

Saturday, April 28, 2018

Data on a genealogy site led police to the ‘Golden State Killer’ suspect. Now others worry about a ‘treasure trove of data’; The Washington Post, April 27, 2018

Justin Jouvenal, Mark Berman, Drew Harwell and Tom Jackman, The Washington Post; Data on a genealogy site led police to the ‘Golden State Killer’ suspect. Now others worry about a ‘treasure trove of data’

"Prosecutors say they see the private genealogical databases as an investigative gold mine, and they worry that privacy concerns could block them from the breakthroughs needed to track down future predators.

“Why in God’s name would we come up with a reason that we not be able to use it, on the argument that it intrudes onto someone’s privacy?” said Josh Marquis of the National District Attorneys Association. “Everything’s a trade-off. Obviously we want to preserve privacy. But on the other hand, if we’re able to use this technology without exposing someone’s deepest, darkest secrets, while solving these really horrible crimes, I think it’s a valid trade-off.”

Some legal experts compared the use of public genetic databases to the way authorities can scan other personal data provided to third-party sources, including telephone companies and banks. Others suggested further scrutiny as the amount of publicly available DNA multiplies.

“The law often lags behind where technology has evolved,” said Barbara McQuade, a University of Michigan law professor and former U.S. attorney. With DNA, “most of us have the sense that that feels very private, very personal, and even if you have given it up to one of these third-party services, maybe there should be a higher level of security.”"

Monday, February 20, 2017

The big moral dilemma facing self-driving cars; Washington Post, February 20, 2017

Steven Overly, Washington Post; The big moral dilemma facing self-driving cars

"Researchers at the University of Pennsylvania have dubbed this “algorithm aversion.” In a 2014 study, participants were asked to observe a computer and a human make predictions about the future, such as how a student would perform based on past test scores. Researchers found that “people more quickly lose confidence in algorithmic than human forecasters after seeing them make the same mistake.”"

Friday, January 15, 2016

Americans don't trust companies with their data, but give it up anyway; FedScoop, January 14, 2016

Greg Otto, FedScoop; Americans don't trust companies with their data, but give it up anyway:
"According to research unveiled Thursday, most Americans are willing to sacrifice their data privacy if they believe it will somehow benefit them; otherwise they are resigned to the fact they have no control over their personal information if they plan on being a consumer in the modern, Internet-connected, data-driven retail space.
The former conclusion is suggested by a study released by the Pew Research Center Thursday, which measured attitudes to privacy and surveillance among a nationally representative and statistically valid sample of adult Americans, and then explored the results in a series of focus groups.
A majority of respondents told Pew they have low levels of confidence in both business and government when it comes to data collection; but many were prepared to put their concerns aside if there was a trade-off — some benefit they got in return. Supermarket loyalty cards, for instance, track holders' shopping preferences in return for discounts on products — a trade-off seen as acceptable by 47 percent of survey respondents.
By contrast, research presented at the Federal Trade Commission’s PrivacyCon Thursday suggested that marketers are taking advantage of the widespread acceptance of certain kinds of trade-offs to collect massive amounts of data — and thereby inspiring cynicism and resignation among Americans about their privacy."