Showing posts with label data collection and use. Show all posts

Tuesday, January 22, 2019

The AI Arms Race Means We Need AI Ethics; Forbes, January 22, 2019

Kasia Borowska, Forbes; The AI Arms Race Means We Need AI Ethics

"In an AI world, the currency is data. Consumers and citizens trade data for convenience and cheaper services. The likes of Facebook, Google, Amazon, Netflix and others process this data to make decisions that influence likes, the adverts we see, purchasing decisions or even who we vote for. There are questions to ask on the implications of everything we access, view or read being controlled by a few global elite. There are also major implications if small companies or emerging markets are unable to compete from being priced out of the data pool. This is why access to AI is so important: not only does it enable more positives from AI to come to the fore, but it also helps to prevent monopolies forming. Despite industry-led efforts, there are no internationally agreed ethical rules to regulate the AI market."

Sunday, January 6, 2019

Our privacy regime is broken. Congress needs to create new norms for a digital age.; The Washington Post, January 5, 2019

Saturday, January 5, 2019

How to protect your digital privacy from new Christmas presents; The Guardian, December 18, 2018

Alex Hern, The Guardian; How to protect your digital privacy from new Christmas presents

"Here are the best tips to protect your digital privacy, without resorting to Christmas gifts whittled from wood."

'Tracking every place you go': Weather Channel app accused of selling user data; Associated Press via The Guardian, January 4, 2019

Associated Press via The Guardian; 'Tracking every place you go': Weather Channel app accused of selling user data

"“Think how Orwellian it feels to live in a world where a private company is tracking potentially every place you go, every minute of every day,” Feuer said. “If you want to sacrifice to that company that information, you sure ought to be doing it with clear advanced notice of what’s at stake.”

A spokesman for IBM, which owns the app, said it had always been clear about the use of location data collected from users and will vigorously defend its “fully appropriate” disclosures."

Friday, December 21, 2018

What are tech companies doing about ethical use of data? Not much; The Conversation, November 27, 2018

The Conversation; What are tech companies doing about ethical use of data? Not much

"Our relationship with tech companies has changed significantly over the past 18 months. Ongoing data breaches, and the revelations surrounding the Cambridge Analytica scandal, have raised concerns about who owns our data, and how it is being used and shared.

Tech companies have vowed to do better. Following his grilling by both the US Congress and the EU Parliament, Facebook CEO, Mark Zuckerberg, said Facebook will change the way it shares data with third party suppliers. There is some evidence that this is occurring, particularly with advertisers.

But have tech companies really changed their ways? After all, data is now a primary asset in the modern economy.

To find whether there’s been a significant realignment between community expectations and corporate behaviour, we analysed the data ethics principles and initiatives that various global organisations have committed to since the various scandals broke.

What we found is concerning. Some of the largest organisations have not demonstrably altered practices, instead signing up to ethics initiatives that are neither enforced nor enforceable."

Facebook: A Case Study in Ethics; CMS Wire, December 20, 2018

Laurence Hart, CMS Wire; Facebook: A Case Study in Ethics

"It feels like every week, a news item emerges that could serve as a case study in ethics. A company's poor decision when exposed to the light of day (provided by the press) seems shockingly bad. The ethical choice in most cases should have been obvious, but it clearly wasn’t the one made.

This week, as in many weeks in 2018, the case study comes from Facebook."

Thursday, December 20, 2018

Facebook Didn’t Sell Your Data; It Gave It Away In exchange for even more data about you from Amazon, Netflix, Spotify, Microsoft, and others; The Atlantic, December 19, 2018

Alexis C. Madrigal, The Atlantic; Facebook Didn’t Sell Your Data; It Gave It Away
"By the looks of it, other tech players have been happy to let Facebook get beaten up while their practices went unexamined. And then, in this one story, the radioactivity of Facebook’s data hoard spread basically across the industry. There is a data-industrial complex, and this is what it looked like."

How You Can Help Fight the Information Wars: Silicon Valley won’t save us. We’re on our own.; The New York Times, December 18, 2018

Kara Swisher, The New York Times; How You Can Help Fight the Information Wars: Silicon Valley won’t save us. We’re on our own.

[Kip Currier: A rallying cry to all persons passionate about and/or working on issues related to information literacy and evaluating information...]

"For now, it’s not clear what we can do, except take control of our own individual news consumption. Back in July, in fact, Ms. DiResta advised consumer restraint as the first line of defense, especially when encountering information that any passably intelligent person could guess might have been placed by a group seeking to manufacture discord.

“They’re preying on your confirmation bias,” she said. “When content is being pushed to you, that’s something that you want to see. So, take the extra second to do the fact-check, even if it confirms your worst impulses about something you absolutely hate — before you hit the retweet button, before you hit the share button, just take the extra second.”

If we really are on our own in this age of information warfare, as the Senate reports suggest, there’s only one rule that can help us win it: See something, say nothing."

Facebook’s Data Sharing and Privacy Rules: 5 Takeaways From Our Investigation; The New York Times, December 18, 2018

Nicholas Confessore, Michael LaForgia and Gabriel J.X. Dance, The New York Times; Facebook’s Data Sharing and Privacy Rules: 5 Takeaways From Our Investigation

"You are the product: That is the deal many Silicon Valley companies offer to consumers. The users get free search engines, social media accounts and smartphone apps, and the companies use the personal data they collect — your searches, “likes,” phone numbers and friends — to target and sell advertising.

But an investigation by The New York Times, based on hundreds of pages of internal Facebook documents and interviews with about 50 former employees of Facebook and its partners, reveals that the marketplace for that data is even bigger than many consumers suspected. And Facebook, which collects more information on more people than almost any other private corporation in history, is a central player.

Here are five takeaways from our investigation."

As Facebook Raised a Privacy Wall, It Carved an Opening for Tech Giants; The New York Times, December 18, 2018

Gabriel J.X. Dance, Michael LaForgia and Nicholas Confessore, The New York Times; As Facebook Raised a Privacy Wall, It Carved an Opening for Tech Giants

"For years, Facebook gave some of the world’s largest technology companies more intrusive access to users’ personal data than it has disclosed, effectively exempting those business partners from its usual privacy rules, according to internal records and interviews.

The special arrangements are detailed in hundreds of pages of Facebook documents obtained by The New York Times. The records, generated in 2017 by the company’s internal system for tracking partnerships, provide the most complete picture yet of the social network’s data-sharing practices. They also underscore how personal data has become the most prized commodity of the digital age, traded on a vast scale by some of the most powerful companies in Silicon Valley and beyond."

Sunday, December 2, 2018

I Wanted to Stream Buffy, Angel, and Firefly for Free, But Not Like This; Gizmodo, November 30, 2018

Alex Cranz, Gizmodo; I Wanted to Stream Buffy, Angel, and Firefly for Free, But Not Like This

"This is TV that should be accessible to everyone, but Facebook Watch? Really? In order to watch Buffy take on a demon with a rocket launcher you have to be willing to sit there and stare at a video on the Facebook platform—the same place your cousin continues to post Daily Caller Trump videos and that friend from high school shares clips of a Tasty casserole made of butter, four tubes of biscuit dough, baked beans, and a hot dog? The price for complimentary access to three of the best shows produced is bargaining away your data and privacy?

No, thanks.

But Facebook is hoping we’ll all say yes, please. Facebook’s user growth in the U.S. notably hit a wall over the summer and it’s been trying to fix things. It’s also trying to make itself more “sticky,” so people stay on Facebook to get not just family and friend updates and memes, but also the streams and standard videos more commonly found on YouTube. Last year Facebook launched Watch, its YouTube competitor that was, from the start, filled with trash. But things have slowly improved, with the show Sorry for Your Loss gaining rave reviews."

Sunday, November 18, 2018

To regulate AI we need new laws, not just a code of ethics; The Guardian, October 28, 2018

Paul Chadwick, The Guardian; To regulate AI we need new laws, not just a code of ethics

"For a sense of Facebook’s possible future EU operating environment, Zuckerberg should read the Royal Society’s new publication about the ethical and legal challenges of governing artificial intelligence. One contribution is by a senior European commission official, Paul Nemitz, principal adviser, one of the architects of the EU’s far-reaching General Data Protection Regulation, which took effect in May this year.

Nemitz makes clear the views are his own and not necessarily those of the European commission, but the big tech companies might reasonably see his article, entitled “Constitutional democracy and technology in the age of artificial intelligence”, as a declaration of intent.

“We need a new culture of technology and business development for the age of AI which we call ‘rule of law, democracy and human rights by design’,” Nemitz writes. These core ideas should be baked into AI, because we are entering “a world in which technologies like AI become all pervasive and are actually incorporating and executing the rules according to which we live in large part”.

To Nemitz, “the absence of such framing for the internet economy has already led to a widespread culture of disregard of the law and put democracy in danger, the Facebook Cambridge Analytica scandal being only the latest wake-up call”."

Thursday, October 25, 2018

Apple’s Tim Cook blasts Silicon Valley over privacy issues; The Washington Post, October 24, 2018

Tony Romm, The Washington Post; Apple’s Tim Cook blasts Silicon Valley over privacy issues

"Apple chief executive Tim Cook on Wednesday warned the world’s most powerful regulators that the poor privacy practices of some tech companies, the ills of social media and the erosion of trust in his own industry threaten to undermine “technology’s awesome potential” to address challenges such as disease and climate change."

Wednesday, October 3, 2018

Why you need a code of ethics (and how to build one that sticks); CIO, September 17, 2018

Josh Fruhlinger, CIO; Why you need a code of ethics (and how to build one that sticks)

"Importance of a code of ethics

Most of us probably think of ourselves as ethical people. But within organizations built to maximize profits, many seemingly inevitably drift towards more dubious behavior, especially when it comes to user personal data. "More companies than not are collecting data just for the sake of collecting data, without having any reason as to why or what to do with it," says Philip Jones, a GDPR regulatory compliance expert at Capgemini. "Although this is an expensive and unethical approach, most businesses don’t think twice about it. I view this approach as one of the highest risks to companies today, because they have no clue where, how long, or how accurate much of their private data is on consumers."

This is the sort of organizational ethical drift that can arise in the absence of clear ethical guidelines—and it's the sort of drift that laws like the GDPR, the EU's stringent new framework for how companies must handle customer data, are meant to counter."

Thursday, September 6, 2018

From Mountain of CCTV Footage, Pay Dirt: 2 Russians Are Named in Spy Poisoning; The New York Times, September 5, 2018

Ellen Barry, The New York Times; From Mountain of CCTV Footage, Pay Dirt: 2 Russians Are Named in Spy Poisoning

[Kip Currier: Fascinating example of good old-fashioned, "methodical, plodding" detective work, combined with 21st century technologies of mass surveillance and facial recognition by machines and gifted humans.

As I think about the chapters on privacy and surveillance in the ethics textbook I'm writing, this story is a good reminder of the socially positive aspects of new technologies, amid often legitimate concerns about their demonstrated and potential downsides. In the vein of prior stories I've posted on this blog about the use, for example, of drones for animal conservation and monitoring efforts, the identification of the two Russian operatives in the Salisbury, UK poisoning case highlights how the uses and applications of digital age technologies like mass surveillance frequently fall outside the lines of "all bad" or "all good".]

"“It’s almost impossible in this country to hide, almost impossible,” said John Bayliss, who retired from the Government Communications Headquarters, Britain’s electronic intelligence agency, in 2010. “And with the new software they have, you can tell the person by the way they walk, or a ring they wear, or a watch they wear. It becomes even harder.”

The investigation into the Skripal poisoning, known as Operation Wedana, will stand as a high-profile test of an investigative technique Britain has pioneered: accumulating mounds of visual data and sifting through it...

Ceri Hurford-Jones, the managing director of Salisbury’s local radio station, saluted investigators for their “sheer skill in getting a grip on this, and finding out who these people were.”

It may not have been the stuff of action films, but Mr. Hurford-Jones did see something impressive about the whole thing.

“It’s methodical, plodding,” he said. “But, you know, that’s the only way you can do these things. There is a bit of Englishness in it.”"

Sunday, August 5, 2018

Interview: Yuval Noah Harari: ‘The idea of free information is extremely dangerous’; The Guardian, August 5, 2018

Andrew Anthony, The Guardian; Interview: Yuval Noah Harari: ‘The idea of free information is extremely dangerous’

"Why is liberalism under particular threat from big data?
Liberalism is based on the assumption that you have privileged access to your own inner world of feelings and thoughts and choices, and nobody outside you can really understand you. This is why your feelings are the highest authority in your life and also in politics and economics – the voter knows best, the customer is always right. Even though neuroscience shows us that there is no such thing as free will, in practical terms it made sense because nobody could understand and manipulate your innermost feelings. But now the merger of biotech and infotech in neuroscience and the ability to gather enormous amounts of data on each individual and process them effectively means we are very close to the point where an external system can understand your feelings better than you. We’ve already seen a glimpse of it in the last epidemic of fake news.

There’s always been fake news but what’s different this time is that you can tailor the story to particular individuals, because you know the prejudice of this particular individual. The more people believe in free will, that their feelings represent some mystical spiritual capacity, the easier it is to manipulate them, because they won’t think that their feelings are being produced and manipulated by some external system...

You say if you want good information, pay good money for it. The Silicon Valley adage is information wants to be free, and to some extent the online newspaper industry has followed that. Is that wise?
The idea of free information is extremely dangerous when it comes to the news industry. If there’s so much free information out there, how do you get people’s attention? This becomes the real commodity. At present there is an incentive in order to get your attention – and then sell it to advertisers and politicians and so forth – to create more and more sensational stories, irrespective of truth or relevance. Some of the fake news comes from manipulation by Russian hackers but much of it is simply because of the wrong incentive structure. There is no penalty for creating a sensational story that is not true. We’re willing to pay for high quality food and clothes and cars, so why not high quality information?"

Tuesday, July 31, 2018

Big Data Is Getting Bigger. So Are the Privacy and Ethical Questions.; The Chronicle of Higher Education, July 31, 2018

Goldie Blumenstyk, The Chronicle of Higher Education; Big Data Is Getting Bigger. So Are the Privacy and Ethical Questions.

"Big data is getting bigger. So are the privacy and ethical questions.

The next step in using “big data” for student success is upon us. It’s a little cool. And also kind of creepy.

This new approach goes beyond the tactics now used by hundreds of colleges, which depend on data collected from sources like classroom teaching platforms and student-information systems. It not only makes a technological leap; it also raises issues around ethics and privacy.

Here’s how it works: Whenever you log on to a wireless network with your cellphone or computer, you leave a digital footprint. Move from one building to another while staying on the same network, and that network knows how long you stayed and where you went. That data is collected continuously and automatically from the network’s various nodes.

Now, with the help of a company called Degree Analytics, a few colleges are beginning to use location data collected from students’ cellphones and laptops as they move around campus. Some colleges are using it to improve the kind of advice they might send to students, like a text-message reminder to go to class if they’ve been absent."

Thursday, July 19, 2018

“I Was Devastated”: Tim Berners-Lee, the Man Who Created the World Wide Web, Has Some Regrets; Vanity Fair, July 1, 2018

Katrina Brooker, Vanity Fair; “I Was Devastated”: Tim Berners-Lee, the Man Who Created the World Wide Web, Has Some Regrets
"For now, chastened by bad press and public outrage, tech behemoths and other corporations say they are willing to make changes to ensure privacy and protect their users. “I’m committed to getting this right,” Facebook’s Zuckerberg told Congress in April. Google recently rolled out new privacy features to Gmail which would allow users to control how their messages get forwarded, copied, downloaded, or printed. And as revelations of spying, manipulation, and other abuses emerge, more governments are pushing for change. Last year the European Union fined Google $2.7 billion for manipulating online shopping markets. This year new regulations will require it and other tech companies to ask for users’ consent for their data. In the U.S., Congress and regulators are mulling ways to check the powers of Facebook and others.

But laws written now don’t anticipate future technologies. Nor do lawmakers—many badgered by corporate lobbyists—always choose to protect individual rights. In December, lobbyists for telecom companies pushed the Federal Communications Commission to roll back net-neutrality rules, which protect equal access to the Internet. In January, the U.S. Senate voted to advance a bill that would allow the National Security Agency to continue its mass online-surveillance program. Google’s lobbyists are now working to modify rules on how companies can gather and store biometric data, such as fingerprints, iris scans, and facial-recognition images."

Friday, June 29, 2018

California Just Passed the Strictest Online Privacy Bill in the Country; Slate, June 28, 2018

April Glaser, Slate; California Just Passed the Strictest Online Privacy Bill in the Country

"California passed one of the toughest online privacy bills in the country Thursday, despite lobbying by Facebook, Google, Microsoft, Amazon, AT&T, and others, who poured money into an industry-aligned group that tried to defeat the measure.

If signed by Gov. Jerry Brown, the bill, the California Consumer Privacy Act of 2018, will require technology companies that collect user information to disclose the type of data they collect, details on the advertisers or other third parties with which they share data, and allow customers to opt out of having the data collected about them sold. The new bill also gives customers the option to request companies delete personal information collected on them—like data on how many kids a person has, their buying habits, location information, or other non-publicly available data. Companies that do peddle user data have to offer the new privacy options for free and won’t be allowed to degrade service if a customer opts to no longer have their data sold."