Siva Vaidhyanathan, The Guardian; Facebook's privacy meltdown after Cambridge Analytica is far from over
"Facebook might not be run by Bond villains. But it's run by people who have little knowledge of or concern for democracy or the dignity of the company's 2.3 billion users.
The privacy meltdown story should be about how one wealthy and powerful company gave our data without our permission to hundreds of companies with no transparency, oversight, or even concern about abuse.
Fortunately, the story does not end with Cambridge Analytica. The United States government revealed on Wednesday that it had opened a criminal investigation into Facebook over just these practices."
Issues and developments related to ethics, information, and technologies, examined in the ethics and intellectual property graduate courses I teach at the University of Pittsburgh School of Computing and Information. My Bloomsbury book "Ethics, Information, and Technology" will be published in Summer 2025. Kip Currier, PhD, JD
Showing posts with label Cambridge Analytica. Show all posts
Tuesday, March 19, 2019
Facebook's privacy meltdown after Cambridge Analytica is far from over; The Guardian, March 18, 2019
Tuesday, April 24, 2018
Cambridge University rejected Facebook study over 'deceptive' privacy standards; The Guardian, April 24, 2018
Matthew Weaver, The Guardian; Cambridge University rejected Facebook study over 'deceptive' privacy standards
"Exclusive: panel told researcher Aleksandr Kogan that Facebook’s approach fell ‘far below ethical expectations’
A Cambridge University ethics panel rejected research by the academic at the centre of the Facebook data harvesting scandal over the social network's "deceptive" approach to its users' privacy, newly released documents reveal."
Thursday, April 12, 2018
After Cambridge Analytica, Privacy Experts Get to Say ‘I Told You So’; April 12, 2018
Nellie Bowles, The New York Times; After Cambridge Analytica, Privacy Experts Get to Say ‘I Told You So’
"In their own lives, privacy experts are now fielding a spike in calls from their relatives asking them for advice about protecting their personal data. Engineers are discussing new privacy projects with them. Even teenagers are paying attention to what they have to say.
For many of the developers, this is the right time to push ahead with testing more privacy solutions, including more advanced advertising blockers, peer-to-peer browsers that decentralize the internet, new encryption techniques, and data unions that let users pool their data and sell it themselves. Others want to treat tech giants more as information fiduciaries, which have a legal responsibility to protect user data.
And for the first time, many privacy experts think internet users will be more willing to put up with a little more inconvenience in return for a lot more privacy.
“This is the first blink of awakening of the world to a danger that’s been present for a long time, which is that we are exposed,” Mr. Searls said. “Cambridge Analytica is old, old news to privacy folks. They’re just the tip of the crapberg.”"
Thursday, April 5, 2018
Sorry, Facebook was never ‘free’; The New York Post, March 21, 2018
John Podhoretz, The New York Post; Sorry, Facebook was never ‘free’
[Kip Currier: On today's MSNBC Morning Joe show, The New York Post's John Podhoretz pontificated on the same provocative assertions he made in his March 21, 2018 opinion piece, excerpted below. It's a post-Cambridge Analytica "open letter polemic" directed at anyone (or, to use Podhoretz's term, any fool) who signed up for Facebook "back in the day" and who may now be concerned about how free social media sites like Facebook use people's personal data, as well as how Facebook et al. enable third parties to "harvest", "scrape", and leverage that data.
Podhoretz's argument is flawed on so many levels it's challenging to know where to begin. (Full disclosure: As someone working in academia in a computing and information science school, who signed up for Facebook some years ago to see what all the "fuss" was about, I've never used my Facebook account because of ongoing privacy concerns about it. Much to the chagrin of some family and friends who have exhorted me, unsuccessfully, to use it.)
Certainly, there is some level of "ownership" each of us needs to take when we sign up for a social media site or app by clicking on the Terms and Conditions and/or End User License Agreement (EULA). But it's also common knowledge now (ridiculed by self-aware super-speed-talking advertisers in TV and radio ads!) that these agreements are written in legalese that doesn't fully convey the scope, and potential scope, of the ramifications of their terms and conditions. (Aside: For a clever satirical take on the purposeful impenetrability and abstruseness of these lawyer-crafted agreements, see R. Sikoryak's 2017 graphic novel Terms and Conditions, which visually lampoons an Apple iTunes user contract.)
Over the course of decades, for example, in the wake of the Tuskegee syphilis experiments and other medical research abuses and controversies, medical research practitioners were legally compelled to come to terms with the fact that laws, ethics, and policies about "informed consent" needed to evolve to better inform and protect "human subjects" (translation: you and me).
A similar argument can be made regarding Facebook and its social media kin: namely, that tech companies and app developers need to voluntarily adopt (or be required to adopt) HIPAA-esque protections and promote more "informed" consumer awareness. We also need more computer science ethics training and education for undergraduates, as well as more widespread digital citizenship education in K-12 settings, to ensure a level playing field of digital life awareness. (Hint, hint, Education Secretary Betsy DeVos or First Lady Melania Trump: here's a mission critical for your patronage.)
Podhoretz's simplistic Facebook-user-as-deplorable-fool rant puts all of the blame on users, while negating any responsibility for bait-and-switch tech companies like Facebook and data-sticky-fingered accomplices like Cambridge Analytica. "Free" doesn't mean tech companies and app designers should be free from the enhanced and reasonable informed consent responsibilities they owe to their users. Expecting or allowing anything less would be foolish.]
"The science fiction writer Robert A. Heinlein said it best: “There ain’t no such thing as a free lunch.” Everything has a cost. If you forgot that, or refused to see it in your relationship with Facebook, or believe any of these things, sorry, you are a fool. So the politicians and pundits who are working to soak your outrage for their own ideological purposes are gulling you. But of course you knew.
You just didn’t care . . . until you cared. Until, that is, you decided this was a convenient way of explaining away the victory of Donald Trump in the 2016 election.
You’re so invested in the idea that Trump stole the election, you are willing to believe anything other than that your candidate lost because she made a lousy argument and ran a lousy campaign and didn’t know how to run a race that would put her over the top in the Electoral College — which is how you prevail in a presidential election and has been for 220-plus years.
The rage and anger against Facebook over the past week provide just the latest examples of the self-infantilization and flight from responsibility on the part of the American people and the refusal of Trump haters and American liberals to accept the results of 2016.
Honestly, it’s time to stop being fools and start owning up to our role in all this."
Wednesday, March 28, 2018
Apple CEO Tim Cook slams Facebook: Privacy 'is a human right, it's a civil liberty'; NBC, March 28, 2018
Elizabeth Chuck and Chelsea Bailey, NBC; Apple CEO Tim Cook slams Facebook: Privacy 'is a human right, it's a civil liberty'
"Privacy to us is a human right. It's a civil liberty, and something that is unique to America. This is like freedom of speech and freedom of the press," Cook said. "Privacy is right up there with that for us."
His comments are consistent with Apple's long-held privacy stance — which the company stood by even in the face of a legal quarrel with the U.S. government a couple of years ago, when it refused to help the FBI unlock an iPhone belonging to the man responsible for killing 14 people in San Bernardino, California, in December 2015."
Cambridge Analytica controversy must spur researchers to update data ethics; Nature, March 27, 2018
Editorial, Nature; Cambridge Analytica controversy must spur researchers to update data ethics
"Ethics training on research should be extended to computer scientists who have not conventionally worked with human study participants.
Academics across many fields know well how technology can outpace its regulation. All researchers have a duty to consider the ethics of their work beyond the strict limits of law or today’s regulations. If they don’t, they will face serious and continued loss of public trust."
Saturday, March 24, 2018
‘A grand illusion’: seven days that shattered Facebook’s facade; Guardian, March 24, 2018
Olivia Solon, Guardian; ‘A grand illusion’: seven days that shattered Facebook’s facade
"For too long consumers have thought about privacy on Facebook in terms of whether their ex-boyfriends or bosses could see their photos. However, as we fiddle around with our profile privacy settings, the real intrusions have been taking place elsewhere.
“In this sense, Facebook’s ‘privacy settings’ are a grand illusion. Control over post-sharing – people we share to – should really be called ‘publicity settings’,” explains Jonathan Albright, the research director at the Tow Center for Digital Journalism. “Likewise, control over passive sharing – the information people [including third party apps] can take from us – should be called ‘privacy settings’.”
Essentially Facebook gives us privacy “busywork” to make us think we have control, while making it very difficult to truly lock down our accounts."
Computer science faces an ethics crisis. The Cambridge Analytica scandal proves it.; Boston Globe, March 22, 2018
Yonatan Zunger, Boston Globe;
Computer science faces an ethics crisis. The Cambridge Analytica scandal proves it.
"Software engineers continue to treat safety and ethics as specialities, rather than the foundations of all design; young engineers believe they just need to learn to code, change the world, disrupt something. Business leaders focus on getting a product out fast, confident that they will not be held to account if that product fails catastrophically. Simultaneously imagining their products as changing the world and not being important enough to require safety precautions, they behave like kids in a shop full of loaded AK-47s...
Underpinning all of these need to be systems for deciding on what computer science ethics should be, and how they should be enforced. These will need to be built by a consensus among the stakeholders in the field, from industry, to academia, to capital, and most importantly, among the engineers and the public, who are ultimately most affected. It must be done with particular attention to diversity of representation. In computer science, more than any other field, system failures tend to affect people in different social contexts (race, gender, class, geography, disability) differently. Familiarity with the details of real life in these different contexts is required to prevent disaster...
What stands between these is attention to the core questions of engineering: to what uses might a system be put? How might it fail? And how will it behave when it does? Computer science must step up to the bar set by its sister fields, before its own bridge collapse — or worse, its own Hiroshima."
Thursday, March 22, 2018
It’s Time to Regulate the Internet; The Atlantic, March 21, 2018
Franklin Foer, The Atlantic; It’s Time to Regulate the Internet
"If we step back, we can see it clearly: Facebook’s business model is the evisceration of privacy. That is, it aims to adduce its users into sharing personal information—what the company has called “radical transparency”—and then aims to surveil users to generate the insights that will keep them “engaged” on its site and to precisely target them with ads. Although Mark Zuckerberg will nod in the direction of privacy, he has been candid about his true feelings. In 2010 he said, for instance, that privacy is no longer a “social norm.” (Once upon a time, in a fit of juvenile triumphalism, he even called people “dumb fucks” for trusting him with their data.) And executives in the company seem to understand the consequence of their apparatus. When I recently sat on a panel with a representative of Facebook, he admitted that he hadn’t used the site for years because he was concerned with protecting himself against invasive forces.
We need to constantly recall this ideological indifference to privacy, because there should be nothing shocking about the carelessness revealed in the Cambridge Analytica episode...
Facebook turned data—which amounts to an X-ray of the inner self—into a commodity traded without our knowledge."
Monday, March 19, 2018
Where's Zuck? Facebook CEO silent as data harvesting scandal unfolds; Guardian, March 19, 2018
Julia Carrie Wong, Guardian; Where's Zuck? Facebook CEO silent as data harvesting scandal unfolds
"The chief executive of Facebook, Mark Zuckerberg, has remained silent over the more than 48 hours since the Observer revealed the harvesting of 50 million users’ personal data, even as his company is buffeted by mounting calls for investigation and regulation, falling stock prices, and a social media campaign to #DeleteFacebook...
Also on Monday, the New York Times reported that Facebook’s chief security officer, Alex Stamos, would be leaving the company following disagreements with other executives over the handling of the investigation into the Russian influence operation...
Stamos is one of a small handful of Facebook executives who addressed the data harvesting scandal on Twitter over the weekend while Zuckerberg and Facebook's chief operating officer, Sheryl Sandberg, said nothing."
[Kip Currier: Scott Galloway, clinical professor of marketing at the New York University Stern School of Business, made some strong statements about the Facebook/Cambridge Analytica data harvesting scandal on MSNBC's Stephanie Ruhle show yesterday.
Regarding Facebook's handling of the revelations to date: "This is a textbook example of how not to handle a crisis."
He referred to Facebook's leadership as "tone-deaf management" that initially denied a breach had occurred, and then subsequently deleted tweets saying that it was wrong to call what had occurred a breach. Galloway also said that "Facebook has embraced celebrity but refused to embrace its responsibilities". He contrasted Facebook's ineffectual current crisis management with how Johnson & Johnson demonstrated decisive leadership and accountability during the "tampered Tylenol bottles" crisis the latter faced in the 1980s.]
Data scandal is huge blow for Facebook – and efforts to study its impact on society; Guardian, March 18, 2018
Olivia Solon, Guardian; Data scandal is huge blow for Facebook – and efforts to study its impact on society
"The revelation that 50 million people had their Facebook profiles harvested so Cambridge Analytica could target them with political ads is a huge blow to the social network that raises questions about its approach to data protection and disclosure.
As Facebook executives wrangle on Twitter over the semantics of whether this constitutes a “breach”, the result for users is the same: personal data extracted from the platform and used for a purpose to which they did not consent.
Facebook has a complicated track record on privacy. Its business model is built on gathering data. It knows your real name, who your friends are, your likes and interests, where you have been, what websites you have visited, what you look like and how you speak."