Wednesday, April 4, 2018

Grindr Sets Off Privacy Firestorm After Sharing Users’ H.I.V.-Status Data; The New York Times, April 3, 2018

Natasha Singer, The New York Times; Grindr Sets Off Privacy Firestorm After Sharing Users’ H.I.V.-Status Data

"Grindr, the social network aimed at gay, bisexual and transgender men, is facing a firestorm of criticism for sharing users’ H.I.V. status, sexual tastes and other intimate personal details with outside software vendors.

The data sharing, made public by European researchers on Saturday and reported by BuzzFeed on Monday, set off an outcry from many users. By Monday night, the company said it would stop sharing H.I.V. data with outside companies."

Saturday, March 31, 2018

Promises, promises: Facebook’s history with privacy; Washington Post, March 30, 2018

Ryan Nakashima, Washington Post; Promises, promises: Facebook’s history with privacy

"“We’ve made a bunch of mistakes.” ‘’Everyone needs complete control over who they share with at all times.” ‘’Not one day goes by when I don’t think about what it means for us to be the stewards of this community and their trust.”

Sound familiar? It’s Facebook CEO Mark Zuckerberg addressing a major privacy breach — seven years ago.

Lawmakers in many countries may be focused on Cambridge Analytica’s alleged improper use of Facebook data, but the social network’s privacy problems go back more than a decade. Here are some of the company’s most notable missteps and promises around privacy."

Wednesday, March 28, 2018

Beware the smart toaster: 18 tips for surviving the surveillance age; Guardian, March 28, 2018

Alex Hern and Arwa Mahdawi, Guardian; Beware the smart toaster: 18 tips for surviving the surveillance age

"Awareness of our digital footprint is one thing, but what are we to do about it? In the wake of the Facebook revelations, it’s clear that we can’t all keep clicking as usual if we value our privacy or our democracy. It’s still relatively early in the internet era and we are all still figuring it out as we go along. However, best practices when it comes to security and online etiquette are starting to emerge. Here’s a guide to some of the new rules of the internet."

Facebook Changing Privacy Controls As Criticism Escalates; The Two-Way, NPR, March 28, 2018

Yuki Noguchi, The Two-Way, NPR; Facebook Changing Privacy Controls As Criticism Escalates

"Facebook responded to intensifying criticism over its mishandling of user data Wednesday by announcing new features to its site that will give users more visibility and control over how their information is shared. The changes, rolling out in coming weeks, will also enable users to prevent the social network from sharing that information with advertisers and other third parties.

"Last week showed how much more work we need to do to enforce our policies and help people understand how Facebook works and the choices they have over their data," Facebook Chief Privacy Officer Erin Egan and Deputy General Counsel Ashlie Beringer wrote in a statement.

"We've heard loud and clear that privacy settings and other important tools are too hard to find and that we must do more to keep people informed," they said."

Are you ready? This is all the data Facebook and Google have on you; Guardian, March 28, 2018

Dylan Curran, Guardian; Are you ready? This is all the data Facebook and Google have on you

"Want to freak yourself out? I’m going to show just how much of your information the likes of Facebook and Google store about you without you even realising it."

Tuesday, March 27, 2018

The Six Data Privacy Principles of the GDPR; Corporate Counsel, March 26, 2018

Amy Lewis, Corporate Counsel; The Six Data Privacy Principles of the GDPR

"Data privacy and personal data breaches have been in the news a lot recently. Over the past few years, companies have been collecting and processing ever-increasing amounts of data about their customers, employees, and users. As personal data becomes more valuable, governments around the world have begun the debate surrounding whether this data collection should be limited in favor of individuals’ fundamental right to privacy.

The General Data Protection Regulation (GDPR) is the European Union’s answer to these debates. This new regulation strives to take the decisions regarding some uses of personal data out of the hands of companies and return control to the individuals that the data refer to—the data subjects. Any company that has a European presence or handles European residents’ personal data is subject to the GDPR. These companies will likely need to upgrade their data security and privacy procedures to meet the personal data handling requirements of the GDPR.

The GDPR’s data privacy goals can be summarized in six personal data processing principles: Lawfulness, Fairness and Transparency; Purpose Limitation; Data Minimization; Accuracy; Integrity and Confidentiality; and Storage Limitation."

Thursday, March 22, 2018

It’s Time to Regulate the Internet; The Atlantic, March 21, 2018

Franklin Foer, The Atlantic; It’s Time to Regulate the Internet

"If we step back, we can see it clearly: Facebook’s business model is the evisceration of privacy. That is, it aims to adduce its users into sharing personal information—what the company has called “radical transparency”—and then aims to surveil users to generate the insights that will keep them “engaged” on its site and to precisely target them with ads. Although Mark Zuckerberg will nod in the direction of privacy, he has been candid about his true feelings. In 2010 he said, for instance, that privacy is no longer a “social norm.” (Once upon a time, in a fit of juvenile triumphalism, he even called people “dumb fucks” for trusting him with their data.) And executives in the company seem to understand the consequence of their apparatus. When I recently sat on a panel with a representative of Facebook, he admitted that he hadn’t used the site for years because he was concerned with protecting himself against invasive forces.

We need to constantly recall this ideological indifference to privacy, because there should be nothing shocking about the carelessness revealed in the Cambridge Analytica episode...

Facebook turned data—which amounts to an X-ray of the inner self—into a commodity traded without our knowledge."

Thursday, February 22, 2018

Tech's biggest companies are spreading conspiracy theories. Again.; CNN, February 21, 2018

Seth Fiegerman, CNN; Tech's biggest companies are spreading conspiracy theories. Again.

"To use Silicon Valley's preferred parlance, it's now hard to escape the conclusion that the spreading of misinformation and hoaxes is a feature, not a bug, of social media platforms -- and their business models.

Facebook and Google built incredibly profitable businesses by serving content they don't pay for or vet to billions of users, with ads placed against that content. The platforms developed better and better targeting to buoy their ad businesses, but not necessarily better content moderation to buoy user discourse."

Monday, May 15, 2017

Can you teach ethics to algorithms?; CIO, May 15, 2017

James Maclennan, CIO; Can you teach ethics to algorithms?

"The challenges of privacy

Addressing bias is a challenge, but most people understand that discrimination and bias are bad. What happens when we get into trickier ethical questions such as privacy?
Just look at Facebook and Google, two companies that have mountains of information on you. A recent report uncovered that Facebook “can figure out when people as young as 14 feel ‘defeated,’ ‘overwhelmed’ and ‘a failure.’” This information is gathered by a Facebook analysis system, and it is really easy to see how such information could be abused.
The fact that the information uncovered by such an algorithm could be so easily abused does not make the algorithm morally wrong. Facebook decided to create the algorithm without considering the ethical implications of manipulating depressed teenagers to buy more stuff, and thus the responsibility falls on Facebook and not the algorithm. 
Facebook at minimum needs to encourage its own technological staff to think about the ethical consequences of any new algorithm they construct. If Facebook and other technological companies fail to consider protecting user privacy when constructing algorithms, then the government may have to step in to ensure that people’s rights are protected."

Monday, April 24, 2017

Unroll.me head 'heartbroken' that users found out it sells their inbox data; Guardian, April 24, 2017

Alex Hern, Guardian; Unroll.me head 'heartbroken' that users found out it sells their inbox data

"Following the story, Unroll.me’s CEO and co-founder Jojo Hedaya wrote a corporate blogpost in which he expressed contrition. But while he said it was “heartbreaking”, he was not talking about the sale of customer data: instead, he said he felt bad “to see that some of our users were upset to learn about how we monetise our free service”.

He added: “the reality is most of us – myself included – don’t take the time to thoroughly review” terms of service agreements or privacy policies."

Tuesday, April 4, 2017

EFF Says No to So-Called “Moral Rights” Copyright Expansion; Electronic Frontier Foundation (EFF), March 30, 2017

Kerry Sheehan and Kit Walsh, Electronic Frontier Foundation (EFF); EFF Says No to So-Called “Moral Rights” Copyright Expansion

"The fight over moral rights, particularly the right of Integrity, is ultimately one about who gets to control the meaning of a particular work. If an author can prevent a use they perceive as a “prejudicial distortion” of their work, that author has the power to veto others’ attempts to contest, reinterpret, criticize, or draw new meanings from those works...

A statutory right of attribution could also interfere with privacy-protective measures employed by online platforms. Many platforms strip identifying metadata from works on their platforms to protect their users' privacy. If doing so were to trigger liability for violating an author’s right of attribution, platforms would likely be chilled from protecting their users’ privacy in this way.

For centuries, American courts have grappled with how to address harm to reputation without impinging on the freedom of speech guaranteed by the First Amendment. And as copyright’s scope has expanded in recent decades, the courts have provided the safeguards that partially mitigate the harm of overly broad speech regulation."

Friday, November 25, 2016

Facebook doesn't need to ban fake news to fight it; Guardian, 11/25/16

Alex Hern, Guardian; Facebook doesn't need to ban fake news to fight it:
"Those examples are the obvious extreme of Facebook’s problem: straightforward hoaxes, mendaciously claiming to be sites that they aren’t. Dealing with them should be possible, and may even be something the social network can tackle algorithmically, as it prefers to do.
But they exist at the far end of a sliding scale, and there’s little agreement on where to draw the line. Open questions like this explain why many are wary of pushing Facebook to “take action” against fake news. “Do we really want Facebook exercising this sort of top-down power to determine what is true or false?” asks Politico’s Jack Shafer. “Wouldn’t we be revolted if one company owned all the newsstands and decided what was proper and improper reading fare?”
The thing is, Facebook isn’t like the newsstands. And it’s the differences between the two that are causing many of the problems we see today."

Tuesday, September 27, 2016

Lessons from zombie warfare can help us beat hackers at their own game; Quartz, 9/26/16

Patrick Lin, Quartz; Lessons from zombie warfare can help us beat hackers at their own game:
"The current lack of respect for the power and vulnerabilities of our computing devices has helped create the debate over hacking back and other security issues. To be fair, the internet wasn’t designed for security when it was created decades ago, but only for a small group of researchers who trusted one another. That circle of trust has long been breached. We now need more vigilant and prepared users to help prevent cyberattacks from landing in the first place, making moot the decision to hack back.
Therefore, to truly address cybersecurity, we may need to seriously consider requiring computer users to have special training and licensing, or at the very least to keep up with basic hygiene requirements. Firearms and automobiles also have a high potential for misuse, so they require proper training and licensing. The US Federal Aviation Administration just required aerial drones to be registered, similarly recognizing that drone operation can be both recreational and dangerous.
Perhaps this solution is too radical to work. A new report on the ethics of hacking back, released today (Sept. 26) by the Ethics + Emerging Sciences Group based at Cal Poly, explores other possibilities. But a radical change of perspective may be what’s needed to solve such a relentless problem, and the right metaphor may be able to inspire that paradigm shift."

Wednesday, August 31, 2016

Am I a good person? You asked Google – here’s the answer; Guardian, 8/31/16

Andrew Brown, Guardian; Am I a good person? You asked Google – here’s the answer:
"The beginning of being a good person is the knowledge that you may not be, or that you have acted as a bad one would. After that it gets complicated.
The most obvious complication, perhaps, is that there is no agreement on what constitutes a good person. In fact there’s no agreement on whether we should even agree who a good person is...
The only certain thing about this question is that if you’ve never thought to ask it, the answer has to be “no”."

Saturday, June 11, 2016

New York Times Says Fair Use Of 300 Words Will Run You About $1800; TechDirt, 6/10/16

Tim Cushing, TechDirt; New York Times Says Fair Use Of 300 Words Will Run You About $1800:
"Fair use is apparently the last refuge of a scofflaw. Following on the heels of a Sony rep's assertion that people could avail themselves of fair use for the right price, here comes the New York Times implying fair use not only does not exist, but that it runs more than $6/word.
Obtaining formal permission to use three quotations from New York Times articles in a book ultimately cost two professors $1,884. They’re outraged, and have taken to Kickstarter — in part to recoup the charges, but primarily, they say, to “protest the Times’ and publishers’ lack of respect for Fair Use.”
These professors used quotes from other sources in their book about press coverage of health issues, but only the Gray Lady stood there with her hand out, expecting nearly $2,000 in exchange for three quotes totalling less than 300 words.
The professors paid, but the New York Times "policy" just ensures it will be avoided by others looking to source quotes for their publications. The high rate it charges (which it claims is a "20% discount") for fair use of its work will be viewed by others as proxy censorship. And when censorship of this sort rears its head, most people just route around it. Other sources will be sought and the New York Times won't be padding its bottom line with ridiculous fees for de minimis use of its articles.
The authors' Kickstarter isn't so much to pay off the Times, but more to raise awareness of the publication's unwillingness to respect fair use."