Showing posts with label controversial Facebook "emotion manipulation" research study. Show all posts

Wednesday, September 17, 2014

Data Scientists Want Big Data Ethics Standards; Information Week, 9/17/14

Jeff Bertolucci, Information Week; Data Scientists Want Big Data Ethics Standards:
"The vast majority of statisticians and data scientists believe that consumers should worry about privacy issues related to data being collected on them, and most have qualms about the questionable ethics behind Facebook's undisclosed psychological experiment on its users in 2012.
Those are just two of the findings from a Revolution Analytics survey of 144 data scientists at JSM (Joint Statistical Meetings) 2014, an annual gathering of statisticians, to gauge their thoughts on big data ethics. The Boston conference ran Aug. 2-7."

Wednesday, July 2, 2014

Facebook’s Secret Manipulation of User Emotions Faces European Inquiries; New York Times, 7/2/14

Facebook’s Secret Manipulation of User Emotions Faces European Inquiries:
"In response to widespread public anger, several European data protection agencies are examining whether Facebook broke local privacy laws when it conducted the weeklong investigation in January 2012.
That includes Ireland’s Office of the Data Protection Commissioner, which regulates Facebook’s global operations outside North America because the company has its international headquarters in Dublin. The Irish regulator has sent a series of questions to Facebook related to potential privacy issues, including whether the company got consent from users for the study, according to a spokeswoman.
The Information Commissioner’s Office of Britain also said that it was looking into potential privacy breaches that may have affected the country’s residents, though a spokesman of the office said that it was too early to know whether Facebook had broken the law. It is unknown where the users who were part of the experiment were located. Some 80 percent of Facebook’s 1.2 billion users are based outside North America...
The Federal Trade Commission, the American regulator that oversees Facebook’s conduct under a 20-year consent decree, has not publicly expressed similar interest in the case, which has caused an uproar over the company’s ethics and prompted the lead researcher on the project to apologize."

Here Are All the Other Experiments Facebook Plans to Run on You: An exclusive preview; Slate, 6/30/14

David Auerbach, Slate; Here Are All the Other Experiments Facebook Plans to Run on You: An exclusive preview:
"Facebook and two outside social scientists recently published a scientific paper in which they revealed that they had manipulated users’ news feeds to tweak their emotions. Since then, there has been a growing debate over the ethics and practice of Facebook experimenting on its users, as chronicled by Slate’s Katy Waldman. In response to these concerns, this morning Facebook issued the following press release—although I seem to be the only journalist who has received it. Coming hot on the heels of Facebook’s carefully unapologetic defense of its emotion research on its users, I share the press release as a glimpse of Facebook’s future directions in its user experiments."

Should Facebook Manipulate Users?: Jaron Lanier on Lack of Transparency in Facebook Study; New York Times, 6/30/14

Jaron Lanier, New York Times; Should Facebook Manipulate Users?: Jaron Lanier on Lack of Transparency in Facebook Study:
"Research with human subjects is generally governed by strict ethical standards, including the informed consent of the people who are studied. Facebook’s generic click-through agreement, which almost no one reads and which doesn’t mention this kind of experimentation, was the only form of consent cited in the paper. The subjects in the study still, to this day, have not been informed that they were in the study. If there had been federal funding, such a complacent notion of informed consent would probably have been considered a crime. Subjects would most likely have been screened so that those at special risk would be excluded or handled with extra care.
This is only one early publication about a whole new frontier in the manipulation of people, and Facebook shouldn’t be singled out as a villain. All researchers, whether at universities or technology companies, need to focus more on the ethics of how we learn to improve our work.
To promote the relevance of their study, the researchers noted that emotion was relevant to human health, and yet the study didn’t measure any potential health effects of the controlled manipulation of emotions."

Facebook Study Sparks Soul-Searching and Ethical Questions: Incident Shines Light on How Companies, Researchers Tap Data Created Online; Wall Street Journal, 6/30/14

Reed Albergotti and Elizabeth Dwoskin, Wall Street Journal; Facebook Study Sparks Soul-Searching and Ethical Questions: Incident Shines Light on How Companies, Researchers Tap Data Created Online:

Tuesday, July 1, 2014

Facebook Added 'Research' To User Agreement 4 Months After Emotion Manipulation Study; Forbes, 6/30/14

Kashmir Hill, Forbes; Facebook Added 'Research' To User Agreement 4 Months After Emotion Manipulation Study:
"Unless you’ve spent the last couple of days in a Faraday pouch under a rock, you’ve heard about Facebook’s controversial ‘emotion manipulation’ study. Facebook data scientist Adam Kramer ran an experiment on 689,003 Facebook users two and a half years ago to find out whether emotions were contagious on the social network. It lasted for a week in January 2012.
It came to light recently when he and his two co-researchers from Cornell University and University of California-SF published their study describing how users’ moods changed when Facebook curated the content of their News Feeds to highlight the good, happy stuff (for the lucky group) vs. the negative, depressing stuff (for the unlucky and hopefully-not-clinically-depressed group). The idea of Facebook manipulating users’ emotions for science — without telling them or explicitly asking them first — rubbed many the wrong way.
Critics said Facebook should get “informed consent” for a study like this — asking people if they’re okay being in a study and then telling them what was being studied afterwards. Defenders said, “Hey, the Newsfeed gets manipulated all the time. What’s the big deal?”
Critics and defenders alike pointed out that Facebook’s “permission” came from its Data Use Policy which among its thousands of words informs people that their information might be used for “internal operations,” including “research.” However, we were all relying on what Facebook’s data policy says now. In January 2012, the policy did not say anything about users potentially being guinea pigs made to have a crappy day for science, nor that “research” is something that might happen on the platform.
Four months after this study happened, in May 2012, Facebook made changes to its data use policy, and that’s when it introduced this line about how it might use your information: “For internal operations, including troubleshooting, data analysis, testing, research and service improvement.”"