Tuesday, July 8, 2014

Should Germans Read ‘Mein Kampf’?; New York Times, 7/7/14

Peter Ross Range, New York Times; Should Germans Read ‘Mein Kampf’?:
"Since then, although “Mein Kampf” has maintained a shadow presence — on the back shelves of used bookstores and libraries and, more recently, online — its copyright holder, the state of Bavaria, has refused to allow its republication, creating an aura of taboo around the book.
All that is about to change. Bavaria’s copyright expires at the end of 2015; after that, anyone can publish the book: a quality publisher, a mass-market pulp house, even a neo-Nazi group.
The release of “Mein Kampf” into Germany’s cultural bloodstream is sure to be a sensational moment. In a nation that still avidly buys books — and loves to argue in public — the book will again ignite painful intergenerational debates on talk shows and in opinion pages about how parents and grandparents let themselves be so blindly misled.
Like the 1996 uproar caused by Daniel Jonah Goldhagen’s controversial book “Hitler’s Willing Executioners,” which accused ordinary Germans of being capable of mass-murdering Jews, this publishing event will shape contemporary politics and feed Germany’s deep-rooted postwar pacifism."

Friday, July 4, 2014

Big Data Comes To College; NPR, 7/4/14

Anya Kamenetz, NPR; Big Data Comes To College:
"So academics are scrambling to come up with rules and procedures for gathering and using student data—and manipulating student behavior.
"This is a huge opportunity for science, but it also brings very large ethical puzzles," says Dr. Mitchell Stevens, director of digital research and planning at Stanford University's Graduate School of Education. "We are at an unprecedented moment in the history of the human sciences, in which massive streams of information about human activity are produced continuously through online interaction."
Experts say the ethical considerations are lagging behind the practice. "There's a ton of research being done...[yet] if you do a search on ethics and analytics I think you'll get literally seven or eight articles," says Pistilli, who is the author of one of them.
Large Ethical Puzzles
In June, Stevens helped convene a gathering to produce a set of guidelines for this research. The Asilomar Convention was in the spirit of the Belmont Report of 1979, which created the rules in use today to evaluate research involving human subjects...
Asilomar came up with a set of broad principles that include "openness," "justice," and "beneficence." The final one is "continuous consideration," which, essentially, acknowledges that ethics remain a moving target in these situations."

Privacy Group Complains to F.T.C. About Facebook Emotion Study; New York Times, 7/3/14

Vindu Goel, New York Times; Privacy Group Complains to F.T.C. About Facebook Emotion Study:
"The group, the Electronic Privacy Information Center, said Facebook had deceived its users and violated the terms of a 2012 consent decree with the F.T.C., which is the principal regulatory agency overseeing consumer privacy in the United States...
And on Thursday, the journal that published the study, the Proceedings of the National Academy of Sciences, issued an “expression of concern” regarding Facebook’s decision not to get explicit consent from the affected users before running the study.
“Obtaining informed consent and allowing participants to opt out are best practices in most instances under the U.S. Department of Health and Human Services Policy for the Protection of Human Research Subjects,” Inder M. Verma, the journal’s editor-in-chief, wrote in the note.
Although academic researchers are generally expected to follow the policy, Facebook, as a private company, was not required to do so, Mr. Verma said. “It is nevertheless a matter of concern that the collection of the data by Facebook may have involved practices that were not fully consistent with the principles of obtaining informed consent and allowing participants to opt out,” he said."

Thursday, July 3, 2014

Did Facebook's experiment violate ethics?; CNN, 7/2/14

Robert Klitzman, CNN; Did Facebook's experiment violate ethics?:
"Editor's note: Robert Klitzman is a professor of psychiatry and director of the Masters of Bioethics Program at Columbia University. He is author of the forthcoming book, "The Ethics Police?: The Struggle to Make Human Research Safe." The opinions expressed in this commentary are solely those of the author...
In 1974, following revelations of ethical violations in the Tuskegee Syphilis study, Congress passed the National Research Act. At Tuskegee, researchers followed African-American men with syphilis for decades and did not tell the subjects when penicillin became available as an effective treatment. The researchers feared that the subjects, if informed, would take the drug and be cured, ending the experiment.
Public outcry led to federal regulations governing research on humans, requiring informed consent. These rules pertain, by law, to all studies conducted using federal funds, but have been extended by essentially all universities and pharmaceutical and biotech companies in this country to cover all research on humans, becoming the universally accepted standard.
According to these regulations, all research must respect the rights of individual research subjects, and scientific investigators must therefore explain to participants the purposes of the study, describe the procedures (and which of these are experimental) and "any reasonably foreseeable risks or discomforts."
Facebook followed none of these mandates. The company has argued that the study was permissible because the website's data use policy states, "we may use the information we receive about you...for internal operations, including troubleshooting, data analysis, testing, research and service improvement," and that "we may make friend suggestions, pick stories for your News Feed or suggest people to tag in photos."
But while the company is not legally required to follow this law, two of the study's three authors are affiliated with universities -- Cornell and the University of California at San Francisco -- that publicly uphold this standard."

Wednesday, July 2, 2014

Facebook’s Secret Manipulation of User Emotions Faces European Inquiries; New York Times, 7/2/14

New York Times; Facebook’s Secret Manipulation of User Emotions Faces European Inquiries:
"In response to widespread public anger, several European data protection agencies are examining whether Facebook broke local privacy laws when it conducted the weeklong investigation in January 2012.
That includes Ireland’s Office of the Data Protection Commissioner, which regulates Facebook’s global operations outside North America because the company has its international headquarters in Dublin. The Irish regulator has sent a series of questions to Facebook related to potential privacy issues, including whether the company got consent from users for the study, according to a spokeswoman.
The Information Commissioner’s Office of Britain also said that it was looking into potential privacy breaches that may have affected the country’s residents, though a spokesman for the office said that it was too early to know whether Facebook had broken the law. It is unknown where the users who were part of the experiment were located. Some 80 percent of Facebook’s 1.2 billion users are based outside North America...
The Federal Trade Commission, the American regulator that oversees Facebook’s conduct under a 20-year consent decree, has not publicly expressed similar interest in the case, which has caused an uproar over the company’s ethics and prompted the lead researcher on the project to apologize."

Why I Left 60 Minutes: The big networks say they care about uncovering the truth. That’s not what I saw; Politico, 6/29/14

Charles Lewis, Politico; Why I Left 60 Minutes: The big networks say they care about uncovering the truth. That’s not what I saw:
"Many people, then and since, have asked me what exactly I was thinking—after all, I was walking away from a successful career full of future promise. Certainly, quitting 60 Minutes was the most impetuous thing I have ever done. But looking back, I realize how I’d changed. Beneath my polite, mild-mannered exterior, I’d developed a bullheaded determination not to be denied, misled or manipulated. And more than at any previous time, I had had a jarring epiphany that the obstacles on the way to publishing the unvarnished truth had become more formidable internally than externally. I joked to friends that it had become far easier to investigate the bastards—whoever they are—than to suffer through the reticence, bureaucratic hand-wringing and internal censorship of my employer.
In a highly collaborative medium, I had found myself working with overseers I felt I could no longer trust journalistically or professionally, especially in the face of public criticism or controversy—a common occupational hazard for an investigative reporter. My job was to produce compelling investigative journalism for an audience of 30 million to 40 million Americans. But if my stories generated the slightest heat, it was obvious to me who would be expendable. My sense of isolation and vulnerability was palpable.
The best news about this crossroads moment was that after 11 years in the intense, cutthroat world of network television news, I still had some kind of inner compass. I was still unwilling to succumb completely to the lures of career ambition, financial security, peer pressure or conventional wisdom.
Just weeks after I quit, I decided to begin a nonprofit investigative reporting organization—a place dedicated to digging deep beneath the smarminess of Washington’s daily-access journalism into the documents few reporters seemed to be reading, which I knew from experience would reveal broad patterns of cronyism, favoritism, personal enrichment and outrageous (though mostly legal) corruption. My dream was a journalistic utopia—an investigative milieu in which no one would tell me who or what not to investigate. And so I recruited two trusted journalist friends and founded the Center for Public Integrity. The Center’s first report, “America’s Frontline Trade Officials,” was an expanded version of the 60 Minutes “Foreign Agent” story. Not long after this report was published, President George H.W. Bush signed an executive order banning former trade officials from becoming lobbyists for foreign governments or corporations.

Online, the Lying Is Easy: In ‘Virtual Unreality,’ Charles Seife Unfriends Gullibility; New York Times, 7/1/14

[Book Review of Charles Seife's VIRTUAL UNREALITY: Just Because the Internet Told You, How Do You Know It’s True?] Dwight Garner, New York Times; Online, the Lying Is Easy: In ‘Virtual Unreality,’ Charles Seife Unfriends Gullibility:
"Mr. Seife’s new book, “Virtual Unreality,” is about how digital untruths spread like contagion across our laptops and smartphones. The author is unusually qualified to write on this subject, and not merely because his surname is nearly an anagram for “selfie.”
A professor of journalism at New York University, Mr. Seife is a battle-scarred veteran of the new info wars. When Wired magazine wanted to investigate the ethical lapses of its contributor Jonah Lehrer, for example, it turned to Mr. Seife, whose report pinned Mr. Lehrer, wriggling, to the plagiarism specimen board...
In “Virtual Unreality,” Mr. Seife delivers a short but striding tour of the many ways in which digital information is, as he puts it in a relatively rare moment of rhetorical overkill, “the most virulent, most contagious pathogen that humanity has ever encountered.”...
One of Mr. Seife’s bedrock themes is the Internet’s dismissal, for good and ill, of the concept of authority. On Wikipedia, your Uncle Iggy can edit the page on black holes as easily as Stephen Hawking can. Serious reporting, another form of authority, is withering because it’s so easy to cut and paste facts from other writers, or simply to provide commentary, and then game search engine results so that readers find your material first."

On the Next Docket: How the First Amendment Applies to Social Media; New York Times, 6/30/14

Adam Liptak, New York Times; On the Next Docket: How the First Amendment Applies to Social Media:
"Mr. Elonis was convicted under a federal law that makes it a crime to communicate “any threat to injure the person of another.” The sentence was 44 months.
The case is one of many recent prosecutions “for alleged threats conveyed on new media, including Facebook, YouTube and Twitter,” according to a brief supporting Mr. Elonis from several First Amendment groups.
In urging the Supreme Court not to hear Mr. Elonis’s case, the Justice Department said his intent should make no difference. A perceived threat creates “fear and disruption,” the brief said, “regardless of whether the speaker subjectively intended the statement to be innocuous.”
Mr. Elonis’s lawyers did not deny that their approach would allow some statements with “undesirable effects.” But they said the First Amendment should tolerate those effects rather than “imprisoning a person for negligently misjudging how others would construe his words.”
The First Amendment does not protect all speech. There are exceptions for libel, incitement, obscenity and fighting words, and one for “true threats,” which is at issue in Mr. Elonis’s case."

Here Are All the Other Experiments Facebook Plans to Run on You: An exclusive preview; Slate, 6/30/14

David Auerbach, Slate; Here Are All the Other Experiments Facebook Plans to Run on You: An exclusive preview:
"Facebook and two outside social scientists recently published a scientific paper in which they revealed that they had manipulated users’ news feeds to tweak their emotions. Since then, there has been a growing debate over the ethics and practice of Facebook experimenting on its users, as chronicled by Slate’s Katy Waldman. In response to these concerns, this morning Facebook issued the following press release—although I seem to be the only journalist who has received it. Coming hot on the heels of Facebook’s carefully unapologetic defense of its emotion research on its users, I share the press release as a glimpse of Facebook’s future directions in its user experiments."

Facebook experiment may have broken UK law; Aljazeera, 7/2/14

Aljazeera; Facebook experiment may have broken UK law:
"A British data regulator has been investigating whether Facebook Inc broke data protection laws when it allowed researchers to conduct a psychological experiment on nearly 700,000 users of the social network, the Financial Times reported.
The Information Commissioner's Office (ICO), which monitors how personal data is used, is probing the controversial experiment and plans to ask Facebook questions, the newspaper reported on Tuesday."

Facebook’s experiment is just the latest to manipulate you in the name of research; Pew Research Center, 7/2/14

Rich Morin, Pew Research Center; Facebook’s experiment is just the latest to manipulate you in the name of research:
"But is what Facebook did ethical? There is a good amount of discussion about whether Facebook was transparent enough with its users about this kind of experimentation. They did not directly inform those in the study that they were going to be used as human lab rats. In academic research, that’s called not obtaining “informed consent” and is almost always a huge no-no. (Facebook claims that everyone who joins Facebook agrees as part of its user agreement to be included in such studies.)
The question now is how, as companies sit on troves of new social media and other digital data to mine for the same kind of behavioral analysis, the new rules will need to be written.
Experimental research is rife with examples of how study participants have been manipulated, tricked or outright lied to in the name of social science. And while many of these practices have been curbed or banned in academe, they continue to be used in commercial and other types of research."

Should Facebook Manipulate Users?: Jaron Lanier on Lack of Transparency in Facebook Study; New York Times, 6/30/14

Jaron Lanier, New York Times; Should Facebook Manipulate Users?: Jaron Lanier on Lack of Transparency in Facebook Study:
"Research with human subjects is generally governed by strict ethical standards, including the informed consent of the people who are studied. Facebook’s generic click-through agreement, which almost no one reads and which doesn’t mention this kind of experimentation, was the only form of consent cited in the paper. The subjects in the study still, to this day, have not been informed that they were in the study. If there had been federal funding, such a complacent notion of informed consent would probably have been considered a crime. Subjects would most likely have been screened so that those at special risk would be excluded or handled with extra care.
This is only one early publication about a whole new frontier in the manipulation of people, and Facebook shouldn’t be singled out as a villain. All researchers, whether at universities or technology companies, need to focus more on the ethics of how we learn to improve our work.
To promote the relevance of their study, the researchers noted that emotion was relevant to human health, and yet the study didn’t measure any potential health effects of the controlled manipulation of emotions."

Facebook Study Sparks Soul-Searching and Ethical Questions: Incident Shines Light on How Companies, Researchers Tap Data Created Online; Wall Street Journal, 6/30/14

Reed Albergotti and Elizabeth Dwoskin, Wall Street Journal; Facebook Study Sparks Soul-Searching and Ethical Questions: Incident Shines Light on How Companies, Researchers Tap Data Created Online:

Tuesday, July 1, 2014

Facebook Added 'Research' To User Agreement 4 Months After Emotion Manipulation Study; Forbes, 6/30/14

Kashmir Hill, Forbes; Facebook Added 'Research' To User Agreement 4 Months After Emotion Manipulation Study:
"Unless you’ve spent the last couple of days in a Faraday pouch under a rock, you’ve heard about Facebook’s controversial ‘emotion manipulation’ study. Facebook data scientist Adam Kramer ran an experiment on 689,003 Facebook users two and a half years ago to find out whether emotions were contagious on the social network. It lasted for a week in January 2012. It came to light recently when he and his two co-researchers from Cornell University and University of California-SF published their study describing how users’ moods changed when Facebook curated the content of their News Feeds to highlight the good, happy stuff (for the lucky group) vs. the negative, depressing stuff (for the unlucky and hopefully-not-clinically-depressed group). The idea of Facebook manipulating users’ emotions for science — without telling them or explicitly asking them first — rubbed many the wrong way. Critics said Facebook should get “informed consent” for a study like this — asking people if they’re okay being in a study and then telling them what was being studied afterwards. Defenders said, “Hey, the Newsfeed gets manipulated all the time. What’s the big deal?” Critics and defenders alike pointed out that Facebook’s “permission” came from its Data Use Policy which among its thousands of words informs people that their information might be used for “internal operations,” including “research.” However, we were all relying on what Facebook’s data policy says now. In January 2012, the policy did not say anything about users potentially being guinea pigs made to have a crappy day for science, nor that “research” is something that might happen on the platform.
Four months after this study happened, in May 2012, Facebook made changes to its data use policy, and that’s when it introduced this line about how it might use your information: “For internal operations, including troubleshooting, data analysis, testing, research and service improvement.”"