Showing posts with label informed consent.

Wednesday, November 28, 2018

Facing Backlash, Chinese Scientist Defends Gene-Editing Research On Babies; NPR, November 28, 2018

Rob Stein, NPR; Facing Backlash, Chinese Scientist Defends Gene-Editing Research On Babies

"University of Wisconsin bioethicist Alta Charo, who helped organize the summit, issued an even harsher critique of He's work, calling it "misguided, premature, unnecessary and largely useless."

"The children were already at virtually no risk of contracting HIV, because it was the father and not the mother who was infected," she said.

"The patients were given a consent form that falsely stated this was an AIDS vaccine trial, and which conflated research with therapy by claiming they were 'likely' to benefit," Charo said. "In fact there is not only very little chance these babies would be in need of a benefit, given their low risk, but there is no way to evaluate if this indeed conferred any benefit."

She spoke after Harvard Medical School Dean George Daley alluded to He's claims as "missteps" that he worried might set back a highly promising field of research. "Scientists who go rogue carry a deep, deep cost to the scientific community," Daley said.

Still, Daley argued that He's experiment shouldn't tar the potential work of other scientists. "Just because the first steps into a new technology are missteps, doesn't mean we shouldn't step back, restart and think about a plausible and responsible path forward," Daley said.

"The fact that the first instance came forward as a misstep should in no way leave us to stick our heads in the sand and not consider the very, very positive efforts that could come forward," Daley said. "I hope we just don't stick our heads in the sand."

Daley stressed that the world hadn't yet reached a scientific consensus on how to ethically and safely use new gene-editing techniques to modify embryos that become babies.

But Daley argued that a consensus was emerging that "if we can solve the scientific challenges, it may be a moral imperative that it should be permitted." The most likely first legitimate use of gene-edited embryos would be to prevent serious genetic disorders for which there are no alternatives, Daley said.

"Solving and assessing these deep issues [is] essential," Daley says.

Daley also defended the fact that scientists have long relied on self-regulation to prevent the abuse of new technologies. He's claims represented "a major failure" that called for much stronger regulation and possibly a moratorium on such research, Daley said. "I do think the principle of self-regulation is defensible.""

Monday, November 5, 2018

Data Self-Autonomy and Student Data Analytics; Kip Currier, November 5, 2018

Kip Currier: Data Self-Autonomy and Student Data Analytics

When I first saw this PittWire article's headline (Pitt Sets Course for Student Success With Inaugural Advanced Analytics Summit) about Pitt's first-ever Advanced Analytics Summit, my initial thought was, "Will the article address the potential downsides of student data analytics?"

Certainly, there are some benefits potentially offered by analysis of data generated about students. Chief among them is greater self-awareness by students themselves, assuming students are given the opportunity to access the data collected about them. (Let's remember, also--as surprising as it may seem to digital cognoscenti--that not everyone may want to know and see the kinds of educational data generated and collected about them in the digital age, just as biomedical providers, ethicists, and users have long debated the thorny issues implicated by the right to know, and not to know, one's own medical information; see here and here for some examples of varying perspectives on whether to know, or not know, your own genetic information.) Some among those who do see their educational data analytics may still want to opt out of future collection and use of their personal data.

(Aside: Consider that most U.S. consumers currently have no statutorily mandated, enforceable rights to opt out of data collection about themselves, or to view and make informed decisions about the petabytes of information collected about them. Indeed, at a privacy conference in Brussels recently, Apple CEO Tim Cook excoriated tech companies for the ways personal information is being "weaponized against us with military efficiency."

Contrast this with the European Union's game-changing 2018 General Data Protection Regulation. 

Perennial consumer protection leader California, whose legislature passed the most stringent consumer data privacy law in the nation (signed into law by Governor Jerry Brown on September 23, 2018), was recently sued by the U.S. Department of Justice over that law's adoption.

All the more reason that the author of a recent Forbes article exhorts, "Why 'Right To Delete' Should Be On Your IT Agenda Now.")

Having qualified persons to guide students in interpreting, and deciding if and how to operationalize, data about themselves is crucial. (Student Data Analytics Advisor, anyone?) In what ways can students, and those entrusted to advise them, use data about themselves to make the best possible decisions, during their time as students as well as afterwards in their personal and professional lives? As Pitt Provost Ann Cudd is quoted in the article:
“Two of our main objectives are raising graduation rates and closing achievement gaps. Things to focus on are excellence in education, building a network and identifying and pursuing life goals and leading a life of impact.”

Kudos to Provost Cudd for explicitly acknowledging, as reported in the article, "that as advanced analytics moves forward at the University, two topics of focus include identifying whether the use of data is universally good and what potential dangers exist, and how to keep the human components to avoid generalizing." The overarching, driving focus of student data analytics must always be on what is best for the individual, the student, the human being.

It was good to see that data privacy and cybersecurity issues were identified at the summit as significant concerns. These are HUGE issues with no magic bullets or easy answers. In an age in which even the Pentagon and White House are not inoculated against documented cyberintrusions, does anyone really feel 100% sure that student data won't be breached and misused?

Disappointingly, the article sheds little light on the various stakeholders who are eager to harvest and harness student data. As quoted at the end of the article, Stephen Wisniewski, Pitt's Vice Provost for Data and Information, states that "The primary reason is to better serve our students." Ask yourself: is "better serving students" Google's primary reason for student data analytics? Or a third-party vendor's? Or that of the many other parties who stand to benefit from student data analytics, not just in higher education settings but in K-12 settings as well? It's self-evident that the motivations for student "advanced analytics" are more complex and nuanced than primarily "better serving students".


As always, when looking at ethical issues and engaging in ethical decision-making, it's critically important to identify the stakeholder interests. To that end, when looking at student data analytics, we must identify all of the actual and potential stakeholders, and then think about their respective interests, in order to assess the issues more critically and to appraise, understand, and make highly informed decisions about the potential risks and benefits. And, as I often remind myself, what people don't say is often just as important as, and sometimes more revealing than, what they do say.
Any mention of "informed consent" with regard to data collection and use is noticeably absent from this article's reporting, though it hopefully was front and center at the summit.

Student data analytics offers some tantalizing possibilities for students to better "know thyself," and for the educational institutions that serve them to better know--and better advise--their students, within legal limits (established and yet to be established), individual-centered policies, and ethically grounded guardrails built and reinforced with core values.

It's paramount, too, amidst our all-too-frequent pell-mell rush to embrace new technologies with sometimes utopian thinking and breathless actions, that we remember to take some stabilizing breaths and think deeply and broadly about the ramifications of data collection and use and the choices we can make about what should and should not be done with data--data about ourselves. Individual choice should be an essential part of student data analytics. Anything less places the interests of the data above the interests of the individual.

Thursday, November 1, 2018

Medicine and ethics: Will we learn to take research scandals seriously?; Star Tribune, October 29, 2018

Carl Elliott, Star Tribune; Medicine and ethics: Will we learn to take research scandals seriously?

"“The Experiments” is a cautionary tale of how the refusal of institutional leaders to look honestly at ethical problems can lead to the deaths of unsuspecting patients. And while the jury is still out as to whether the Karolinska Institute will reform itself, at least the Swedish public and concerned politicians are trying to hold the institution accountable. 

That is more than we can claim for Minnesota. As they say in the rehabilitation units: The first step to recovery is admitting you have a problem."

Thursday, October 11, 2018

Do We Need To Teach Ethics And Empathy To Data Scientists?; Forbes, October 8, 2018

Kalev Leetaru, Forbes; Do We Need To Teach Ethics And Empathy To Data Scientists?

[Kip Currier: A thought-provoking and timely piece, especially as I'm presently writing a chapter on research ethics for my ethics textbook and was just reviewing and thinking about the history of informed consent and Institutional Review Boards-cum-Human-Research-Protection-Offices. Medical ethics lapses like those involving Henrietta Lacks and the Tuskegee Syphilis Study are potent reminders of the concomitant imperative for ethics oversight and informed consent vis-a-vis digital age research.]

"The growing shift away from ethics and empathy in the creation of our digital future is both profoundly frightening for the Orwellian world it is ushering in, but also a sad commentary on the academic world that trains the data scientists and programmers that are shifting the online world away from privacy. How might the web change if we taught ethics and empathy as primary components of computer science curriculums?

One of the most frightening aspects of the modern web is the speed at which it has struck down decades of legislation and professional norms regarding personal privacy and the ethics of turning ordinary citizens into laboratory rats to be experimented on against their wills. In the space of just two decades the online world has weaponized personalization and data brokering, stripped away the last vestiges of privacy, centralized control over the world’s information and communications channels, changed the public’s understanding of the right over their digital selves and profoundly reshaped how the scholarly world views research ethics, informed consent and the right to opt out of being turned into a digital guinea pig.

It is the latter which in many ways has driven each of the former changes. Academia’s changing views towards IRB and ethical review has produced a new generation of programmers and data scientists who view research ethics as merely an outdated obsolete historical relic that was an obnoxious barrier preventing them from doing as they pleased to an unsuspecting public."

Tuesday, September 11, 2018

You Discovered Your Genetic History. Is It Worth the Privacy Risk?; Fortune, September 10, 2018

Monica Rodriguez, Fortune; You Discovered Your Genetic History. Is It Worth the Privacy Risk?

"Direct-to-consumer genetic testing companies like 23andMe must win FDA approval to send individuals medical risk findings, while companies that involve physicians in the process do not. But unlike healthcare providers, direct-to-consumer genetic testing companies are not bound by HIPPA, the law that protects the privacy of personal medical information, and there are few laws in place to regulate the privacy of genetic information obtained by these companies.

“One of the big distinctions between medical research and data in Silicon Valley is the ethical framework that requires informed consent,” said Charles Seife, a professor of journalism at New York University who writes extensively on the genetic testing industry. “It is a difference of making sure that [privacy] rights are being preserved.”"

Thursday, May 24, 2018

New privacy rules could spell the end of legalese — or create a lot more fine print; The Washington Post, May 24, 2018

Elizabeth Dwoskin, The Washington Post; New privacy rules could spell the end of legalese — or create a lot more fine print

"“The companies are realizing that it is not enough to get people to just click through,” said Lorrie Cranor, director of the CyLab Usable Privacy and Security Laboratory at Carnegie Mellon University and the U.S. Federal Trade Commission’s former chief technologist. “That they need to communicate so that people are not surprised when they find out what they consented to.”

That has become more apparent in the past two months since revelations that a Trump-connected consultancy, Cambridge Analytica, made off with the Facebook profiles of up to 87 million Americans. Cranor said that consumer outrage over Cambridge was directly related to concerns that companies were engaging in opaque practices behind the scenes, and that consumers had unknowingly allowed it to happen by signing away their rights.

Irrespective of simpler explanations, the impact and success of the GDPR will hinge upon whether companies will try to force users to consent to their tracking or targeting as condition for access to their services, said Alessandro Acquisti, a Carnegie Mellon computer science professor and privacy researcher. "This will tell us a lot regarding whether the recent flurry of privacy policy modifications demonstrates a sincere change in the privacy stance of those companies or is more about paying lip service to the new regulation. The early signs are not auspicious.""

Thursday, April 5, 2018

Sorry, Facebook was never ‘free’; The New York Post, March 21, 2018

John Podhoretz, The New York Post; Sorry, Facebook was never ‘free’


[Kip Currier: On today's MSNBC Morning Joe show, The New York Post's John Podhoretz pontificated on the same provocative assertions he made in his March 21, 2018 opinion piece, excerpted below. It’s a post-Cambridge Analytica “Open Letter polemic” directed at anyone (or, to use Podhoretz’s term, any fool) who signed up for Facebook “back in the day” and who may now be concerned about how free social media sites like Facebook use people’s personal data, as well as how Facebook et al. enable third parties to “harvest”, “scrape”, and leverage those data.

Podhoretz’s argument is flawed on so many levels it’s challenging to know where to begin. (Full disclosure: As someone working in academia in a computing and information science school, who signed up for Facebook some years ago to see what all the “fuss” was about, I’ve never used my Facebook account because of ongoing privacy concerns about it. Much to the chagrin of some family and friends who have exhorted me, unsuccessfully, to use it.)

Certainly, there is some level of “ownership” that each of us needs to take when we sign up for a social media site or app by clicking on the Terms and Conditions and/or End User License Agreement (EULA). But it’s also common knowledge now (ridiculed by self-aware super-speed-talking advertisers in TV and radio ads!) that these agreements are written in legalese that doesn't fully convey the actual and potential ramifications of their terms and conditions. (Aside: For a clever satirical take on the purposeful impenetrability and abstruseness of these lawyer-crafted agreements, see R. Sikoryak’s 2017 graphic novel Terms and Conditions, which visually lampoons an Apple iTunes user contract.)

Over the course of decades, for example, in the wake of the Tuskegee Syphilis experiments and other medical research abuses and controversies, medical research practitioners were legally compelled to come to terms with the fact that laws, ethics, and policies about “informed consent” needed to evolve to better inform and protect “human subjects” (translation: you and me).

A similar argument can be made regarding Facebook and its social media kin: namely, that tech companies and app developers need to voluntarily adopt (or be required to adopt) HIPAA-esque protections and promote more “informed” consumer awareness.

We also need more computer science ethics training and education for undergraduates, as well as more widespread digital citizenship education in K-12 settings, to ensure a level playing field of digital life awareness. (Hint, hint, Education Secretary Betsy DeVos or First Lady Melania Trump…here’s a critical mission for your patronage.)

Podhoretz’s simplistic Facebook user-as-deplorable-fool rant puts all of the blame on users, while absolving bait-and-switch tech companies like Facebook and data-sticky-fingered accomplices like Cambridge Analytica of any responsibility. “Free” doesn’t mean tech companies and app designers should be free from the enhanced and reasonable informed consent responsibilities they owe to their users. Expecting or allowing anything less would be foolish.]


"The science fiction writer Robert A. Heinlein said it best: “There ain’t no such thing as a free lunch.” Everything has a cost. If you forgot that, or refused to see it in your relationship with Facebook, or believe any of these things, sorry, you are a fool. So the politicians and pundits who are working to soak your outrage for their own ideological purposes are gulling you. But of course you knew.

You just didn’t care . . . until you cared. Until, that is, you decided this was a convenient way of explaining away the victory of Donald Trump in the 2016 election.

You’re so invested in the idea that Trump stole the election, you are willing to believe anything other than that your candidate lost because she made a lousy argument and ran a lousy campaign and didn’t know how to run a race that would put her over the top in the Electoral College — which is how you prevail in a presidential election and has been for 220-plus years.

The rage and anger against Facebook over the past week provide just the latest examples of the self-infantilization and flight from responsibility on the part of the American people and the refusal of Trump haters and American liberals to accept the results of 2016.

Honestly, it’s time to stop being fools and start owning up to our role in all this."

Monday, March 19, 2018

Where's Zuck? Facebook CEO silent as data harvesting scandal unfolds; Guardian, March 19, 2018

Julia Carrie Wong, Guardian; Where's Zuck? Facebook CEO silent as data harvesting scandal unfolds


[Kip Currier: Scott Galloway, clinical professor of marketing at the New York University Stern School of Business, made some strong statements about the Facebook/Cambridge Analytica data harvesting scandal on MSNBC's Stephanie Ruhle show yesterday.

Regarding Facebook's handling of the revelations to date:

"This is a textbook example of how not to handle a crisis."

He referred to Facebook's leadership as "tone-deaf management" that initially denied a breach had occurred, and then deleted tweets saying that it was wrong to call what had occurred a breach.

Galloway also said that "Facebook has embraced celebrity but refused to embrace its responsibilities". He contrasted Facebook's ineffectual crisis management with the decisive leadership and accountability Johnson & Johnson demonstrated during the Tylenol tampering crisis of the 1980s.]



"The chief executive of Facebook, Mark Zuckerberg, has remained silent over the more than 48 hours since the Observer revealed the harvesting of 50 million users’ personal data, even as his company is buffeted by mounting calls for investigation and regulation, falling stock prices, and a social media campaign to #DeleteFacebook...

Also on Monday, the New York Times reported that Facebook’s chief security officer, Alex Stamos, would be leaving the company following disagreements with other executives over the handling of the investigation into the Russian influence operation...

Stamos is one of a small handful of Facebook executives who addressed the data harvesting scandal on Twitter over the weekend while Zuckerberg and Facebook’s chief operating officer, Sheryl Sandberg, said nothing."

Monday, May 15, 2017

Speaker's Corner: Privacy needs better protection; Law Times, May 15, 2017

Nathaniel Erskine-Smith, Law Times; Speaker's Corner: Privacy needs better protection


"There are also concerns that our current model of informed consent needs updating. The majority of Canadians admit to not reading privacy policies for mobile apps, and a recent privacy sweep — in which 25 privacy enforcement authorities participated — found that privacy communications of Internet-connected devices are generally poor and fail to inform users about exactly what personal information is being collected and how it will be used. It is difficult to reconcile these facts with the goal of meaningful consent.

This is especially important as more devices collect more information about our lives. From smart meters that track our energy consumption to fridges that track what we eat, Cisco Systems estimates there will be 50 billion connected devices by 2020. As a consumer, I want convenience and will trade some of my privacy. As a citizen and as a lawyer, I want laws that substantively protect my privacy. 

In general terms, we should mandate privacy by design. Governments and third parties ought to anonymize our personal information, and our government should follow Australia’s example and make it an offence to re-identify published government data sets. We should also look beyond the law to protect our data. 

Take Estonia. On the one hand, it has embraced big data through maintaining a national register with a single unique identifier for all citizens and residents. Customer service is improved and information is exchanged more easily. On the other hand, the same system ensures that citizens can correct or remove data easily and can see which officials have viewed their data. 

In summary, we need to embrace new laws and new technology. We need not sacrifice our privacy."
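The Estonian model described above (citizens can correct or remove their data and see which officials have viewed it) boils down to a register that keeps an append-only audit trail of every read. Below is a minimal sketch of that pattern, using entirely hypothetical names and making no claim to mirror Estonia's actual system:

```python
import datetime

class CitizenRecord:
    """Hypothetical register entry whose every read is logged for the data subject."""

    def __init__(self, citizen_id, data):
        self.citizen_id = citizen_id
        self._data = dict(data)
        self._access_log = []  # append-only: (timestamp, official_id, purpose)

    def read(self, official_id, purpose):
        # Every access by an official is recorded before data is returned.
        now = datetime.datetime.now(datetime.timezone.utc)
        self._access_log.append((now, official_id, purpose))
        return dict(self._data)

    def correct(self, field, value):
        # Citizens can correct their own data.
        self._data[field] = value

    def who_viewed_me(self):
        # The audit trail is visible to the data subject.
        return list(self._access_log)

record = CitizenRecord("EE-0001", {"address": "Tallinn"})
record.read(official_id="tax-clerk-42", purpose="tax assessment")
for when, who, why in record.who_viewed_me():
    print(when.isoformat(), who, why)
```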

Monday, March 20, 2017

San people of Africa draft code of ethics for researchers; Science, March 17, 2017

Linda Nordling, Science; San people of Africa draft code of ethics for researchers


"Earlier this month the group unveiled a code of ethics for researchers wishing to study their culture, genes, or heritage.
The code, published here on 3 March, asks researchers to treat the San respectfully and refrain from publishing information that could be viewed as insulting. Because such sensitivities may not be clear to researchers, the code asks that scientists let communities read and comment on findings before they are published. It also asks that researchers keep their promises and give something back to the community in return for its cooperation...
The code does not place unrealistic demands on scientists, says Himla Soodyall, director of the Human Genomic Diversity and Disease Research Unit at South Africa’s University of the Witwatersrand in Johannesburg. But others point out that the code focuses on past transgressions, and doesn’t refer to recent efforts to respect and involve communities, such as guidelines for genomics work on vulnerable populations prepared in 2014 by the Human Heredity and Health in Africa program. As a result, the code may present an overly negative view of researchers and discourage communities from participating in studies, says Charles Rotimi, founding director of the National Institutes of Health Center for Research on Genomics and Global Health in Bethesda, Maryland."

Wednesday, March 15, 2017

Vibrator Maker To Pay Millions Over Claims It Secretly Tracked Use; NPR, March 14, 2017

Camila Domonoske, NPR; Vibrator Maker To Pay Millions Over Claims It Secretly Tracked Use

"The makers of the We-Vibe, a line of vibrators that can be paired with an app for remote-controlled use, have reached a $3.75 million class action settlement with users following allegations that the company was collecting data on when and how the sex toy was used...

The lawyers for the anonymous plaintiffs contended that the app, "incredibly," collected users' email addresses, allowing the company "to link the usage information to specific customer accounts."...

Standard Innovation also agreed to stop collecting users' email addresses and to update its privacy notice to be clearer about how data is collected."

Wednesday, February 15, 2017

Henrietta Lacks’s family wants compensation for her cells; Washington Post, February 14, 2017

Andrea K. McDaniels, Washington Post; Henrietta Lacks’s family wants compensation for her cells

"Francis Lanasa, the attorney who will represent the family, said that he would use a “continuing tort” argument, alleging that Hopkins had continued to violate the “personal rights, privacy and body parts” of Henrietta Lacks over time.

“They are literally the foundation of modern medical science,” Lanasa said of the cells."

Sunday, August 14, 2016

Ethical questions raised in search for Sardinian centenarians' secrets; Guardian, 8/12/16

Stephanie Kirchgaessner, Guardian; Ethical questions raised in search for Sardinian centenarians' secrets:
"Some say thousands of Sardinian research subjects never agreed that their samples could be sold or used by a for-profit company when they signed a medical consent agreement at the time the database was accumulating samples...
The conflict has raised the kind of thorny ethical questions that are likely to become more pervasive as scientists tap into the promise of massive DNA databases to learn more about disease. Should a private company be able to profit from the study of a population’s DNA, when the DNA was voluntarily donated? The deal also raises uncomfortable questions for local critics: why did Shardna go bust to begin with?...
The question now is whether participants like Maria Tegas or her children will ever gain from the research that has put her corner of Sardinia on the map. When Cerrone was asked whether he believed Sardinians ought to benefit in the future from any potentially lucrative medical advancements that might emerge, the executive demurs. Tiziana has already “given back” to the community when it bought the database for €258,000, including all the outstanding debts."

Sunday, July 3, 2016

Science Academies Blast U.S. Government's Planned Research-Ethics Reforms; Scientific American, 6/30/16

Sara Reardon, Scientific American; Science Academies Blast U.S. Government's Planned Research-Ethics Reforms:
"The US government’s proposed overhaul of regulations that govern research with human subjects is flawed and should be withdrawn, an independent advisory panel said today.
The regulations, which are known collectively as the ‘Common Rule’, address ethical issues such as informed consent and storage of study participants’ biological specimens. In its report on June 29, the US National Academies of Sciences, Engineering and Medicine said that the government’s proposed changes are “marred by omissions and a lack of clarity”, and would slow research while doing little to improve protections for patients enrolled in studies. Instead, the panel recommends that the government appoint an independent commission to craft new rules for such research.
The Common Rule, which was introduced in 1991, is based on the Belmont Report, a 1978 document that lays out principles for ethical research with humans, such as minimizing patient harm and maximizing the benefit of such research to society. Over time, achieving such goals has become more complex because of technological advances—such as the rise of DNA identification and shared databases, which can make it harder to maintain patient privacy."

Saturday, April 16, 2016

Oxford professor calls for European ethical codes on patient data; Guardian, 4/12/16

Paul Hill, Guardian; Oxford professor calls for European ethical codes on patient data:
"Prof Luciano Floridi, director of research at Oxford University’s Internet Institute believes the time has come for new European ethical codes to govern “data donation” and its use for medical research.
He says debate in Europe over individual privacy versus societal benefits of shared data has been “swinging like a pendulum between two extremes”. Medical research with big data should be part of the future of Europe, according to Floridi, “not something we need to export to other countries because it is not do-able here”.
“The patient has to be informed and willing to share the information that researchers are collecting – for the benefit of the patient and anyone else affected by the same problems,” said Floridi, who is also chair of the Ethics Advisory Board of the European Medical Information Framework, the largest EU project on the unification of biomedical databases...
Floridi, who has advised Google on the ethics of information and the right to be forgotten, proposes the creation of two new ethical codes.
The first would govern the use and re-use of biomedical data in Europe – an ethical code from the practitioners’ perspective.
The second would relate to “data donation” and the informed choice of an individual to share personal information for research."

Wednesday, April 13, 2016

Making the Most of Clinical Trial Data; New York Times, 4/12/16

Editorial Board, New York Times; Making the Most of Clinical Trial Data:
"Some researchers may oppose sharing data they have worked hard to gather, or worry that others will analyze it incorrectly. Creating opportunities for collaboration on subsequent analysis may help alleviate these concerns.
Of course, any data sharing must take patients’ privacy into account; patients must be informed before joining a clinical trial that their data may be shared and researchers must ensure that the data cannot be used to identify individuals.
By making data available and supporting analysis, foundations, research institutions and drug companies can increase the benefit of clinical trials and pave the way for new findings that could help patients."

Friday, March 25, 2016

Education Data, Student Privacy Take Spotlight at Capitol Hill Hearing; Education Week, 3/22/16

Andrew Ujifusa, Education Week; Education Data, Student Privacy Take Spotlight at Capitol Hill Hearing:
"Members of Congress weighed the concerns of parents, researchers, and educators about the sensitive intersection of education data and student privacy at a House education committee hearing Tuesday.
Among the topics: parents' desire for transparency and more control over what data is collected and how it's used; the need for researchers to have comprehensive and varied data; and the work states have done to try to safeguard the data they collect, while ensuring its usefulness to schools.
The hearing didn't take place in a vacuum. Over the past year, several lawmakers have taken a crack at revamping federal rules for how states and districts have to handle sensitive student information. The Family Educational Rights and Privacy Act, passed in 1974, is widely seen as outdated because of its limited definition of a "student record" in a world where states, educational service vendors, and others are gathering new and diverse types of data about students."

Wednesday, February 17, 2016

Balancing Benefits and Risks of Immortal Data: Participants’ Views of Open Consent in the Personal Genome Project; Hastings Center Report, 12/17/15

Oscar A. Zarate, Julia Green Brody, Phil Brown, Monica D. Ramirez-Andreotta, Laura Perovich and Jacob Matz, Hastings Center Report; Balancing Benefits and Risks of Immortal Data: Participants’ Views of Open Consent in the Personal Genome Project:
"Abstract
An individual's health, genetic, or environmental-exposure data, placed in an online repository, creates a valuable shared resource that can accelerate biomedical research and even open opportunities for crowd-sourcing discoveries by members of the public. But these data become “immortalized” in ways that may create lasting risk as well as benefit. Once shared on the Internet, the data are difficult or impossible to redact, and identities may be revealed by a process called data linkage, in which online data sets are matched to each other. Reidentification (re-ID), the process of associating an individual's name with data that were considered deidentified, poses risks such as insurance or employment discrimination, social stigma, and breach of the promises often made in informed-consent documents. At the same time, re-ID poses risks to researchers and indeed to the future of science, should re-ID end up undermining the trust and participation of potential research participants.
The ethical challenges of online data sharing are heightened as so-called big data becomes an increasingly important research tool and driver of new research structures. Big data is shifting research to include large numbers of researchers and institutions as well as large numbers of participants providing diverse types of data, so the participants’ consent relationship is no longer with a person or even a research institution. In addition, consent is further transformed because big data analysis often begins with descriptive inquiry and generation of a hypothesis, and the research questions cannot be clearly defined at the outset and may be unforeseeable over the long term. In this article, we consider how expanded data sharing poses new challenges, illustrated by genomics and the transition to new models of consent. We draw on the experiences of participants in an open data platform—the Personal Genome Project—to allow study participants to contribute their voices to inform ethical consent practices and protocol reviews for big-data research."
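The "data linkage" re-identification the abstract describes is mechanically simple: match a deidentified data set against a public one on shared quasi-identifiers. A minimal sketch, using invented data and column names purely for illustration:

```python
# Hypothetical re-identification via data linkage: a "deidentified" study
# dataset is matched to a public record set on shared quasi-identifiers
# (ZIP code, birth year, sex). All data below are invented.

deidentified_study = [
    {"zip": "15213", "birth_year": 1971, "sex": "F", "diagnosis": "Type 2 diabetes"},
    {"zip": "15232", "birth_year": 1985, "sex": "M", "diagnosis": "Hypertension"},
]

public_records = [
    {"name": "Jane Roe", "zip": "15213", "birth_year": 1971, "sex": "F"},
    {"name": "John Doe", "zip": "15232", "birth_year": 1985, "sex": "M"},
    {"name": "Ann Poe", "zip": "15213", "birth_year": 1992, "sex": "F"},
]

QUASI_IDENTIFIERS = ("zip", "birth_year", "sex")

def link(study_rows, public_rows, keys=QUASI_IDENTIFIERS):
    """Match each 'anonymous' study row to public records sharing its quasi-identifiers."""
    for row in study_rows:
        matches = [p for p in public_rows if all(p[k] == row[k] for k in keys)]
        if len(matches) == 1:  # a unique match re-identifies the participant
            yield matches[0]["name"], row["diagnosis"]

for name, diagnosis in link(deidentified_study, public_records):
    print(f"{name} -> {diagnosis}")
```

The point of the sketch is that no hacking is involved: the fewer people who share a combination of quasi-identifiers, the more a routine join behaves like a lookup of someone's name.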

Friday, January 29, 2016

Karolinska Institute may reopen ethics inquiry into work of pioneering surgeon; Science, 1/29/16

Gretchen Vogel, Science; Karolinska Institute may reopen ethics inquiry into work of pioneering surgeon:
"A documentary on Swedish Television (SVT) has prompted the Karolinska Institute (KI) in Stockholm to consider reopening its investigation into possible misconduct by surgeon Paolo Macchiarini. After an investigation last year into Macchiarini’s work at KI, where he performed experimental trachea surgery on three patients, Vice-Chancellor Anders Hamsten concluded that the surgeon had not committed misconduct, although some of his work did “not meet the university’s high quality standards in every respect.” But the documentary has raised new concerns by suggesting that Macchiarini misled patients."

Saturday, October 17, 2015

Researchers wrestle with a privacy problem; Nature, 9/22/15

Erika Check Hayden, Nature; Researchers wrestle with a privacy problem:
"But for many social scientists, the most impressive thing was that the authors had been able to examine US federal tax returns: a closely guarded data set that was then available to researchers only with tight restrictions. This has made the study an emblem for both the challenges and the enormous potential power of 'administrative data' — information collected during routine provision of services, including tax returns, records of welfare benefits, data on visits to doctors and hospitals, and criminal records. Unlike Internet searches, social-media posts and the rest of the digital trails that people establish in their daily lives, administrative data cover entire populations with minimal self-selection effects: in the US census, for example, everyone sampled is required by law to respond and tell the truth.
This puts administrative data sets at the frontier of social science, says John Friedman, an economist at Brown University in Providence, Rhode Island, and one of the lead authors of the education study. “They allow researchers to not just get at old questions in a new way,” he says, “but to come at problems that were completely impossible before.”...
But there is also concern that the rush to use these data could pose new threats to citizens' privacy. “The types of protections that we're used to thinking about have been based on the twin pillars of anonymity and informed consent, and neither of those hold in this new world,” says Julia Lane, an economist at New York University. In 2013, for instance, researchers showed that they could uncover the identities of supposedly anonymous participants in a genetic study simply by cross-referencing their data with publicly available genealogical information."