Showing posts with label human subjects research.

Friday, June 14, 2024

Ethical considerations for the age of non-governmental space exploration; Nature, June 11, 2024

Nature; Ethical considerations for the age of non-governmental space exploration

"Abstract

Mounting ambitions and capabilities for public and private, non-government sector crewed space exploration bring with them an increasingly diverse set of space travelers, raising new and nontrivial ethical, legal, and medical policy and practice concerns which are still relatively underexplored. In this piece, we lay out several pressing issues related to ethical considerations for selecting space travelers and conducting human subject research on them, especially in the context of non-governmental and commercial/private space operations."

Sunday, March 31, 2024

THE RECKONING; Science, March 7, 2024

Cathleen O’Grady, Science; THE RECKONING

"Part of the failure lies with France’s law on research ethics, Amiel says, which is out of step with international standards. “It’s provincial,” he says. “And it’s really a problem.” Because the law allows some human studies to proceed without ethical approval, Amiel says, similar violations are ongoing elsewhere in France, though not at the scale of the IHU’s. The best solution would be to overhaul the law, he says—but “I don’t think it’s a priority for the government at the moment.”

The close relationship between political powers and scientific institutions in France is also to blame for the foot-dragging institutional response, Lacombe says. Without external voices—like Bik, Frank, Besançon, Molimard, and Garcia—“I’m not sure that things would have moved,” she says."

Thursday, February 15, 2024

NIST Researchers Suggest Historical Precedent for Ethical AI Research; NIST, February 15, 2024

NIST; NIST Researchers Suggest Historical Precedent for Ethical AI Research

"If we train artificial intelligence (AI) systems on biased data, they can in turn make biased judgments that affect hiring decisions, loan applications and welfare benefits — to name just a few real-world implications. With this fast-developing technology potentially causing life-changing consequences, how can we make sure that humans train AI systems on data that reflects sound ethical principles? 

A multidisciplinary team of researchers at the National Institute of Standards and Technology (NIST) is suggesting that we already have a workable answer to this question: We should apply the same basic principles that scientists have used for decades to safeguard human subjects research. These three principles — summarized as “respect for persons, beneficence and justice” — are the core ideas of 1979’s watershed Belmont Report, a document that has influenced U.S. government policy on conducting research on human subjects.

The team has published its work in the February issue of IEEE’s Computer magazine, a peer-reviewed journal. While the paper is the authors’ own work and is not official NIST guidance, it dovetails with NIST’s larger effort to support the development of trustworthy and responsible AI.

“We looked at existing principles of human subjects research and explored how they could apply to AI,” said Kristen Greene, a NIST social scientist and one of the paper’s authors. “There’s no need to reinvent the wheel. We can apply an established paradigm to make sure we are being transparent with research participants, as their data may be used to train AI.”

The Belmont Report arose from an effort to respond to unethical research studies, such as the Tuskegee syphilis study, involving human subjects. In 1974, the U.S. created the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research, and it identified the basic ethical principles for protecting people in research studies. A U.S. federal regulation later codified these principles in 1991’s Common Rule, which requires that researchers get informed consent from research participants. Adopted by many federal departments and agencies, the Common Rule was revised in 2017 to take into account changes and developments in research."

Thursday, May 20, 2021

A Little-Known Statute Compels Medical Research Transparency. Compliance Is Pretty Shabby.; On The Media, April 21, 2021

On The Media; A Little-Known Statute Compels Medical Research Transparency. Compliance Is Pretty Shabby.

"Evidence-based medicine requires just that: evidence. Access to the collective pool of knowledge produced by clinical trials is what allows researchers to safely and effectively design future studies. It's what allows doctors to make the most informed decisions for their patients.

Since 2007, researchers have been required by law to publish the findings of any clinical trial with human subjects within a year of the trial's conclusion. Over a decade later, even the country's most well-renowned research institutions sport poor reporting records. This week, Bob spoke with Charles Piller, an investigative journalist at Science Magazine who's been documenting this dismal state of affairs since 2015. He recently published an op-ed in the New York Times urging President Biden to make good on his 2016 "promise" to start withholding funds to force compliance."

Saturday, May 8, 2021

Top prizes in ethics cartooning contest address COVID-19 and more; Morgridge Institute for Research, May 6, 2021

Mariel Mohns, Morgridge Institute for Research; Top prizes in ethics cartooning contest address COVID-19 and more

"Five prizes were awarded in the fourth annual Morgridge Institute for Research Ethics Cartooning Contest, which invites participants to make a cartoon on any ethical issue related to biomedical research. The competition drew 56 entrants from 35 different departments and programs at the University of Wisconsin-Madison and affiliated research institutions.

A panel of judges applied the following criteria to the competition: depiction and analysis of a research ethics issue, humor, and artistry. A popular vote by the public also contributed to the results. The following winners were selected:

  • First Prize: Alyssa Wiener, School of Medicine and Public Health
  • Second Prize: Vivian Hsiao and Madhuri Nishtala, School of Medicine and Public Health
  • Third Prize: Anjalie Schlaeppi, Morgridge Institute for Research
  • Honorable Mentions: Da-Inn Lee, Wisconsin Institute for Discovery; Noah Trapp, School of Medicine and Public Health

Alyssa Wiener, a first-year postdoctoral research fellow and general surgery resident at the University of Wisconsin School of Medicine and Public Health, took the top prize.

“Bioethics comes up a lot in my day-to-day work,” says Wiener, who does human subjects research in her postdoc. “Being involved clinically also demands bioethical consideration, because what is ‘right’ for a patient, population, or system is often not straightforward.”

Wiener’s winning cartoon explores the ethical and existential challenge of communicating scientific findings to society at large in order to effect practical change.

“This challenge can sometimes escalate to the proportions of an ‘epic battle’ with tremendous collateral damage, as I think is the case with the COVID-19 pandemic response,” says Wiener. “Just as comics function on both an emotional and intellectual level, I hope we can communicate the scientific process and research findings in an impactful but accurate manner.”

The Morgridge Ethics Cartooning Competition, developed by Morgridge Bioethics Scholar in Residence Pilar Ossorio, encourages scientists to shed light on timely or recurring issues that arise in scientific research.

“Ethical issues are all around us,” says Ossorio. “An event like the competition encourages people to identify some of those issues, perhaps talk about them with friends and colleagues, and think about how to communicate about those issues with a broader community of people.”

The COVID-19 pandemic served as a major influence on the competition this year, with many submissions focused on COVID-related topics. Many researchers needed to reassess their day-to-day engagement with ethics issues as they worked remotely away from colleagues and the university research environment."

Wednesday, September 4, 2019

The Ethics of Hiding Your Data From the Machines; Wired, August 22, 2019

Molly Wood, Wired; The Ethics of Hiding Your Data From the Machines

"In the case of the company I met with, the data collection they’re doing is all good. They want every participant in their longitudinal labor study to opt in, and to be fully informed about what’s going to happen with the data about this most precious and scary and personal time in their lives.

But when I ask what’s going to happen if their company is ever sold, they go a little quiet."

Thursday, January 31, 2019

Recent events highlight an unpleasant scientific practice: ethics dumping; The Economist, January 31, 2019

The Economist; Recent events highlight an unpleasant scientific practice: ethics dumping

Rich-world scientists conduct questionable experiments in poor countries

"Ethics dumping is the carrying out by researchers from one country (usually rich, and with strict regulations) in another (usually less well off, and with laxer laws) of an experiment that would not be permitted at home, or of one that might be permitted, but in a way that would be frowned on. The most worrisome cases involve medical research, in which health, and possibly lives, are at stake. But other investigations—anthropological ones, for example—may also be carried out in a more cavalier fashion abroad. As science becomes more international the risk of ethics dumping, both intentional and unintentional, has risen. The suggestion in this case is that Dr He was encouraged and assisted in his project by a researcher at an American university."

Thursday, March 24, 2016

In N.F.L., Deeply Flawed Concussion Research and Ties to Big Tobacco; New York Times, 3/24/16

Alan Schwarz, Walt Bogdanich, Jacqueline Williams, New York Times; In N.F.L., Deeply Flawed Concussion Research and Ties to Big Tobacco:
"With several of its marquee players retiring early after a cascade of frightening concussions, the league formed a committee in 1994 that would ultimately issue a succession of research papers playing down the danger of head injuries. Amid criticism of the committee’s work, physicians brought in later to continue the research said the papers had relied on faulty analysis.
Now, an investigation by The New York Times has found that the N.F.L.’s concussion research was far more flawed than previously known.
For the last 13 years, the N.F.L. has stood by the research, which, the papers stated, was based on a full accounting of all concussions diagnosed by team physicians from 1996 through 2001. But confidential data obtained by The Times shows that more than 100 diagnosed concussions were omitted from the studies — including some severe injuries to stars like quarterbacks Steve Young and Troy Aikman. The committee then calculated the rates of concussions using the incomplete data, making them appear less frequent than they actually were.
After The Times asked the league about the missing diagnosed cases — more than 10 percent of the total — officials acknowledged that “the clubs were not required to submit their data and not every club did.” That should have been made clearer, the league said in a statement, adding that the missing cases were not part of an attempt “to alter or suppress the rate of concussions.”
One member of the concussion committee, Dr. Joseph Waeckerle, said he was unaware of the omissions. But he added: “If somebody made a human error or somebody assumed the data was absolutely correct and didn’t question it, well, we screwed up. If we found it wasn’t accurate and still used it, that’s not a screw-up; that’s a lie.”
These discoveries raise new questions about the validity of the committee’s findings, published in 13 peer-reviewed articles and held up by the league as scientific evidence that brain injuries did not cause long-term harm to its players. It is also unclear why the omissions went unchallenged by league officials, by the epidemiologist whose job it was to ensure accurate data collection and by the editor of the medical journal that published the studies."

Wednesday, February 17, 2016

Balancing Benefits and Risks of Immortal Data: Participants’ Views of Open Consent in the Personal Genome Project; Hastings Center Report, 12/17/15

Oscar A. Zarate, Julia Green Brody, Phil Brown, Monica D. Ramirez-Andreotta, Laura Perovich and Jacob Matz, Hastings Center Report; Balancing Benefits and Risks of Immortal Data: Participants’ Views of Open Consent in the Personal Genome Project:
"Abstract
An individual's health, genetic, or environmental-exposure data, placed in an online repository, creates a valuable shared resource that can accelerate biomedical research and even open opportunities for crowd-sourcing discoveries by members of the public. But these data become “immortalized” in ways that may create lasting risk as well as benefit. Once shared on the Internet, the data are difficult or impossible to redact, and identities may be revealed by a process called data linkage, in which online data sets are matched to each other. Reidentification (re-ID), the process of associating an individual's name with data that were considered deidentified, poses risks such as insurance or employment discrimination, social stigma, and breach of the promises often made in informed-consent documents. At the same time, re-ID poses risks to researchers and indeed to the future of science, should re-ID end up undermining the trust and participation of potential research participants.
The ethical challenges of online data sharing are heightened as so-called big data becomes an increasingly important research tool and driver of new research structures. Big data is shifting research to include large numbers of researchers and institutions as well as large numbers of participants providing diverse types of data, so the participants’ consent relationship is no longer with a person or even a research institution. In addition, consent is further transformed because big data analysis often begins with descriptive inquiry and generation of a hypothesis, and the research questions cannot be clearly defined at the outset and may be unforeseeable over the long term. In this article, we consider how expanded data sharing poses new challenges, illustrated by genomics and the transition to new models of consent. We draw on the experiences of participants in an open data platform—the Personal Genome Project—to allow study participants to contribute their voices to inform ethical consent practices and protocol reviews for big-data research."

Friday, January 29, 2016

Karolinska Institute may reopen ethics inquiry into work of pioneering surgeon; Science, 1/29/16

Gretchen Vogel, Science; Karolinska Institute may reopen ethics inquiry into work of pioneering surgeon:
"A documentary on Swedish Television (SVT) has prompted the Karolinska Institute (KI) in Stockholm to consider reopening its investigation into possible misconduct by surgeon Paolo Macchiarini. After an investigation last year into Macchiarini’s work at KI, where he performed experimental trachea surgery on three patients, Vice-Chancellor Anders Hamsten concluded that the surgeon had not committed misconduct, although some of his work did “not meet the university’s high quality standards in every respect.” But the documentary has raised new concerns by suggesting that Macchiarini misled patients."

Wednesday, May 27, 2015

The University of Minnesota’s Medical Research Mess; New York Times, 5/26/15

Carl Elliott, New York Times; The University of Minnesota’s Medical Research Mess:
"These days, of course, medical research is not just a scholarly affair. It is also a global, multibillion-dollar business enterprise, powered by the pharmaceutical and medical-device industries. The ethical problem today is not merely that these corporations have plenty of money to grease the wheels of university research. It’s also that researchers themselves are often given powerful financial incentives to do unethical things: pressure vulnerable subjects to enroll in studies, fudge diagnoses to recruit otherwise ineligible subjects and keep subjects in studies even when they are doing poorly.
In what other potentially dangerous industry do we rely on an honor code to keep people safe? Imagine if inspectors never actually set foot in meatpacking plants or coal mines, but gave approvals based entirely on paperwork filled out by the owners.
With so much money at stake in drug research, research subjects need a full-blown regulatory system. I.R.B.s should be replaced with oversight bodies that are fully independent — both financially and institutionally — of the research they are overseeing. These bodies must have the staffing and the authority to monitor research on the ground. And they must have the power to punish researchers who break the rules and institutions that cover up wrongdoing."

Thursday, February 5, 2015

A Failed Trial in Africa Raises Questions About How to Test H.I.V. Drugs; New York Times, 2/4/15

Donald G. McNeil Jr., New York Times; A Failed Trial in Africa Raises Questions About How to Test H.I.V. Drugs:
"The surprising failure of a large clinical trial of H.I.V.-prevention methods in Africa — and the elaborate deceptions employed by the women in it — have opened an ethical debate about how to run such studies in poor countries and have already changed the design of some that are now underway."

Thursday, July 3, 2014

Did Facebook's experiment violate ethics?; CNN, 7/2/14

Robert Klitzman, CNN; Did Facebook's experiment violate ethics?:
"Editor's note: Robert Klitzman is a professor of psychiatry and director of the Masters of Bioethics Program at Columbia University. He is author of the forthcoming book, "The Ethics Police?: The Struggle to Make Human Research Safe." The opinions expressed in this commentary are solely those of the author...
In 1974, following revelations of ethical violations in the Tuskegee Syphilis study, Congress passed the National Research Act. At Tuskegee, researchers followed African-American men with syphilis for decades and did not tell the subjects when penicillin became available as an effective treatment. The researchers feared that the subjects, if informed, would take the drug and be cured, ending the experiment.
Public outcry led to federal regulations governing research on humans, requiring informed consent. These rules pertain, by law, to all studies conducted using federal funds, but have been extended by essentially all universities and pharmaceutical and biotech companies in this country to cover all research on humans, becoming the universally-accepted standard.
According to these regulations, all research must respect the rights of individual research subjects, and scientific investigators must therefore explain to participants the purposes of the study, describe the procedures (and which of these are experimental) and "any reasonably foreseeable risks or discomforts."
Facebook followed none of these mandates. The company has argued that the study was permissible because the website's data use policy states, "we may use the information we receive about you...for internal operations, including troubleshooting, data analysis, testing, research and service improvement," and that "we may make friend suggestions, pick stories for your News Feed or suggest people to tag in photos."
But while the company is not legally required to follow this law, two of the study's three authors are affiliated with universities -- Cornell and the University of California at San Francisco -- that publicly uphold this standard."

Wednesday, July 2, 2014

Facebook experiment may have broken UK law; Aljazeera, 7/2/14

Aljazeera; Facebook experiment may have broken UK law:
"A British data regulator has been investigating whether Facebook Inc broke data protection laws when it allowed researchers to conduct a psychological experiment on nearly 700,000 users of the social network, the Financial Times reported.
The Information Commissioner's Office (ICO), which monitors how personal data is used, is probing the controversial experiment and plans to ask Facebook questions, the newspaper reported on Tuesday."

Should Facebook Manipulate Users?: Jaron Lanier on Lack of Transparency in Facebook Study; New York Times, 6/30/14

Jaron Lanier, New York Times; Should Facebook Manipulate Users?: Jaron Lanier on Lack of Transparency in Facebook Study:
"Research with human subjects is generally governed by strict ethical standards, including the informed consent of the people who are studied. Facebook’s generic click-through agreement, which almost no one reads and which doesn’t mention this kind of experimentation, was the only form of consent cited in the paper. The subjects in the study still, to this day, have not been informed that they were in the study. If there had been federal funding, such a complacent notion of informed consent would probably have been considered a crime. Subjects would most likely have been screened so that those at special risk would be excluded or handled with extra care.
This is only one early publication about a whole new frontier in the manipulation of people, and Facebook shouldn’t be singled out as a villain. All researchers, whether at universities or technology companies, need to focus more on the ethics of how we learn to improve our work.
To promote the relevance of their study, the researchers noted that emotion was relevant to human health, and yet the study didn’t measure any potential health effects of the controlled manipulation of emotions."

Facebook Study Sparks Soul-Searching and Ethical Questions: Incident Shines Light on How Companies, Researchers Tap Data Created Online; Wall Street Journal, 6/30/14

Reed Albergotti and Elizabeth Dwoskin, Wall Street Journal; Facebook Study Sparks Soul-Searching and Ethical Questions: Incident Shines Light on How Companies, Researchers Tap Data Created Online

Tuesday, July 1, 2014

Facebook Added 'Research' To User Agreement 4 Months After Emotion Manipulation Study; Forbes, 6/30/14

Kashmir Hill, Forbes; Facebook Added 'Research' To User Agreement 4 Months After Emotion Manipulation Study:
"Unless you’ve spent the last couple of days in a Faraday pouch under a rock, you’ve heard about Facebook’s controversial ‘emotion manipulation’ study. Facebook data scientist Adam Kramer ran an experiment on 689,003 Facebook users two and a half years ago to find out whether emotions were contagious on the social network. It lasted for a week in January 2012. It came to light recently when he and his two co-researchers from Cornell University and University of California-SF published their study describing how users’ moods changed when Facebook curated the content of their News Feeds to highlight the good, happy stuff (for the lucky group) vs. the negative, depressing stuff (for the unlucky and hopefully-not-clinically-depressed group). The idea of Facebook manipulating users’ emotions for science — without telling them or explicitly asking them first — rubbed many the wrong way. Critics said Facebook should get “informed consent” for a study like this — asking people if they’re okay being in a study and then telling them what was being studied afterwards. Defenders said, “Hey, the Newsfeed gets manipulated all the time. What’s the big deal?” Critics and defenders alike pointed out that Facebook’s “permission” came from its Data Use Policy which among its thousands of words informs people that their information might be used for “internal operations,” including “research.” However, we were all relying on what Facebook’s data policy says now. In January 2012, the policy did not say anything about users potentially being guinea pigs made to have a crappy day for science, nor that “research” is something that might happen on the platform.
Four months after this study happened, in May 2012, Facebook made changes to its data use policy, and that’s when it introduced this line about how it might use your information: “For internal operations, including troubleshooting, data analysis, testing, research and service improvement.”"