Showing posts with label researchers. Show all posts

Tuesday, April 23, 2024

New Group Joins the Political Fight Over Disinformation Online; The New York Times, April 22, 2024

Steven Lee Myers et al., The New York Times; New Group Joins the Political Fight Over Disinformation Online

"Many of the nation’s most prominent researchers, facing lawsuits, subpoenas and physical threats, have pulled back.

“More and more researchers were getting swept up by this, and their institutions weren’t either allowing them to respond or responding in a way that really just was not rising to meet the moment,” Ms. Jankowicz said in an interview. “And the problem with that, obviously, is that if we don’t push back on these campaigns, then that’s the prevailing narrative.”

That narrative is prevailing at a time when social media companies have abandoned or cut back efforts to enforce their own policies against certain types of content.

Many experts have warned that the problem of false or misleading content is only going to increase with the advent of artificial intelligence.

“Disinformation will remain an issue as long as the strategic gains of engaging in it, promoting it and profiting from it outweigh consequences for spreading it,” Common Cause, the nonpartisan public interest group, wrote in a report published last week that warned of a new wave of disinformation around this year’s vote."

Monday, March 25, 2024

Judge dismisses Elon Musk's suit against hate speech researchers; NPR, March 25, 2024

NPR; Judge dismisses Elon Musk's suit against hate speech researchers

"A federal judge has dismissed X owner Elon Musk's lawsuit against a research group that documented an uptick in hate speech on the social media site, saying the organization's reports on the platform formerly known as Twitter were protected by the First Amendment.

Musk's suit "is so unabashedly and vociferously about one thing that there can be no mistaking that purpose," wrote U.S. District Judge Charles Breyer in his Monday ruling. "This case is about punishing the Defendants for their speech."

Amid an advertiser boycott of X last year, Musk sued the research and advocacy organization Center for Countering Digital Hate, alleging it violated the social media site's terms of service in gathering data for its reports."

Thursday, March 21, 2024

Canada moves to protect coral reef that scientists say ‘shouldn’t exist’; The Guardian, March 15, 2024

The Guardian; Canada moves to protect coral reef that scientists say ‘shouldn’t exist’

"For generations, members of the Kitasoo Xai’xais and Heiltsuk First Nations, two communities off the Central Coast region of British Columbia, had noticed large groups of rockfish congregating in a fjord system.

In 2021, researchers and the First Nations, in collaboration with the Canadian government, deployed a remote-controlled submersible to probe the depths of the Finlayson Channel, about 300 miles north-west of Vancouver.

On the last of nearly 20 dives, the team made a startling discovery – one that has only recently been made public...

The discovery marks the latest in a string of instances in which Indigenous knowledge has directed researchers to areas of scientific or historic importance. More than a decade ago, Inuk oral historian Louie Kamookak compared Inuit stories with explorers’ logbooks and journals to help locate Sir John Franklin’s lost ships, HMS Erebus and HMS Terror. In 2014, divers located the wreck of the Erebus in a spot Kamookak suggested they search, and using his directions found the Terror two years later."

Friday, October 13, 2023

Researchers use AI to read word on ancient scroll burned by Vesuvius; The Guardian, October 12, 2023

The Guardian; Researchers use AI to read word on ancient scroll burned by Vesuvius

"When the blast from the eruption of Mount Vesuvius reached Herculaneum in AD79, it burned hundreds of ancient scrolls to a crisp in the library of a luxury villa and buried the Roman town in ash and pumice.

The disaster appeared to have destroyed the scrolls for good, but nearly 2,000 years later researchers have extracted the first word from one of the texts, using artificial intelligence to peer deep inside the delicate, charred remains.

The discovery was announced on Thursday by Prof Brent Seales, a computer scientist at the University of Kentucky, and others who launched the Vesuvius challenge in March to accelerate the reading of the texts. Backed by Silicon Valley investors, the challenge offers cash prizes to researchers who extract legible words from the carbonised scrolls." 

Thursday, August 3, 2023

Is facial recognition identifying you? Are there ‘dog whistles’ in ChatGPT? Ethics in artificial intelligence gets unpacked; Northeastern Global News, August 3, 2023

Northeastern Global News; Is facial recognition identifying you? Are there ‘dog whistles’ in ChatGPT? Ethics in artificial intelligence gets unpacked

"The graduate-level program at Northeastern is designed to teach researchers how to examine artificial intelligence and data systems through an ethical framework. The course is conducted by the Ethics Institute, an interdisciplinary effort supported by the Office of the Provost, the College of Social Sciences and Humanities (CSSH) and the Department of Philosophy and Religion...

The aim of the course was to both provide students with some background on the technical components underpinning these systems as well as the frameworks used to adequately analyze their ethical impact. 

Throughout the seminar, students were tasked each day with providing oral arguments based on the day’s reading. Each student was also tasked with developing an original thesis around the topic of discussion and presenting it in the final week of class.

One central topic of discussion was algorithmic fairness, Creel says."  

Thursday, July 13, 2023

RFK Jr. is building a presidential campaign around conspiracy theories; NPR, July 13, 2023

NPR; RFK Jr. is building a presidential campaign around conspiracy theories

"What's not up for debate for scientists, researchers and public health officials is Kennedy's long track record of undermining science and spreading dubious claims.

"He has an enormous platform. He is going to, over the next many months, do a series of town hall meetings where he will continue to put bad information out there that will cause people to make bad decisions for themselves and their families, again putting children at risk and causing children to suffer," Offit said. "Because it's always the most vulnerable among us who suffer our ignorance.""

Tuesday, June 20, 2023

G.O.P. Targets Researchers Who Study Disinformation Ahead of 2024 Election; The New York Times, June 19, 2023

Steven Lee Myers et al., The New York Times; G.O.P. Targets Researchers Who Study Disinformation Ahead of 2024 Election

"On Capitol Hill and in the courts, Republican lawmakers and activists are mounting a sweeping legal campaign against universities, think tanks and private companies that study the spread of disinformation, accusing them of colluding with the government to suppress conservative speech online."

Saturday, April 29, 2023

Editors quit top neuroscience journal to protest against open-access charges; Nature, April 21, 2023

 Katharine Sanderson, Nature; Editors quit top neuroscience journal to protest against open-access charges

"More than 40 editors have resigned from two leading neuroscience journals in protest against what the editors say are excessively high article-processing charges (APCs) set by the publisher. They say that the fees, which publishers use to cover publishing services and in some cases make money, are unethical. The publisher, Dutch company Elsevier, says that its fees provide researchers with publishing services that are above average quality for below average price. The editors plan to start a new journal hosted by the non-profit publisher MIT Press.

The decision to resign came about after many discussions among the editors, says Stephen Smith, a neuroscientist at the University of Oxford, UK, and editor-in-chief of one of the journals, NeuroImage. “Everyone agreed that the APC was unethical and unsustainable,” says Smith, who will lead the editorial team of the new journal, Imaging Neuroscience, when it launches.

The 42 academics who made up the editorial teams at NeuroImage and its companion journal NeuroImage: Reports announced their resignations on 17 April. The journals are open access and require authors to pay a fee for publishing services. The APC for NeuroImage is US$3,450; NeuroImage: Reports charges $900, which will double to $1,800 from 31 May. Elsevier, based in Amsterdam, says that the APCs cover the costs associated with publishing an article in an open-access journal, including editorial and peer-review services, copyediting, typesetting, archiving, indexing, marketing and administrative costs. Andrew Davis, Elsevier’s vice-president of corporate communications, says that NeuroImage’s fee is less than that of the nearest comparable journal in its field, and that the publisher’s APCs are “set in line with our policy [of] providing above average quality for below average price”."

Tuesday, March 7, 2023

WHO kicks off deliberations on ethical framework and tools for social listening and infodemic management; World Health Organization (WHO), February 10, 2023

World Health Organization (WHO); WHO kicks off deliberations on ethical framework and tools for social listening and infodemic management

"WHO has convened a panel of experts to discuss ethical considerations in social listening and infodemic management. The aim of the ethics expert panel is to reach a consensus on ethical principles for social listening and other infodemic management activities and provide recommendations for health authorities and researchers.

The panel brings together experts from academia, health authorities, and civil society, with a wide range of expertise such as in biomedical ethics, data privacy, law, digital sociology, digital health, epidemiology, health communication, health promotion, and media studies.

An infodemic is an overabundance of information, including misinformation, that surges during a health emergency. During a health emergency, people seek, receive, process and act on information differently than at other times, which makes it even more important to use evidence-based strategies in response. Infodemic management practice, underpinned by the science of infodemiology, has rapidly evolved in recent years. Tools and experience developed during the COVID-19 pandemic response have already been applied to other outbreaks, such as Ebola, polio and cholera.

Social listening in public health is the process of gathering information about people's questions, concerns, and circulating narratives and misinformation about health from online and offline data sources. Data gleaned from social media platforms are being used in a number of ways to identify and understand outbreaks, geographic and demographic trends, networks, sentiment and behavioral responses to public health emergencies. Offline data collection may include rapid surveys, town halls, or interviews with people in vulnerable groups, communities of focus and specific populations. These data are then integrated with other data sources from the health system (such as health information systems) and outside of it (mobility data) to generate infodemic insights and inform strategies to manage infodemics.

However, the collection and use of this data presents ethical challenges, such as privacy and consent, and there is currently no agreed-upon ethical framework for social listening and infodemic management. 

The panel will focus on issues such as data control, commercialization, transparency, and accountability, and will consider ethical guidelines for both online and offline data collection, analysis and reporting. The goal is to develop an ethical framework for social listening and infodemic management to guide health authorities when planning and standing up infodemic insights teams and activities, as well as for practitioners when planning and implementing social listening and infodemic management."

Monday, May 30, 2022

Nature addresses helicopter research and ethics dumping; Nature, June 2, 2022

Nature; Nature addresses helicopter research and ethics dumping

"Exploitative research practices, sadly, come in all shapes and sizes. ‘Helicopter research’ occurs when researchers from high-income settings, or who are otherwise privileged, conduct studies in lower-income settings or with groups who are historically marginalized, with little or no involvement from those communities or local researchers in the conceptualization, design, conduct or publication of the research. ‘Ethics dumping’ occurs when similarly privileged researchers export unethical or unpalatable experiments and studies to lower-income or less-privileged settings with different ethical standards or less oversight."

Monday, March 7, 2022

Opinion: Genomics’ Ethical Gray Areas Are Harming the Developing World; Undark, February 24, 2022

Dyna Rochmyaningsih, Undark; Opinion: Genomics’ Ethical Gray Areas Are Harming the Developing World

"Various ethics guidelines on health-related research — including UNESCO’s International Declaration on Human Genetic Data and international ethical guidelines published by the Council for International Organizations of Medical Sciences, or CIOMS, in collaboration with the World Health Organization — advise researchers to seek approval from an ethics committee in the host country. Such reviews are critical, bioethicists say, because cultural and social considerations of research ethics might vary between countries. In low-resource countries especially, ethics reviews are essential to protect the interests of participants and ensure that data are used in ways that benefit local communities.

Nowhere in Larena and Jakobsson’s paper, or in any of the subsequent publications based on the Philippines study, does the Uppsala team mention obtaining such an ethics approval in the Philippines — and Philippines officials say they never granted the team such an approval."

Tuesday, March 1, 2022

How to protect the first ‘CRISPR babies’ prompts ethical debate; Nature, February 25, 2022

Smriti Mallapaty, Nature; How to protect the first ‘CRISPR babies’ prompts ethical debate

"Two prominent bioethicists in China are calling on the government to set up a research centre dedicated to ensuring the well-being of the first children born with edited genomes. Scientists have welcomed the discussion, but many are concerned that the pair’s approach would lead to unnecessary surveillance of the children.

The proposal comes ahead of the possibly imminent release from prison of He Jiankui, the researcher who in 2018 shocked the world by announcing that he had created babies with altered genomes. He’s actions were widely condemned by scientists around the world, who called for a global moratorium on editing embryos destined for implantation. Several ethics committees have since concluded that the technology should not be used to make changes that can be passed on."

Saturday, February 26, 2022

World's first octopus farm stirs ethical debate; Reuters, February 23, 2022

Nathan Allen and Guillermo Martinez, Reuters; World's first octopus farm stirs ethical debate

"Since the 2020 documentary "My Octopus Teacher" captured the public imagination with its tale of a filmmaker's friendship with an octopus, concern for their wellbeing has grown.

Last year, researchers at the London School of Economics concluded from a review of 300 scientific studies that octopuses were sentient beings capable of experiencing distress and happiness, and that high-welfare farming would be impossible.

Raul Garcia, who heads the WWF conservation organisation's fisheries operations in Spain, agrees.

"Octopuses are extremely intelligent and extremely curious. And it's well known they are not happy in conditions of captivity," he told Reuters."

Friday, February 18, 2022

The government dropped its case against Gang Chen. Scientists still see damage done; WBUR, February 16, 2022

Max Larkin, WBUR; The government dropped its case against Gang Chen. Scientists still see damage done

"When federal prosecutors dropped all charges against MIT professor Gang Chen in late January, many researchers rejoiced in Greater Boston and beyond.

Chen had spent the previous year fighting charges that he had lied and omitted information on U.S. federal grant applications. His vindication was a setback for the "China Initiative," a controversial Trump-era legal campaign aimed at cracking down on the theft of American research and intellectual property by the Chinese government.

Researchers working in the United States say the China Initiative has harmed both their fellow scientists and science itself — as a global cooperative endeavor. But as U.S.-China tensions remain high, the initiative remains in place." 

Saturday, February 5, 2022

Two members of Google’s Ethical AI group leave to join Timnit Gebru’s nonprofit; The Verge, February 2, 2022

Emma Roth, The Verge; Two members of Google’s Ethical AI group leave to join Timnit Gebru’s nonprofit

"Two members of Google’s Ethical AI group have announced their departures from the company, according to a report from Bloomberg. Senior researcher Alex Hanna and software engineer Dylan Baker will join Timnit Gebru’s nonprofit research institute, Distributed AI Research (DAIR)...

In a post announcing her resignation on Medium, Hanna criticizes the “toxic” work environment at Google, and draws attention to a lack of representation of Black women at the company."

Friday, February 4, 2022

IRS plan to scan your face prompts anger in Congress, confusion among taxpayers; The Washington Post, January 27, 2022

Drew Harwell, The Washington Post; IRS plan to scan your face prompts anger in Congress, confusion among taxpayers

"The $86 million ID.me contract with the IRS also has alarmed researchers and privacy advocates who say they worry about how Americans’ facial images and personal data will be safeguarded in the years to come. There is no federal law regulating how the data can be used or shared. While the IRS couldn’t say what percentage of taxpayers use the agency’s website, internal data show it is one of the federal government’s most-viewed websites, with more than 1.9 billion visits last year."

Sunday, January 30, 2022

Massive open index of scholarly papers launches; Nature, January 24, 2022

Dalmeet Singh Chawla, Nature; Massive open index of scholarly papers launches

"An ambitious free index of more than 200 million scientific documents that catalogues publication sources, author information and research topics has been launched.

The index, called OpenAlex after the ancient Library of Alexandria in Egypt, also aims to chart connections between these data points to create a comprehensive, interlinked database of the global research system, say its founders. The database, which launched on 3 January, is a replacement for Microsoft Academic Graph (MAG), a free alternative to subscription-based platforms such as Scopus, Dimensions and Web of Science that was discontinued at the end of 2021."

Thursday, May 20, 2021

A Little-Known Statute Compels Medical Research Transparency. Compliance Is Pretty Shabby.; On The Media, April 21, 2021

On The Media; A Little-Known Statute Compels Medical Research Transparency. Compliance Is Pretty Shabby.

"Evidence-based medicine requires just that: evidence. Access to the collective pool of knowledge produced by clinical trials is what allows researchers to safely and effectively design future studies. It's what allows doctors to make the most informed decisions for their patients.

Since 2007, researchers have been required by law to publish the findings of any clinical trial with human subjects within a year of the trial's conclusion. Over a decade later, even the country's most renowned research institutions sport poor reporting records. This week, Bob spoke with Charles Piller, an investigative journalist at Science Magazine who's been documenting this dismal state of affairs since 2015. He recently published an op-ed in the New York Times urging President Biden to make good on his 2016 "promise" to start withholding funds to force compliance."

Tuesday, April 27, 2021

AI unlocks ancient Dead Sea Scrolls mystery; BBC News, April 22, 2021

 BBC News; AI unlocks ancient Dead Sea Scrolls mystery

"Researchers at the University of Groningen in the Netherlands examined the Isaiah scroll using "cutting edge" pattern recognition and AI. They analysed a single Hebrew letter, aleph, which appears more than 5,000 times in the scroll."

Friday, July 17, 2020

If AI is going to help us in a crisis, we need a new kind of ethics; MIT Technology Review, June 24, 2020

MIT Technology Review; If AI is going to help us in a crisis, we need a new kind of ethics

Ethics for urgency means making ethics a core part of AI rather than an afterthought, says Jess Whittlestone.

"What needs to change?

We need to think about ethics differently. It shouldn’t be something that happens on the side or afterwards—something that slows you down. It should simply be part of how we build these systems in the first place: ethics by design...

You’ve said that we need people with technical expertise at all levels of AI design and use. Why is that?

I’m not saying that technical expertise is the be-all and end-all of ethics, but it’s a perspective that needs to be represented. And I don’t want to sound like I’m saying all the responsibility is on researchers, because a lot of the important decisions about how AI gets used are made further up the chain, by industry or by governments.

But I worry that the people who are making those decisions don’t always fully understand the ways it might go wrong. So you need to involve people with technical expertise. Our intuitions about what AI can and can’t do are not very reliable.

What you need at all levels of AI development are people who really understand the details of machine learning to work with people who really understand ethics. Interdisciplinary collaboration is hard, however. People with different areas of expertise often talk about things in different ways. What a machine-learning researcher means by privacy may be very different from what a lawyer means by privacy, and you can end up with people talking past each other. That’s why it’s important for these different groups to get used to working together."