Showing posts with label trust.

Thursday, October 3, 2024

X’s AI chatbot spread voter misinformation – and election officials fought back; The Guardian, September 12, 2024

Rachel Leingang, The Guardian; X’s AI chatbot spread voter misinformation – and election officials fought back


"Finding the source – and working to correct it – served as a test case of how election officials and artificial intelligence companies will interact during the 2024 presidential election in the US amid fears that AI could mislead or distract voters. And it showed the role Grok, specifically, could play in the election, as a chatbot with fewer guardrails to prevent the generating of more inflammatory content.


A group of secretaries of state and the organization that represents them, the National Association of Secretaries of State, contacted Grok and X to flag the misinformation. But the company didn’t work to correct it immediately, instead giving the equivalent of a shoulder shrug, said Steve Simon, the Minnesota secretary of state. “And that struck, I think it’s fair to say all of us, as really the wrong response,” he said.


Thankfully, this wrong answer was relatively low-stakes: it would not have prevented people from casting a ballot. But the secretaries took a strong position quickly because of what could come next.


“In our minds, we thought, well, what if the next time Grok makes a mistake, it is higher stakes?” Simon said. “What if the next time the answer it gets wrong is, can I vote, where do I vote … what are the hours, or can I vote absentee? So this was alarming to us.”


Especially troubling was the fact that the social media platform itself was spreading false information, rather than users spreading misinformation using the platform.


The secretaries took their effort public. Five of the nine secretaries in the group signed on to a public letter to the platform and its owner, Elon Musk. The letter called on X to have its chatbot take a similar position as other chatbot tools, like ChatGPT, and direct users who ask Grok election-related questions to a trusted nonpartisan voting information site, CanIVote.org.


The effort worked. Grok now directs users to a different website, vote.gov, when asked about elections."

Tuesday, September 17, 2024

Disinformation, Trust, and the Role of AI: The Daniel Callahan Annual Lecture; The Hastings Center, September 12, 2024

 The Hastings Center; Disinformation, Trust, and the Role of AI: The Daniel Callahan Annual Lecture

"A Moderated Discussion on DISINFORMATION, TRUST, AND THE ROLE OF AI: Threats to Health & Democracy, The Daniel Callahan Annual Lecture

Panelists: Reed Tuckson, MD, FACP, Chair & Co-Founder of the Black Coalition Against Covid, Chair and Co-Founder of the Coalition For Trust In Health & Science Timothy Caulfield, LB, LLM, FCAHS, Professor, Faculty of Law and School of Public Health, University of Alberta; Best-selling author & TV host Moderator: Vardit Ravitsky, PhD, President & CEO, The Hastings Center"

Wednesday, September 4, 2024

NEH Awards $2.72 Million to Create Research Centers Examining the Cultural Implications of Artificial Intelligence; National Endowment for the Humanities (NEH), August 27, 2024

Press Release, National Endowment for the Humanities (NEH); NEH Awards $2.72 Million to Create Research Centers Examining the Cultural Implications of Artificial Intelligence

"The National Endowment for the Humanities (NEH) today announced grant awards totaling $2.72 million for five colleges and universities to create new humanities-led research centers that will serve as hubs for interdisciplinary collaborative research on the human and social impact of artificial intelligence (AI) technologies.

As part of NEH’s third and final round of grant awards for FY2024, the Endowment made its inaugural awards under the new Humanities Research Centers on Artificial Intelligence program, which aims to foster a more holistic understanding of AI in the modern world by creating scholarship and learning centers across the country that spearhead research exploring the societal, ethical, and legal implications of AI. 

Institutions in California, New York, North Carolina, Oklahoma, and Virginia were awarded NEH grants to establish the first AI research centers and pilot two or more collaborative research projects that examine AI through a multidisciplinary humanities lens. 

The new Humanities Research Centers on Artificial Intelligence grant program is part of NEH’s agencywide Humanities Perspectives on Artificial Intelligence initiative, which supports humanities projects that explore the impacts of AI-related technologies on truth, trust, and democracy; safety and security; and privacy, civil rights, and civil liberties. The initiative responds to President Biden’s Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence, which establishes new standards for AI safety and security, protects Americans’ privacy, and advances equity and civil rights."

Thursday, August 29, 2024

Disinformation, Trust, and the Role of AI: Threats to Health & Democracy; The Hastings Center, September 9, 2024

The Hastings Center; Disinformation, Trust, and the Role of AI: Threats to Health & Democracy

"Join us for The Daniel Callahan Annual Lecture, hosted by The Hastings Center at Rockefeller University’s beautiful campus in New York. Hastings Center President Vardit Ravitsky will moderate a discussion with experts Reed Tuckson and Timothy Caulfieldon disinformation, trust, and the role of AI, focusing on current and future threats to health and democracy. The event will take place on Monday, September 9, 5 pm. Learn more and register...

A Moderated Discussion on DISINFORMATION, TRUST, AND THE ROLE OF AI: Threats to Health & Democracy, The Daniel Callahan Annual Lecture

Panelists
Reed Tuckson, MD, FACP, Chair & Co-Founder of the Black Coalition Against Covid, Chair and Co-Founder of the Coalition For Trust In Health & Science
Timothy Caulfield, LLB, LLM, FCAHS, Professor, Faculty of Law and School of Public Health, University of Alberta; Best-selling author & TV host

Moderator:
Vardit Ravitsky, PhD, President & CEO, The Hastings Center"

Tuesday, July 16, 2024

Peter Buxtun, whistleblower who exposed Tuskegee syphilis study, dies aged 86; Associated Press via The Guardian, July 15, 2024

 Associated Press via The Guardian; Peter Buxtun, whistleblower who exposed Tuskegee syphilis study, dies aged 86

"Peter Buxtun, the whistleblower who revealed that the US government allowed hundreds of Black men in rural Alabama to go untreated for syphilis in what became known as the Tuskegee study, has died. He was 86...

Buxtun is revered as a hero to public health scholars and ethicists for his role in bringing to light the most notorious medical research scandal in US history. Documents that Buxtun provided to the Associated Press, and its subsequent investigation and reporting, led to a public outcry that ended the study in 1972.

Forty years earlier, in 1932, federal scientists began studying 400 Black men in Tuskegee, Alabama, who were infected with syphilis. When antibiotics became available in the 1940s that could treat the disease, federal health officials ordered that the drugs be withheld. The study became an observation of how the disease ravaged the body over time...

In his complaints to federal health officials, he drew comparisons between the Tuskegee study and medical experiments Nazi doctors had conducted on Jews and other prisoners. Federal scientists did not believe they were guilty of the same kind of moral and ethical sins, but after the Tuskegee study was exposed, the government put in place new rules about how it conducts medical research. Today, the study is often blamed for the unwillingness of some African Americans to participate in medical research.

“Peter’s life experiences led him to immediately identify the study as morally indefensible and to seek justice in the form of treatment for the men. Ultimately, he could not relent,” said the CDC’s Pestorius."

Monday, June 17, 2024

Video Clip: The Death of Truth; C-Span, June 9, 2024

 C-Span; Video Clip: The Death of Truth

"Steven Brill, a journalist and NewsGuard Co-CEO, talked about his new book on online misinformation and social media, and their impact on U.S. politics and democracy."

Tuesday, June 4, 2024

Sure, Google’s AI overviews could be useful – if you like eating rocks; The Guardian, June 1, 2024

The Guardian; Sure, Google’s AI overviews could be useful – if you like eating rocks

"To date, some of this searching suggests subhuman capabilities, or perhaps just human-level gullibility. At any rate, users have been told that glue is useful for ensuring that cheese sticks to pizza, that they could stare at the sun for for up to 30 minutes, and that geologists suggest eating one rock per day (presumably to combat iron deficiency). Memo to Google: do not train your AI on Reddit or the Onion."

Thursday, April 4, 2024

Ethics in an age of disinformation: Free webinar series from the National Press Club Journalism Institute; National Press Club Journalism Institute, April 4, 2024

Press Release, National Press Club Journalism Institute; Ethics in an age of disinformation: Free webinar series from the National Press Club Journalism Institute 

"The National Press Club Journalism Institute is pleased to announce a free, four-part webinar series focused on ethics in the age of disinformation. These discussions are geared toward equipping journalists and the public with tools to combat mis and disinformation efforts aimed at disrupting journalism and democracy.

All of these webinars are free and open to the public and are designed to provide tools and best practices to support ethical, trustworthy journalism."

Friday, August 25, 2023

Who owns your cells? Legacy of Henrietta Lacks raises ethical questions about profits from medical research; Cleveland.com, August 18, 2023

Cleveland.com; Who owns your cells? Legacy of Henrietta Lacks raises ethical questions about profits from medical research

"While the legal victory may have given the family some closure, it has raised concerns for bioethicists in Cleveland and elsewhere.

The case raises important questions about owning one’s own body; whether individuals are entitled to a share of the profits from medical discoveries derived from research on their own cells, organs and genetic material.

But it also offers a tremendous opportunity to not only acknowledge the ethical failures of the past and the seeds of mistrust they have sown, but to guide society toward building better, more trustworthy medical institutions, said Aaron Goldenberg, who directs the Bioethics Center for Community Health and Genomic Equity (CHANGE) at Case Western Reserve University."

Friday, August 11, 2023

A New Frontier for Travel Scammers: A.I.-Generated Guidebooks; The New York Times, August 5, 2023

Seth Kugel, The New York Times; A New Frontier for Travel Scammers: A.I.-Generated Guidebooks

"Though she didn’t know it at the time, Ms. Kolsky had fallen victim to a new form of travel scam: shoddy guidebooks that appear to be compiled with the help of generative artificial intelligence, self-published and bolstered by sham reviews, that have proliferated in recent months on Amazon.

The books are the result of a swirling mix of modern tools: A.I. apps that can produce text and fake portraits; websites with a seemingly endless array of stock photos and graphics; self-publishing platforms — like Amazon’s Kindle Direct Publishing — with few guardrails against the use of A.I.; and the ability to solicit, purchase and post phony online reviews, which runs counter to Amazon’s policies and may soon face increased regulation from the Federal Trade Commission.

The use of these tools in tandem has allowed the books to rise near the top of Amazon search results and sometimes garner Amazon endorsements such as “#1 Travel Guide on Alaska.”"

Friday, July 28, 2023

The Guardian’s editorial code has been updated – here’s what to expect; The Guardian, July 27, 2023

The Guardian; The Guardian’s editorial code has been updated – here’s what to expect

"Much has changed since 2011 – at the Guardian, in the way society shares information and opinions, and in the world at large. The updates reflect this. But as “the embodiment of the Guardian’s values”, which is how the editor-in-chief, Katharine Viner, described the code in an email to staff today, the standards by which journalists agree to be held accountable, while geared (as far as possible) to the modern environment, seek to maintain something immutable: trust."

Tuesday, December 6, 2022

The Supreme Court Needs Real Oversight; The Atlantic, December 5, 2022

Glenn Fine, The Atlantic; The Supreme Court Needs Real Oversight

"A series of recent events at the Supreme Court threatens to undermine trust and confidence in the institution and demonstrates the need for it to have a code of ethics and for better oversight within the judiciary...

First, a code of judicial ethics should apply to Supreme Court justices. The Supreme Court should explicitly state that the Judicial Code of Conduct applies to it, or implement a modified code that does.

Second, the justices should be more transparent about their recusal decisions. They should explain the reasoning for their decisions to recuse, or not to recuse, themselves in significant cases.

Third, the judiciary as a whole should be subject to inspector-general oversight—to investigate alleged misconduct and to promote efficiency throughout the judiciary’s administrative operations, not to second-guess any judicial opinion. An experienced, permanent, internal judiciary inspector general, potentially reporting to the chief justice, could be structured to ensure that the judiciary maintains its institutional independence but employs more effective oversight.

In short, the Court needs to assure the public that it is governed by ethical rules and that each justice is not voluntarily judging his or her own compliance with ethical requirements. Supreme Court justices are not above the law or ethical rules. The Court’s failure to adopt an ethical code and its resistance to oversight risk further decline in public trust and confidence."

Wednesday, May 25, 2022

Editorial: A code of ethics could help the Supreme Court maintain integrity; Chicago Tribune, May 23, 2022

"That integrity can be strengthened if the Supreme Court adopted a code of ethics that would help justices navigate potential instances of undue influence and other judicial tripwires.

Like umpires, the Supreme Court may not be infallible in our democracy but its judgments are final. If justices cannot display independence from outside influences, then perhaps a code of ethics can restore the confidence and trust in the body that has begun to wane among an increasing number of Americans."

Friday, May 6, 2022

What Is Happening to the People Falling for Crypto and NFTs; The New York Times, May 5, 2022

 Farhad Manjoo, The New York Times; What Is Happening to the People Falling for Crypto and NFTs

"In the past year Yuga Labs, the well-funded start-up that makes Bored Apes, has embarked on a parade of new and even farther-out digital spinoffs of its simians. Its latest ventures have highlighted the head-scratching, money-burning, broken-casino vibe of what’s being called the internet’s next big thing. Cryptocurrencies, blockchains, NFTs and the constellation of hyped-up technologies known as “web3” have been celebrated as a way to liberate the internet from the tech giants who control it now. Instead what’s happening with Bored Apes suggests they’re doing the opposite: polluting the digital world in a thick haze of errors, swindles and expensive, largely unregulated financial speculation that ruins whatever scrap of trust still remains online...

But how many people have to lose their shirts before we realize that web3 isn’t a solution to any of our problems?"

Monday, February 28, 2022

How to avoid falling for and spreading misinformation about Ukraine; The Washington Post, February 24, 2022

Heather Kelly, The Washington Post; How to avoid falling for and spreading misinformation about Ukraine

"Anyone with a phone and an Internet connection is able to watch the war in Ukraine unfold live online, or at least some version of it. Across social media, posts are flying up faster than most fact-checkers and moderators can handle, and they’re an unpredictable mix of true, fake, out of context and outright propaganda messages.

How do you know what to trust, what not to share and what to report? Tech companies have said they’re trying to do more to help users spot misinformation about Ukraine, with labels and fact checking. On Saturday, Facebook parent company Meta announced it was adding more fact-checkers in the region dedicated to posts about the war. It’s also warning users who attempt to share war-related photos when they’re more than a year old — a common type of misinformation.

Here are some basic tools everyone should use when consuming breaking news online."

Monday, January 3, 2022

Why your local library might be hiring a social worker; NPR, January 3, 2022

Darian Benson, NPR; Why your local library might be hiring a social worker

"For years, libraries have been a place people turn to for information to help them solve problems. But the challenges patrons are dealing with are increasingly beyond the scope of what most librarians are trained to handle — and that's where social workers can fill in the gaps."

Friday, December 31, 2021

Americans widely distrust Facebook, TikTok and Instagram with their data, poll finds; The Washington Post, December 22, 2021

The Washington Post; Americans widely distrust Facebook, TikTok and Instagram with their data, poll finds

"According to the survey, 72 percent of Internet users trust Facebook “not much” or “not at all” to responsibly handle their personal information and data on their Internet activity. About 6 in 10 distrust TikTok and Instagram, while slight majorities distrust WhatsApp and YouTube. Google, Apple and Microsoft receive mixed marks for trust, while Amazon is slightly positive with 53 percent trusting the company at least “a good amount.” (Amazon founder Jeff Bezos owns The Washington Post.)

Only 10 percent say Facebook has a positive impact on society, while 56 percent say it has a negative impact and 33 percent say its impact is neither positive nor negative. Even among those who use Facebook daily, more than three times as many say the social network has a negative rather than a positive impact."

Sunday, March 14, 2021

Zoom classes felt like teaching into a void — until I told my students why; The Washington Post, March 11, 2021

C. Thi Nguyen, The Washington Post; Zoom classes felt like teaching into a void — until I told my students why

"In the end, I see this as a question of informed choice. Given who I am, it’s very predicable that my teaching will get worse as more cameras go off. Students deserve to know that, and take that into account, in their own choices. I suspect that honesty is the best we can do right now.

This experience has also changed how I behave when I’m on the other side of the exchange — in the audience of an online lecture. In that situation, I would almost always prefer to turn my camera off. But now I go camera-on most of the time, because of my understanding of the impact of my decision on the speaker.

Right now, our knowledge of one another’s lives is slim, gathered as it is through impoverished channels like Zoom. When our connections are so tenuous, a little trust can go a long way."

Thursday, March 4, 2021

Emerging technologies pose ethical quandaries. Where does IT leadership fit in?; CIO Dive, February 22, 2021

Katie Malone, CIO Dive; Emerging technologies pose ethical quandaries. Where does IT leadership fit in?

""More organizations are seeing that trust is a measurement of profitability, of organizational health, of success," said Catherine Bannister, Tech Savvy and ethical tech leader at Deloitte. "This notion of ethics is becoming much more visible to stakeholders across the board and they are using that as a measure of trust, both internally and externally."

But there's no common definition for what ethical technology looks like and the conversation is ongoing. Instead, CIOs and other members of IT leadership are responsible for figuring out what tech ethics mean for their organizations in the near- and long-term. 

If an organization doesn't do its ethical due diligence, customers will catch on and trust will be diminished, according to Bannister."

Tuesday, August 11, 2020

The Anonymous Professor Who Wasn’t; The New York Times, August 4, 2020

Jonah Engel Bromwich, The New York Times; The Anonymous Professor Who Wasn’t

A professor at Arizona State University does not exist.

"Among scientists and academics, the shock of mourning was already laced with suspicion. Enough of them had unpleasant interactions with the combative account and were troubled by its inconsistencies and seeming about-turns.

“You have these internal alarms that are like, ‘Oh, I don’t trust you,’” said Julie Libarkin, the head of the Geocognition Research Laboratory at Michigan State University. “Kind of the same as when I worked with BethAnn.”"