Showing posts with label humanities. Show all posts

Friday, April 29, 2022

LSU to Embed Ethics in the Development of New Technologies, Including AI; LSU Office of Research and Economic Development, April 2022

Elsa Hahne, LSU Office of Research and Economic Development; LSU to Embed Ethics in the Development of New Technologies, Including AI

"“If we want to educate professionals who not only understand their professional obligations but become leaders in their fields, we need to make sure our students understand ethical conflicts and how to resolve them,” Goldgaber said. “Leaders don’t just do what they’re told—they make decisions with vision.”

The rapid development of new technologies has put researchers in her field, the world of Socrates and Rousseau, in the new and not-altogether-comfortable role of providing what she calls “ethics emergency services” when emerging capabilities have unintended consequences for specific groups of people.

“We can no longer rely on the traditional division of labor between STEM and the humanities, where it’s up to philosophers to worry about ethics,” Goldgaber said. “Nascent and fast-growing technologies, such as artificial intelligence, disrupt our everyday normative understandings, and most often, we lack the mechanisms to respond. In this scenario, it’s not always right to ‘stay in your lane’ or ‘just do your job.’”

Sunday, January 23, 2022

The Humanities Can't Save Big Tech From Itself; Wired, January 12, 2022

Wired; The Humanities Can't Save Big Tech From Itself

"I’ve been studying nontechnical workers in the tech and media industries for the past several years. Arguments to “bring in” sociocultural experts elide the truth that these roles and workers already exist in the tech industry and, in varied ways, always have. For example, many current UX researchers have advanced degrees in sociology, anthropology, and library and information sciences. And teachers and EDI (Equity, Diversity, and Inclusion) experts often occupy roles in tech HR departments.

Recently, however, the tech industry is exploring where nontechnical expertise might counter some of the social problems associated with their products. Increasingly, tech companies look to law and philosophy professors to help them through the legal and moral intricacies of platform governance, to activists and critical scholars to help protect marginalized users, and to other specialists to assist with platform challenges like algorithmic oppression, disinformation, community management, user wellness, and digital activism and revolutions. These data-driven industries are trying hard to augment their technical know-how and troves of data with social, cultural, and ethical expertise, or what I often refer to as “soft” data.

But you can add all of the soft data workers you want and little will change unless the industry values that kind of data and expertise. In fact, many academics, policy wonks, and other sociocultural experts in the AI and tech ethics space are noticing a disturbing trend of tech companies seeking their expertise and then disregarding it in favor of more technical work and workers...

Finally, though the librarian profession is often cited as one that might save Big Tech from its disinformation dilemmas, some in LIS (Library and Information Science) argue they collectively have a long way to go before they’re up to the task. Safiya Noble noted the profession’s (just over 83% white) “colorblind” ideology and sometimes troubling commitment to neutrality. This commitment, the book Knowledge Justice explains, leads to many librarians believing, “Since we serve everyone, we must allow materials, ideas, and values from everyone.” In other words, librarians often defend allowing racist, transphobic, and other harmful information to stand alongside other materials by saying they must entertain “all sides” and allow people to find their way to the “best” information. This is the exact same error platforms often make in allowing disinformation and abhorrent content to flourish online."

Wednesday, May 6, 2020

Seeking Ethics Through Narrative During COVID-19; PittWire, April 16, 2020

Amerigo Allegretto, PittWire; Seeking Ethics Through Narrative During COVID-19

"In her redesigned Literature and Medicine course, Uma Satyavolu challenges students to study both past and current writings to deal ethically with pandemics such as COVID-19.

“The moment I heard of this pandemic, I reached for Albert Camus’ ‘The Plague’ and Daniel Defoe’s ‘A Journal of the Plague Year,’” said Satyavolu, a lecturer in the University of Pittsburgh Department of English in the Kenneth P. Dietrich School of Arts and Sciences. “I often teach the latter in my Essay and Memoir class. These books help people understand how people dealt with disruptions and being isolated due to epidemics in previous generations, like we relatively are today,” with stay-at-home orders in place in much of the U.S.

The course has pivoted to having students analyze narratives surrounding COVID-19 to trace how medical knowledge is or is not transmitted during the pandemic, with particular attention to how some narratives gain authority and the status of “truth.” Their analyses will be posted in April on the Center for Bioethics and Health Law’s website, COVID-19 Narratives.

“What I want students to take away from this course is that we’re not just reading a few books. This is a form of important public engagement,” she said. “The course is based upon the idea that literature and the humanities serve as a bridge between ‘expert’ knowledge and the general public.”...

But Satyavolu isn’t stopping with the course; she also recently led the gathering of medical humanities materials to create COVID-19 Medical Humanities Resources, a webpage that went live in late March. The resource page contains suggested novels, essays, podcasts and films to analyze how stories taking place during epidemics and pandemics are told...

People interested in the ethical issues raised by the pandemic can visit the Center’s COVID-19 Ethics Resources webpage."

Tuesday, April 7, 2020

Artificial Intelligence and COVID-19: How Technology Can Understand, Track, and Improve Health Outcomes; Stanford University, April 2, 2020

Shana Lynch, Stanford University; Artificial Intelligence and COVID-19: How Technology Can Understand, Track, and Improve Health Outcomes


"On April 1, nearly 30 artificial intelligence (AI) researchers and experts met virtually to discuss ways AI can help understand COVID-19 and potentially mitigate the disease and developing public health crisis.

COVID-19 and AI: A Virtual Conference, hosted by the Stanford Institute for Human-Centered Artificial Intelligence, brought together Stanford faculty across medicine, computer science, and humanities; politicians, startup founders, and researchers from universities across the United States.

“In these trying times, I am especially inspired by the eagerness and diligence of scientists, clinicians, mathematicians, engineers, and social scientists around the world that are coming together to combat this pandemic,” Fei-Fei Li, Denning Family Co-Director of Stanford HAI, told the live audience.

Here are the top-line takeaways from the day. Visit HAI’s website for more in-depth coverage or watch the full conference video."

Saturday, January 12, 2019

University Data Science Programs Turn to Ethics and the Humanities; EdSurge, January 11, 2019

Sydney Johnson, EdSurge; University Data Science Programs Turn to Ethics and the Humanities

"These days a growing number of people are concerned with bringing more talk of ethics into technology. One question is whether that will bring change to data-science curricula...

“You don't just throw algorithms at data. You need to look at it, understand how it was collected, and ask yourself: ‘How can I be responsible with the data and the people from which it came?’” says Cathryn Carson, a UC Berkeley historian with a background in physics who steered the committee tasked with designing the schools’ data-science curriculum.

The new division goes a step further than adding an ethics course to an existing program. “Computer science has been trying to catch up with the ethical implications of what they are already doing,” Carson says. “Data science has this built in from the start, and you’re not trying to retrofit something to insert ethics—it's making it a part of the design principle.”"


Tuesday, October 16, 2018

The Most Important Skills for the 4th Industrial Revolution? Try Ethics and Philosophy.; EdSurge, October 6, 2018

Tony Wan, EdSurge; The Most Important Skills for the 4th Industrial Revolution? Try Ethics and Philosophy.

"[Patrick] Awuah [founder and president of Ashesi University College in Ghana], a MacArthur Fellowship recipient, continued: “If humans are designing machines to replace humans, versus helping them get work done, then that will change the structure of humanity to something that we have never seen. I’ve not read any history books where whole societies were not working. This is why it’s so important to have history and philosophy as part of the curriculum for somebody who's being educated as an engineer.”

In the United States, increased interest in technology and computer science-related careers has correlated with a precipitous drop in the proportion of humanities majors at colleges. For Goodman, that’s one of his biggest worries for the future. “We’re entering a time when schools are eliminating programs in humanities, and philosophy departments are becoming an endangered species.”

“We need to be educating people so they are productive and employable,” Awuah later added. “But we also need to be educating people so that they’re creating a society that is livable and social, where human interaction is important.”"

Thursday, August 2, 2018

The Expensive Education of Mark Zuckerberg and Silicon Valley; The New York Times, August 2, 2018

Kara Swisher, The New York Times; The Expensive Education of Mark Zuckerberg and Silicon Valley


"All these companies began with a gauzy credo to change the world. But they have done that in ways they did not imagine — by weaponizing pretty much everything that could be weaponized. They have mutated human communication, so that connecting people has too often become about pitting them against one another, and turbocharged that discord to an unprecedented and damaging volume.

They have weaponized social media. They have weaponized the First Amendment. They have weaponized civic discourse. And they have weaponized, most of all, politics...

Because what he never managed to grok then was that the company he created was destined to become a template for all of humanity, the digital reflection of masses of people across the globe. Including — and especially — the bad ones.

Was it because he was a computer major who left college early and did not attend enough humanities courses that might have alerted him to the uglier aspects of human nature? Maybe."