Showing posts with label computer science. Show all posts

Monday, December 4, 2023

Unmasking AI's Racism And Sexism; NPR, Fresh Air, November 28, 2023

 NPR, Fresh Air; Unmasking AI's Racism And Sexism

"Computer scientist and AI expert Joy Buolamwini warns that facial recognition technology is riddled with the biases of its creators. She is the author of Unmasking AI and founder of the Algorithmic Justice League. She coined the term "coded gaze," a cousin to the "white gaze" or "male gaze." She says, "This is ... about who has the power to shape technology and whose preferences and priorities are baked in — as well as also, sometimes, whose prejudices are baked in.""

Tuesday, April 7, 2020

Artificial Intelligence and COVID-19: How Technology Can Understand, Track, and Improve Health Outcomes; Stanford University, April 2, 2020

Shana Lynch, Stanford University; Artificial Intelligence and COVID-19: How Technology Can Understand, Track, and Improve Health Outcomes


"On April 1, nearly 30 artificial intelligence (AI) researchers and experts met virtually to discuss ways AI can help understand COVID-19 and potentially mitigate the disease and developing public health crisis.

COVID-19 and AI: A Virtual Conference, hosted by the Stanford Institute for Human-Centered Artificial Intelligence, brought together Stanford faculty across medicine, computer science, and humanities; politicians, startup founders, and researchers from universities across the United States.

“In these trying times, I am especially inspired by the eagerness and diligence of scientists, clinicians, mathematicians, engineers, and social scientists around the world that are coming together to combat this pandemic,” Fei-Fei Li, Denning Family Co-Director of Stanford HAI, told the live audience.

Here are the top-line takeaways from the day. Visit HAI’s website for more in-depth coverage or watch the full conference video."

Tuesday, April 16, 2019

Course organized by students tackles ethics in CS; The Brown Daily Herald, April 15, 2019

Janet Chang, The Brown Daily Herald; Course organized by students tackles ethics in CS


"Last spring, students in a new computer science social change course developed software tools for a disaster relief organization to teach refugee children about science and technology, a Chrome extension to filter hate speech on the internet and a mobile app to help doctors during patient visits.

Called CSCI 1951I: “CS for Social Change,” the course — now in its second iteration — was developed for computer science, design and engineering students to discuss and reflect on the social impact of their work while building practical software tools to help local and national partner nonprofits over the 15-week semester.

The course was initially conceived by Nikita Ramoji ’20, among others, who was a co-founder of CS for Social Change, a student organization that aims to add ethics education to college computer science departments. “The (general consensus) was that we were getting a really great computer science education, but we didn’t really have that social component,” she said."

Tuesday, February 26, 2019

Fixing Tech’s Ethics Problem Starts in the Classroom; The Nation, February 21, 2019

Stephanie Wykstra, The Nation; Fixing Tech’s Ethics Problem Starts in the Classroom

 

"Casey Fiesler, a faculty member in the Department of Information Science at the University of Colorado Boulder, said that a common model in engineering programs is a stand-alone ethics class, often taught towards the end of a program. But there’s increasingly a consensus among those teaching tech ethics that a better model is to discuss ethical issues alongside technical work. Evan Peck, a computer scientist at Bucknell University, writes that separating ethical from technical material means that students get practice “debating ethical dilemmas…but don’t get to practice formalizing those values into code.” This is particularly a problem, said Fiesler, if an ethics class is taught by someone from outside a student’s field, and the professors in their computer-science courses rarely mention ethical issues. On the other hand, classes focused squarely on the ethics of technology allow students to dig deeply into complicated questions. “I think the best solution is to do both…but if you can’t do both, incorporating [ethics material into regular coursework] is the best option,” Fiesler said."

 

Monday, January 28, 2019

Embedding ethics in computer science curriculum: Harvard initiative seen as a national model; Harvard, John A. Paulson School of Engineering and Applied Sciences, January 28, 2019

Paul Karoff, Harvard, John A. Paulson School of Engineering and Applied Sciences; Embedding ethics in computer science curriculum: Harvard initiative seen as a national model

"Barbara Grosz has a fantasy that every time a computer scientist logs on to write an algorithm or build a system, a message will flash across the screen that asks, “Have you thought about the ethical implications of what you’re doing?”
 
Until that day arrives, Grosz, the Higgins Professor of Natural Sciences at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), is working to instill in the next generation of computer scientists a mindset that considers the societal impact of their work, and the ethical reasoning and communications skills to do so.

“Ethics permeates the design of almost every computer system or algorithm that’s going out in the world,” Grosz said. “We want to educate our students to think not only about what systems they could build, but whether they should build those systems and how they should design those systems.”"

Saturday, January 12, 2019

University Data Science Programs Turn to Ethics and the Humanities; EdSurge, January 11, 2019

Sydney Johnson, EdSurge; University Data Science Programs Turn to Ethics and the Humanities 

 

"These days a growing number of people are concerned with bringing more talk of ethics into technology. One question is whether that will bring change to data-science curricula...

“You don't just throw algorithms at data. You need to look at it, understand how it was collected, and ask yourself: ‘How can I be responsible with the data and the people from which it came?’” says Cathryn Carson, a UC Berkeley historian with a background in physics who steered the committee tasked with designing the school’s data-science curriculum.

The new division goes a step further than adding an ethics course to an existing program. “Computer science has been trying to catch up with the ethical implications of what they are already doing,” Carson says. “Data science has this built in from the start, and you’re not trying to retrofit something to insert ethics—it's making it a part of the design principle.”"

 

Sunday, December 30, 2018

Colleges Grapple With Teaching the Technology and Ethics of A.I.; The New York Times, November 2, 2018

Alina Tugend, The New York Times; Colleges Grapple With Teaching the Technology and Ethics of A.I.


"At the University of Washington, a new class called “Intelligent Machinery, Identity and Ethics” is being taught this fall by a team leader at Google and the co-director of the university’s Computational Neuroscience program.

Daniel Grossman, a professor and deputy director of undergraduate studies at the university’s Paul G. Allen School of Computer Science and Engineering, explained the purpose this way:

The course “aims to get at the big ethical questions we’ll be facing, not just in the next year or two but in the next decade or two.”

David Danks, a professor of philosophy and psychology at Carnegie Mellon, just started teaching a class, “A.I., Society and Humanity.” The class is an outgrowth of faculty coming together over the past three years to create shared research projects, he said, because students need to learn from both those who are trained in the technology and those who are trained in asking ethical questions.

“The key is to make sure they have the opportunities to really explore the ways technology can have an impact — to think how this will affect people in poorer communities or how it can be abused,” he said."

Tuesday, October 16, 2018

The Most Important Skills for the 4th Industrial Revolution? Try Ethics and Philosophy.; EdSurge, October 6, 2018

Tony Wan, EdSurge; The Most Important Skills for the 4th Industrial Revolution? Try Ethics and Philosophy.

"[Patrick] Awuah [founder and president of Ashesi University College in Ghana], a MacArthur Fellowship recipient, continued: “If humans are designing machines to replace humans, versus helping them get work done, then that will change the structure of humanity to something that we have never seen. I’ve not read any history books where whole societies were not working. This is why it’s so important to have history and philosophy as part of the curriculum for somebody who's being educated as an engineer.”

In the United States, increased interest in technology and computer-science-related careers has correlated with a precipitous drop in the proportion of humanities majors at colleges. For Goodman, that’s one of his biggest worries for the future. “We’re entering a time when schools are eliminating programs in humanities, and philosophy departments are becoming an endangered species.”

“We need to be educating people so they are productive and employable,” Awuah later added. “But we also need to be educating people so that they’re creating a society that is livable and social, where human interaction is important.”"

Announcing a Competition for Ethics in Computer Science, with up to $3.5 Million in Prizes; Mozilla, October 10, 2018

Mozilla; Announcing a Competition for Ethics in Computer Science, with up to $3.5 Million in Prizes 

"With great code comes great responsibility.

Today, computer scientists wield tremendous power. The code they write can be used by billions of people, and influence everything from what news stories we read, to what personal data companies collect, to who gets parole, insurance or housing loans.

Software can empower democracy, heighten opportunity, and connect people continents away. But when it isn’t coupled with responsibility, the results can be drastic. In recent years, we’ve watched biased algorithms and broken recommendation engines radicalize users, promote racism, and spread misinformation.

That’s why Omidyar Network, Mozilla, Schmidt Futures, and Craig Newmark Philanthropies are launching the Responsible Computer Science Challenge: an ambitious initiative to integrate ethics and accountability into undergraduate computer science curricula and pedagogy at U.S. colleges and universities, with up to $3.5 million in prizes."

Thursday, August 30, 2018

Honoring All Expertise: Social Responsibility and Ethics in Tech: featuring Kathy Pham & Friends from the Berkman Klein Community; Berkman Klein Luncheon Series, Harvard University, April 17, 2018

[Video] Berkman Klein Luncheon Series, Harvard University; Honoring All Expertise: Social Responsibility and Ethics in Tech: featuring Kathy Pham & Friends from the Berkman Klein Community


"The Ethical Tech Working Group at the Berkman Klein Center will host a series of lighting [sic] talks exploring social responsibility and ethics in tech. Speakers will draw on their perspectives as computer scientists, critical race and gender scholars, designers, ethnographers, historians, lawyers, political scientists, and philosophers to share reflections on what it will take to build more publicly-accountable technologies and how to bridge diverse expertise from across industry and academia to get there."

[Kip Currier: One of the speakers in this video is Ben Green, Computer Science PhD Student, Harvard University. His talk is titled "Travails in CS Academia".]


Ben Green quote:

[8:46 in video] "What was particularly disturbing for me as I entered the [computer science] field was to see the actual dismissal of non-technical voices and non-technical perspectives in the field. 

I had one experience where I heard a fellow graduate student of mine scoff at the idea of a social scientist being an actual scientist. And I had several conversations with faculty members in the department where they told me that the work that I wanted to do that was socially- and policy-minded was not computer science and wasn't worth doing."

Sunday, April 1, 2018

Musk and Zuckerberg are fighting over whether we rule technology—or it rules us; Quartz, April 1, 2018

Michael J. Coren, Quartz; Musk and Zuckerberg are fighting over whether we rule technology—or it rules us

"Firmly in Zuckerberg’s camp are Google co-founder Larry Page, inventor and author Ray Kurzweil, and computer scientist Andrew Ng, a prominent figure in the artificial intelligence community who previously ran the artificial intelligence unit for the Chinese company Baidu. All three seem to share the philosophy that technological progress is almost always positive, on balance, and that hindering that progress is not just bad business, but morally wrong because it deprives society of those benefits.


Musk, alongside others such as Bill Gates, the late physicist Stephen Hawking, and venture investors such as Sam Altman and Fred Wilson, do not see all technological progress as an absolute good. For this reason, they’re open to regulation...


Yonatan Zunger, a former security and privacy engineer at Google, has compared software engineers’ power to that of “kids in a toy shop full of loaded AK-47’s.” It’s becoming increasingly clear how dangerous it is to consider safety and ethics elective, rather than foundational, to software design. “Computer science is a field which hasn’t yet encountered consequences,” he writes."

Wednesday, March 1, 2017

New student group tackles ethical issues in computer science; Stanford Daily, February 28, 2017

Josh Wagner, Stanford Daily; New student group tackles ethical issues in computer science


"Political science Professor Rob Reich, who serves as faculty director of the Stanford Center for Ethics and Society, said he was heartened by groups like EthiCS that seek to grapple with the human aspect of technology.
“If it’s anything like CS + Social Good, it’s just a welcome sign about how Stanford can combine a liberal arts education with a skill-based education,” said Reich.
Conversations like these are not restricted to the Stanford community. In early February, prominent artificial intelligence pioneers such as Elon Musk and Stephen Hawking endorsed a list of 23 principles, priorities and precautions that should guide the safe development of ethical artificial intelligence technologies."