Showing posts with label ethical reasoning.

Tuesday, June 24, 2025

Jenkins Center for Virtue Ethics receives grant to advance love-based ethical framework; University of Notre Dame, June 23, 2025

Laura Moran Walton, University of Notre Dame; Jenkins Center for Virtue Ethics receives grant to advance love-based ethical framework

"The University of Notre Dame has received a $10 million grant from the John Templeton Foundation to support a project titled Love and Social Transformation: Empowering Scholars and Social Innovators to Develop the Love Ethic. Implementation of this grant, which is the largest Notre Dame has ever received from the Templeton Foundation, will be led by the Rev. John I. Jenkins, C.S.C., Center for Virtue Ethics, the locus for research and moral formation within the Institute for Ethics and the Common Good.

“We are deeply grateful to the Templeton Foundation for its generous support of this important work. By emphasizing the ethics of abundant love, Notre Dame’s Jenkins Center for Virtue Ethics has a critical role to play in contributing to contemporary ethics,” said University President Rev. Robert A. Dowd, C.S.C. “The Catholic tradition of virtue ethics, like those of other world religions, offers a richer, fuller understanding of hope to the world, and this is a most fitting topic for the Jenkins Center’s first major initiative.”

The Love and Social Transformation project will bring scholars, writers, nonprofit leaders and others together to advance a framework that captures the power, richness and applicability of the love ethic — a core component of many faith traditions throughout the world.

“In our fractious, uncertain time, there is an urgent need for serious reflection on an ethic of love,” said University President Emeritus Rev. John I. Jenkins, C.S.C. “Emerging from the great religious traditions, the call to love has been behind some of the most transformative and enduring advances in human history. I am grateful to the Templeton Foundation for giving Notre Dame this opportunity.”

Love-based ethical insights have powered some of the most important social movements of the past century, such as Mahatma Gandhi’s Satyagraha movement in India and Martin Luther King Jr.’s civil rights leadership in the United States. But in the 21st century, the more common approaches to ethical decision-making — especially in policy realms — focus instead on cost-benefit analysis.

“These frameworks neglect the dimensions of life that fit into the rich tradition of virtue ethics — moral touchpoints such as love, dignity and awe,” said Meghan Sullivan, the Wilsey Family College Professor of Philosophy, director of the Institute for Ethics and the Common Good and the Notre Dame Ethics Initiative, and principal investigator for the grant.

“In contrast, the love ethic has three components: It holds that a widespread, non-merit-based trait like dignity is what grounds moral significance for each one of us; it is built around principles that situate interpersonal love at the foundations of our ethical reasoning; and it suggests love-oriented policies on diverse social issues as well as a love-oriented way of life.”"

Tuesday, January 29, 2019

The unnatural ethics of AI could be its undoing; The Outline, January 29, 2019

The Outline; The unnatural ethics of AI could be its undoing

"When I used to teach philosophy at universities, I always resented having to cover the Trolley Problem, which struck me as everything the subject should not be: presenting an extreme situation, wildly detached from most dilemmas the students would normally face, in which our agency is unrealistically restricted, and using it as some sort of ideal model for ethical reasoning (the first model of ethical reasoning that many students will come across, no less). Ethics should be about things like the power structures we enter into at work, what relationships we decide to pursue, who we are or want to become — not this fringe-case intuition-pump nonsense.

But maybe I’m wrong. Because, if we believe tech gurus at least, the Trolley Problem is about to become of huge real-world importance. Human beings might not find themselves in all that many Trolley Problem-style scenarios over the course of their lives, but soon we're going to start seeing self-driving cars on our streets, and they're going to have to make these judgments all the time. Self-driving cars are potentially going to find themselves in all sorts of accident scenarios where the AI controlling them has to decide which human lives it ought to preserve. But in practice what this means is that human beings will have to grapple with the Trolley Problem — since they're going to be responsible for programming the AIs...

I'm much more sympathetic to the “AI is bad” line. We have little reason to trust that big tech companies (i.e. the people responsible for developing this technology) are doing it to help us, given how wildly their interests diverge from our own."

Monday, January 28, 2019

Embedding ethics in computer science curriculum: Harvard initiative seen as a national model; Harvard, John A. Paulson School of Engineering and Applied Sciences, January 28, 2019

Paul Karoff, Harvard, John A. Paulson School of Engineering and Applied Sciences; Embedding ethics in computer science curriculum: Harvard initiative seen as a national model

"Barbara Grosz has a fantasy that every time a computer scientist logs on to write an algorithm or build a system, a message will flash across the screen that asks, “Have you thought about the ethical implications of what you’re doing?”
 
Until that day arrives, Grosz, the Higgins Professor of Natural Sciences at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), is working to instill in the next generation of computer scientists a mindset that considers the societal impact of their work, and the ethical reasoning and communications skills to do so.

“Ethics permeates the design of almost every computer system or algorithm that’s going out in the world,” Grosz said. “We want to educate our students to think not only about what systems they could build, but whether they should build those systems and how they should design those systems.”"

Sunday, January 31, 2016

iFest 2016: Feb. 1, 2016 10 AM - 11 AM Workshop: "Ethics, the Great Dilemma, and Managing through Conflict"

iFest 2016: Feb. 1, 2016 Workshop: "Ethics, the Great Dilemma, and Managing through Conflict": "Monday, February 1, 10:00 - 11:00 AM
Workshop: "Ethics, the Great Dilemma, and Managing through Conflict"
Facilitators: Leona Mitchell, Visiting Professor of Practice and Former IBM Executive; Kip Currier, PhD, JD, Assistant Professor
3rd Floor Theatre, School of Information Sciences
Anyone whose professional path involves working in teams, managing others, serving a client, or being a client knows that conflicts can consume an inordinate amount of time and can be the most challenging barriers to a successful outcome. Join Leona Mitchell, professor of practice in the School of Information Sciences (with over a decade of senior leadership experience at IBM), and Kip Currier, PhD, JD, assistant professor at the iSchool at Pitt, as they share philosophies and strategies for identifying, managing, and resolving conflicts. These strategies are applicable to both classroom and work settings, and this session is open to all students at all levels."