Showing posts with label ethical questions.

Tuesday, November 26, 2024

We need to start wrestling with the ethics of AI agents; MIT Technology Review, November 26, 2024

James O'Donnell, MIT Technology Review; We need to start wrestling with the ethics of AI agents

"The first, called tool-based agents, can be coached using natural human language (rather than coding) to complete digital tasks for us. Anthropic released one such agent in October—the first from a major AI model-maker—that can translate instructions (“Fill in this form for me”) into actions on someone’s computer, moving the cursor to open a web browser, navigating to find data on relevant pages, and filling in a form using that data. Salesforce has released its own agent too, and OpenAI reportedly plans to release one in January. 

The other type of agent is called a simulation agent, and you can think of these as AI models designed to behave like human beings. The first people to work on creating these agents were social science researchers. They wanted to conduct studies that would be expensive, impractical, or unethical to do with real human subjects, so they used AI to simulate subjects instead. This trend particularly picked up with the publication of an oft-cited 2023 paper by Joon Sung Park, a PhD candidate at Stanford, and colleagues called “Generative Agents: Interactive Simulacra of Human Behavior.”... 

If such tools become cheap and easy to build, it will raise lots of new ethical concerns, but two in particular stand out. The first is that these agents could create even more personal, and even more harmful, deepfakes. Image generation tools have already made it simple to create nonconsensual pornography using a single image of a person, but this crisis will only deepen if it’s easy to replicate someone’s voice, preferences, and personality as well. (Park told me he and his team spent more than a year wrestling with ethical issues like this in their latest research project, engaging in many conversations with Stanford’s ethics board and drafting policies on how the participants could withdraw their data and contributions.) 

The second is the fundamental question of whether we deserve to know whether we’re talking to an agent or a human. If you complete an interview with an AI and submit samples of your voice to create an agent that sounds and responds like you, are your friends or coworkers entitled to know when they’re talking to it and not to you? On the other side, if you ring your cell service provider or doctor’s office and a cheery customer service agent answers the line, are you entitled to know whether you’re talking to an AI?

This future feels far off, but it isn’t. There’s a chance that when we get there, there will be even more pressing and pertinent ethical questions to ask. In the meantime, read more from my piece on AI agents here, and ponder how well you think an AI interviewer could get to know you in two hours."
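For readers less familiar with how these tool-based agents operate, the basic pattern is a plan-act loop: a model reads the instruction plus the current screen state, proposes one UI action, the action is executed, and the loop repeats until the model declares the task done. The sketch below illustrates that loop in Python; the Action vocabulary, the propose_action stub, and the form-filling example are illustrative assumptions, not Anthropic's or any other vendor's actual API.

```python
# Minimal sketch of a "tool-based" agent loop: a model turns a natural-language
# instruction into concrete UI actions (click, type, finish). The model call and
# the action vocabulary are illustrative assumptions, not any vendor's real API.
from dataclasses import dataclass

@dataclass
class Action:
    kind: str            # "click", "type", or "finish"
    target: str = ""     # e.g. a form field name
    text: str = ""       # text to type into the target

def propose_action(instruction: str, page_state: dict, history: list) -> Action:
    """Placeholder for the model call that plans the next UI step."""
    # A real agent would send `instruction`, a screenshot or DOM summary, and the
    # action history to a language model and parse its reply into an Action.
    if not history:
        return Action("click", target="name_field")
    if history[-1].kind == "click":
        return Action("type", target=history[-1].target, text="Jane Doe")
    return Action("finish")

def run_agent(instruction: str, page_state: dict, max_steps: int = 10) -> list:
    """Run the plan-act loop until the model declares the task finished."""
    history = []
    for _ in range(max_steps):
        action = propose_action(instruction, page_state, history)
        history.append(action)
        if action.kind == "finish":
            break
        # Here a real agent would drive the browser/OS (move cursor, type, etc.)
    return history

if __name__ == "__main__":
    steps = run_agent("Fill in this form for me", page_state={"fields": ["name_field"]})
    print([(a.kind, a.target, a.text) for a in steps])
```

The ethical questions in the excerpt sit around this loop rather than inside it: disclosure, consent, and whose data the agent is allowed to act on.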

Thursday, June 6, 2024

Architects Talking Ethics #3: I’m confused: Where can I get answers to the ethical questions that come up in my practice?; The Architect's Newspaper, June 3, 2024

The Architect's Newspaper; Architects Talking Ethics #3:

I’m confused: Where can I get answers to the ethical questions that come up in my practice?

"This is the third entry in Architects Talking Ethics, an advice column that intends to host a discussion of the values that architects embody or should embody. It aims to answer real-world ethical questions posed by architects, designers, students, and professors.

We, as the three initial authors of this column, think the profession is way behind in how it addresses ethics. We think architects should explore our own ethics with the breadth and depth that other fields have done for a long time...

Architectural practitioners sometimes confuse ordinary ethics or business ethics with professional ethics. Ordinary ethics considers how we all should treat one another, while business ethics deals with the conflicts that can arise when balancing your company’s interests and those of your employees against those of clients. Both of these are incredibly important. However, in the world of professional ethics, where “professional” indicates those licensed to perform defined activities by the state, the first consideration is one’s duty to the public. Architects, in other words, have fiduciary responsibilities to clients and employees, professional obligations to colleagues and the discipline, and, like all professions, an overriding responsibility to the public.

Our profession’s codes of ethics as outlined by the American Institute of Architects (AIA, which again regulates only those architects volunteering to be members of its organization), however, are less than clear about the order of those obligations."

Friday, August 25, 2023

Who owns your cells? Legacy of Henrietta Lacks raises ethical questions about profits from medical research; Cleveland.com, August 18, 2023

Cleveland.com; Who owns your cells? Legacy of Henrietta Lacks raises ethical questions about profits from medical research

"While the legal victory may have given the family some closure, it has raised concerns for bioethicists in Cleveland and elsewhere.

The case raises important questions about owning one’s own body: whether individuals are entitled to a share of the profits from medical discoveries derived from research on their own cells, organs and genetic material.

But it also offers a tremendous opportunity to not only acknowledge the ethical failures of the past and the seeds of mistrust they have sown, but to guide society toward building better, more trustworthy medical institutions, said Aaron Goldenberg, who directs the Bioethics Center for Community Health and Genomic Equity (CHANGE) at Case Western Reserve University."

Tuesday, July 4, 2023

Legitimacy Of 'Customer' In Supreme Court Gay Rights Case Raises Ethical, Legal Flags; AP via Huff Post, July 3, 2023

Alanna Durkin Richer and Colleen Slevin, AP via Huff Post; Legitimacy Of 'Customer' In Supreme Court Gay Rights Case Raises Ethical, Legal Flags

"A Christian graphic artist who the Supreme Court said can refuse to make wedding websites for gay couples pointed during her lawsuit to a request from a man named “Stewart” and his husband-to-be. The twist? Stewart says it never happened.

The revelation has raised questions about how Lorie Smith’s case was allowed to proceed all the way to the nation’s highest court with such an apparent misrepresentation and whether the state of Colorado, which lost the case last week, has any legal recourse...

COULD THE REVELATION IMPACT THE CASE NOW?

It’s highly unlikely. The would-be customer’s request was not the basis for Smith’s original lawsuit, nor was it cited by the high court as the reason for ruling in her favor. Legal standing, or the right to bring a lawsuit, generally requires the person bringing the case to show that they have suffered some sort of harm. But pre-enforcement challenges — like the one Smith brought — are allowed in certain cases if the person can show they face a credible threat of prosecution or sanctions unless they conform to the law.

The 10th U.S. Circuit Court of Appeals, which reviewed the case before the Supreme Court, found that Smith had standing to sue. That appeals court noted that Colorado had a history of past enforcement “against nearly identical conduct” and that the state declined to promise that it wouldn’t go after Smith if she violated the law."

Tuesday, June 27, 2023

Ethics in the digital era; The Times of Israel, June 27, 2023

The Times of Israel; Ethics in the digital era

"A new course offered by Dr. Jeremy Fogel at the Efi Arazi School of Computer Science presents fresh perspectives on issues that Computer Science students at Reichman University will deal with in their careers. According to Dr. Fogel, a lecturer in Jewish philosophy, “The role of an educational institution is not only to transmit information, but also to cultivate and encourage the development of ethical thinking amongst its students and give them the space to do so.”

Students are being asked to discuss moral issues that have arisen as a result of the Digital Revolution, using the viewpoints of great philosophers such as Plato, Socrates, Jean-Jacques Rousseau, etc. Dr. Fogel believes that analyzing current digital developments through the eyes of these philosophers might give students some insights about these developments. Since reality constantly changes with new initiatives and inventions, it has become very hard to explore their ethical outcomes...

Dr. Fogel explains that there’s an ethical component in every action we take in our lives, such as what we eat, where we work, etc. When our students develop their new application or software, they will have to ask themselves, “What are the moral and ethical issues that could arise by using this?” Dr. Fogel also says that “The students I have met, want to make the world a better place. I am not teaching them anything new; they already have these ethical questions in their minds. I am just giving them the tools and inspiration to try and answer them.”"

Sunday, February 6, 2022

No Way Home Foreshadows The Greatest Problem With The X-Men; ScreenRant, February 3, 2022

Thomas Bacon, ScreenRant; No Way Home Foreshadows The Greatest Problem With The X-Men

"Spider-Man is the only Avenger to date who has been a teenager in this shared universe. That's given his solo films a unique feel in the MCU, but it's also posed serious ethical questions about whether or not the Avengers should allow Spider-Man to join in with their superhero fights. It didn't take long for War Machine to pick up on this in Captain America: Civil War, with Tony Stark brushing the question of Spider-Man's age aside. "I don't know, I didn't carbon-date him," Iron Man defended himself. "He's on the young side." The question of Spider-Man's age surfaced again in Spider-Man: No Way Home, when Peter Parker's beloved Aunt May was accused of child endangerment because she had allowed him to act as a hero. "Child endangerment's a nasty rap," Agent Cleary accused her. "A boy was entrusted to you, and as his legal guardian - essentially his mother - you not only allowed him to endanger himself, but you actually encouraged it. Who does that?" It's true this was just a throwaway scene, and the ethical considerations weren't subsequently explored in greater depth - but the question is a chilling one nonetheless, and it has serious implications for the future of the MCU, particularly how it relates to the X-Men."

Thursday, October 28, 2021

This Program Can Give AI a Sense of Ethics—Sometimes; Wired, October 28, 2021

Wired; This Program Can Give AI a Sense of Ethics—Sometimes

"Frost says the debate around Delphi reflects a broader question that the tech industry is wrestling with—how to build technology responsibly. Too often, he says, when it comes to content moderation, misinformation, and algorithmic bias, companies try to wash their hands of the problem by arguing that all technology can be used for good and bad.

When it comes to ethics, “there’s no ground truth, and sometimes tech companies abdicate responsibility because there’s no ground truth,” Frost says. “The better approach is to try.”"


Sunday, December 30, 2018

Colleges Grapple With Teaching the Technology and Ethics of A.I.; The New York Times, November 2, 2018

Alina Tugend, The New York Times; Colleges Grapple With Teaching the Technology and Ethics of A.I.


"At the University of Washington, a new class called “Intelligent Machinery, Identity and Ethics,” is being taught this fall by a team leader at Google and the co-director of the university’s Computational Neuroscience program.

Daniel Grossman, a professor and deputy director of undergraduate studies at the university’s Paul G. Allen School of Computer Science and Engineering, explained the purpose this way:

The course “aims to get at the big ethical questions we’ll be facing, not just in the next year or two but in the next decade or two.”

David Danks, a professor of philosophy and psychology at Carnegie Mellon, just started teaching a class, “A.I., Society and Humanity.” The class is an outgrowth of faculty coming together over the past three years to create shared research projects, he said, because students need to learn from both those who are trained in the technology and those who are trained in asking ethical questions.

“The key is to make sure they have the opportunities to really explore the ways technology can have an impact — to think how this will affect people in poorer communities or how it can be abused,” he said."

Tuesday, July 31, 2018

Big Data Is Getting Bigger. So Are the Privacy and Ethical Questions.; The Chronicle of Higher Education, July 31, 2018

Goldie Blumenstyk, The Chronicle of Higher Education; Big Data Is Getting Bigger. So Are the Privacy and Ethical Questions.

"Big data is getting bigger. So are the privacy and ethical questions.

The next step in using “big data” for student success is upon us. It’s a little cool. And also kind of creepy.

This new approach goes beyond the tactics now used by hundreds of colleges, which depend on data collected from sources like classroom teaching platforms and student-information systems. It not only makes a technological leap; it also raises issues around ethics and privacy.

Here’s how it works: Whenever you log on to a wireless network with your cellphone or computer, you leave a digital footprint. Move from one building to another while staying on the same network, and that network knows how long you stayed and where you went. That data is collected continuously and automatically from the network’s various nodes.

Now, with the help of a company called Degree Analytics, a few colleges are beginning to use location data collected from students’ cellphones and laptops as they move around campus. Some colleges are using it to improve the kind of advice they might send to students, like a text-message reminder to go to class if they’ve been absent."
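Mechanically, what the excerpt describes reduces to aggregating Wi-Fi association logs (device, building, connect and disconnect times) into per-building dwell times, and then applying a rule such as "send a nudge if the student's device never showed up in the lecture hall." The sketch below shows that aggregation; the log fields, building names, and attendance threshold are hypothetical assumptions, not Degree Analytics' actual pipeline.

```python
# Minimal sketch of the aggregation described above: turning Wi-Fi association
# logs into per-building dwell times for a device. Field names, buildings, and
# the attendance rule are illustrative assumptions, not any vendor's system.
from collections import defaultdict
from datetime import datetime

# Each record: (device_id, building, connect_time, disconnect_time)
logs = [
    ("dev42", "Library",      datetime(2018, 7, 31, 9, 0),  datetime(2018, 7, 31, 10, 30)),
    ("dev42", "Science Hall", datetime(2018, 7, 31, 11, 0), datetime(2018, 7, 31, 11, 50)),
]

def dwell_minutes(records):
    """Sum the time spent per (device, building) across all association records."""
    totals = defaultdict(float)
    for device, building, start, end in records:
        totals[(device, building)] += (end - start).total_seconds() / 60
    return totals

def missed_class(totals, device, building, scheduled_minutes=50):
    """Hypothetical rule: flag a device that spent under half the class time in the building."""
    return totals.get((device, building), 0) < scheduled_minutes * 0.5

totals = dwell_minutes(logs)
print(totals)
print("Nudge to attend?", missed_class(totals, "dev42", "Lecture Hall B"))
```

The privacy question the column raises is visible even in this toy version: the same table that powers an attendance nudge is also a continuous location history.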

Sunday, April 1, 2018

The Tricky Ethics of the NFL's New Open Data Policy; Wired, March 29, 2018

Ian McMahan, Wired; The Tricky Ethics of the NFL's New Open Data Policy

"SINCE 2015, EVERY player in the National Football Leaguehas been part cyborg. Well, kind of: Embedded in their shoulder pads is an RFID chip that can measure speed, distance traveled, acceleration, and deceleration. Those chips broadcast movement information, accurate to within six inches, to electronic receivers in every stadium. Even the balls carry chips.

So far, that data has stayed within the walls of each individual team, helping players and coaches understand offensive and defensive patterns. But this week, the NFL’s competition committee made good on its intention to share data on all 22 players after every game—with all the teams.

That move will give competitors a greater understanding of player movement across the league. But it could also begin to change the essence of the game. Much of the challenge of sports is the ability to quickly process and react to information, an instinctual gift of great coaches and players. By stripping away some of the uncertainty of competition, data will shift who holds that analytical advantage—and introduce some new ethical questions."
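The metrics the excerpt lists (speed, distance traveled, acceleration, deceleration) are all derived from the same raw signal: timestamped position fixes from the shoulder-pad chips. A minimal sketch of that derivation is below; the sample data, units, and sampling rate are made up for illustration, not the NFL's actual feed format.

```python
# Minimal sketch of deriving the movement metrics described above (speed,
# distance, acceleration) from timestamped RFID position fixes. Sample data and
# units (yards, seconds) are illustrative assumptions, not the league's format.
import math

# Each fix: (time_s, x_yards, y_yards), assumed to be roughly evenly sampled
fixes = [(0.0, 10.0, 5.0), (0.1, 10.8, 5.1), (0.2, 11.7, 5.3), (0.3, 12.7, 5.6)]

def movement_metrics(samples):
    """Return total distance, per-interval speeds, and per-interval accelerations."""
    distance = 0.0
    speeds = []  # yards per second for each interval between fixes
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        step = math.hypot(x1 - x0, y1 - y0)
        distance += step
        speeds.append(step / (t1 - t0))
    # Acceleration: change in speed between consecutive intervals
    accelerations = [
        (v1 - v0) / (samples[i + 1][0] - samples[i][0])
        for i, (v0, v1) in enumerate(zip(speeds, speeds[1:]))
    ]
    return distance, speeds, accelerations

dist, spd, acc = movement_metrics(fixes)
print(f"distance: {dist:.2f} yd")
print("speeds (yd/s):", [round(s, 1) for s in spd])
print("accelerations (yd/s^2):", [round(a, 1) for a in acc])
```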

Saturday, March 24, 2018

Driverless cars raise so many ethical questions. Here are just a few of them.; San Diego Union-Tribune, March 23, 2018

Lawrence M. Hinman, San Diego Union-Tribune; Driverless cars raise so many ethical questions. Here are just a few of them.

"Even more troubling will be the algorithms themselves, even if the engineering works flawlessly. How are we going to program autonomous vehicles when they are faced with a choice among competing evils? Should they be programmed to harm or kill the smallest number of people, swerving to avoid hitting two people but unavoidably hitting one? (This is the famous “trolley problem” that has vexed philosophers and moral psychologists for over half a century.)

Should your car be programmed to avoid crashing into a group of schoolchildren, even if that means driving you off the side of a cliff? Most of us would opt for maximizing the number of lives saved, except when one of those lives belongs to us or our loved ones.

These are questions that take us to the heart of the moral life in a technological society. They are already part of a lively and nuanced discussion among philosophers, engineers, policy makers and technologists. It is a conversation to which the larger public should be invited.

The ethics of dealing with autonomous systems will be a central issue of the coming decades."
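Hinman's "smallest number of people" policy can be written down as a one-line cost minimization over candidate maneuvers, which makes its bluntness easy to see: whether the car sacrifices its occupant turns entirely on how the occupant's harm is weighted against bystanders'. The sketch below is a hypothetical illustration of that knob, not a real planning algorithm; the maneuvers and harm counts are invented.

```python
# Minimal sketch of the "minimize the number harmed" policy discussed above,
# applied to a trolley-style choice among maneuvers. Scenarios and harm counts
# are hypothetical; real systems reason under uncertainty, not clean integers.
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    pedestrians_harmed: int
    occupants_harmed: int

def least_harm(options, occupant_weight=1.0):
    """Pick the maneuver with the lowest weighted harm count.

    occupant_weight < 1 privileges bystanders over the occupant; > 1 does the
    opposite. That weight is exactly the ethical choice the column says the
    larger public should be invited to debate.
    """
    return min(options,
               key=lambda m: m.pedestrians_harmed + occupant_weight * m.occupants_harmed)

options = [
    Maneuver("stay in lane",         pedestrians_harmed=2, occupants_harmed=0),
    Maneuver("swerve onto sidewalk", pedestrians_harmed=1, occupants_harmed=0),
    Maneuver("swerve off the cliff", pedestrians_harmed=0, occupants_harmed=1),
]

print(least_harm(options).name)                       # "swerve onto sidewalk"
print(least_harm(options, occupant_weight=0.5).name)  # "swerve off the cliff"
```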