Showing posts with label ethical questions.

Thursday, June 6, 2024

Architects Talking Ethics #3: I’m confused: Where can I get answers to the ethical questions that come up in my practice?; The Architect's Newspaper, June 3, 2024

The Architect's Newspaper; Architects Talking Ethics #3: I’m confused: Where can I get answers to the ethical questions that come up in my practice?

"This is the third entry in Architects Talking Ethics, an advice column that intends to host a discussion of the values that architects embody or should embody. It aims to answer real-world ethical questions posed by architects, designers, students, and professors.

We, as the three initial authors of this column, think the profession is way behind in how it addresses ethics. We think architects should explore our own ethics with the breadth and depth that other fields have done for a long time...

Architectural practitioners sometimes confuse ordinary ethics or business ethics with professional ethics. Ordinary ethics considers how we all should treat one another, while business ethics deals with the conflicts that can arise when balancing your company’s interests and those of your employees against those of clients. Both of these are incredibly important. However, in the world of professional ethics, where “professional” indicates those licensed to perform defined activities by the state, the first consideration is one’s duty to the public. Architects, in other words, have fiduciary responsibilities to clients and employees, professional obligations to colleagues and the discipline, and, like all professions, an overriding responsibility to the public.

Our profession’s codes of ethics as outlined by the American Institute of Architects (AIA, which again regulates only those architects volunteering to be members of its organization), however, are less than clear about the order of those obligations."

Friday, August 25, 2023

Who owns your cells? Legacy of Henrietta Lacks raises ethical questions about profits from medical research; Cleveland.com, August 18, 2023

Cleveland.com; Who owns your cells? Legacy of Henrietta Lacks raises ethical questions about profits from medical research

"While the legal victory may have given the family some closure, it has raised concerns for bioethicists in Cleveland and elsewhere.

The case raises important questions about owning one’s own body; whether individuals are entitled to a share of the profits from medical discoveries derived from research on their own cells, organs and genetic material.

But it also offers a tremendous opportunity to not only acknowledge the ethical failures of the past and the seeds of mistrust they have sown, but to guide society toward building better, more trustworthy medical institutions, said Aaron Goldenberg, who directs the Bioethics Center for Community Health and Genomic Equity (CHANGE) at Case Western Reserve University."

Tuesday, July 4, 2023

Legitimacy Of 'Customer' In Supreme Court Gay Rights Case Raises Ethical, Legal Flags; AP via Huff Post, July 3, 2023

Alanna Durkin Richer and Colleen Slevin, AP via Huff Post; Legitimacy Of 'Customer' In Supreme Court Gay Rights Case Raises Ethical, Legal Flags

"A Christian graphic artist who the Supreme Court said can refuse to make wedding websites for gay couples pointed during her lawsuit to a request from a man named “Stewart” and his husband-to-be. The twist? Stewart says it never happened.

The revelation has raised questions about how Lorie Smith’s case was allowed to proceed all the way to the nation’s highest court with such an apparent misrepresentation and whether the state of Colorado, which lost the case last week, has any legal recourse...

COULD THE REVELATION IMPACT THE CASE NOW?

It’s highly unlikely. The would-be customer’s request was not the basis for Smith’s original lawsuit, nor was it cited by the high court as the reason for ruling in her favor. Legal standing, or the right to bring a lawsuit, generally requires the person bringing the case to show that they have suffered some sort of harm. But pre-enforcement challenges — like the one Smith brought — are allowed in certain cases if the person can show they face a credible threat of prosecution or sanctions unless they conform to the law.

The 10th U.S. Circuit Court of Appeals, which reviewed the case before the Supreme Court, found that Smith had standing to sue. That appeals court noted that Colorado had a history of past enforcement “against nearly identical conduct” and that the state declined to promise that it wouldn’t go after Smith if she violated the law."

Tuesday, June 27, 2023

Ethics in the digital era; The Times of Israel, June 27, 2023

The Times of Israel; Ethics in the digital era

"A new course offered by Dr. Jeremy Fogel at the Efi Arazi School of Computer Science presents fresh perspectives on issues that Computer Science students at Reichman University will deal with in their careers. According to Dr. Fogel, a lecturer in Jewish philosophy, “The role of an educational institution is not only to transmit information, but also to cultivate and encourage the development of ethical thinking amongst its students and give them the space to do so.”

Students are being asked to discuss moral issues that have arisen as a result of the Digital Revolution, using the viewpoints of great philosophers such as Plato, Socrates, Jean-Jacques Rousseau, etc. Dr. Fogel believes that analyzing current digital developments through the eyes of these philosophers might give students some insights about these developments. Since reality constantly changes with new initiatives and inventions, it has become very hard to explore their ethical outcomes...

Dr. Fogel explains that there’s an ethical component in every action we take in our lives, such as what we eat, where we work, etc. When our students develop their new application or software, they will have to ask themselves, “What are the moral and ethical issues that could arise by using this?” Dr. Fogel also says that “The students I have met, want to make the world a better place. I am not teaching them anything new; they already have these ethical questions in their minds. I am just giving them the tools and inspiration to try and answer them.”"

Sunday, February 6, 2022

No Way Home Foreshadows The Greatest Problem With The X-Men; ScreenRant, February 3, 2022

Thomas Bacon, ScreenRant; No Way Home Foreshadows The Greatest Problem With The X-Men

"Spider-Man is the only Avenger to date who has been a teenager in this shared universe. That's given his solo films a unique feel in the MCU, but it's also posed serious ethical questions about whether or not the Avengers should allow Spider-Man to join in with their superhero fights. It didn't take long for War Machine to pick up on this in Captain America: Civil War, with Tony Stark brushing the question of Spider-Man's age aside. "I don't know, I didn't carbon-date him," Iron Man defended himself. "He's on the young side." The question of Spider-Man's age surfaced again in Spider-Man: No Way Home, when Peter Parker's beloved Aunt May was accused of child endangerment because she had allowed him to act as a hero. "Child endangerment's a nasty rap," Agent Cleary accused her. "A boy was entrusted to you, and as his legal guardian - essentially his mother - you not only allowed him to endanger himself, but you actually encouraged it. Who does that?" It's true this was just a throwaway scene, and the ethical considerations weren't subsequently explored in greater depth - but the question is a chilling one nonetheless, and it has serious implications for the future of the MCU, particularly how it relates to the X-Men."

Thursday, October 28, 2021

This Program Can Give AI a Sense of Ethics—Sometimes; Wired, October 28, 2021

Wired; This Program Can Give AI a Sense of Ethics—Sometimes

"Frost says the debate around Delphi reflects a broader question that the tech industry is wrestling with—how to build technology responsibly. Too often, he says, when it comes to content moderation, misinformation, and algorithmic bias, companies try to wash their hands of the problem by arguing that all technology can be used for good and bad.

When it comes to ethics, “there’s no ground truth, and sometimes tech companies abdicate responsibility because there’s no ground truth,” Frost says. “The better approach is to try.”"

 

Sunday, December 30, 2018

Colleges Grapple With Teaching the Technology and Ethics of A.I.; The New York Times, November 2, 2018

Alina Tugend, The New York Times; Colleges Grapple With Teaching the Technology and Ethics of A.I.

"At the University of Washington, a new class called “Intelligent Machinery, Identity and Ethics,” is being taught this fall by a team leader at Google and the co-director of the university’s Computational Neuroscience program.

Daniel Grossman, a professor and deputy director of undergraduate studies at the university’s Paul G. Allen School of Computer Science and Engineering, explained the purpose this way:

The course “aims to get at the big ethical questions we’ll be facing, not just in the next year or two but in the next decade or two.”

David Danks, a professor of philosophy and psychology at Carnegie Mellon, just started teaching a class, “A.I., Society and Humanity.” The class is an outgrowth of faculty coming together over the past three years to create shared research projects, he said, because students need to learn from both those who are trained in the technology and those who are trained in asking ethical questions.

“The key is to make sure they have the opportunities to really explore the ways technology can have an impact — to think how this will affect people in poorer communities or how it can be abused,” he said."

Tuesday, July 31, 2018

Big Data Is Getting Bigger. So Are the Privacy and Ethical Questions.; The Chronicle of Higher Education, July 31, 2018

Goldie Blumenstyk, The Chronicle of Higher Education; Big Data Is Getting Bigger. So Are the Privacy and Ethical Questions.

"Big data is getting bigger. So are the privacy and ethical questions.

The next step in using “big data” for student success is upon us. It’s a little cool. And also kind of creepy.

This new approach goes beyond the tactics now used by hundreds of colleges, which depend on data collected from sources like classroom teaching platforms and student-information systems. It not only makes a technological leap; it also raises issues around ethics and privacy.

Here’s how it works: Whenever you log on to a wireless network with your cellphone or computer, you leave a digital footprint. Move from one building to another while staying on the same network, and that network knows how long you stayed and where you went. That data is collected continuously and automatically from the network’s various nodes.

Now, with the help of a company called Degree Analytics, a few colleges are beginning to use location data collected from students’ cellphones and laptops as they move around campus. Some colleges are using it to improve the kind of advice they might send to students, like a text-message reminder to go to class if they’ve been absent."
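As a rough, purely illustrative sketch of the kind of aggregation the excerpt describes: campus networks log which access point a device associates with and when, and dwell times per building fall out of simple bookkeeping. The log format, field names, and numbers below are hypothetical and are not Degree Analytics' actual pipeline.

# Hypothetical sketch: turning raw Wi-Fi association logs into per-building
# dwell times, the kind of location data the excerpt above describes.
# The log format and field names are assumptions, not any vendor's real schema.
from collections import defaultdict
from datetime import datetime

# Each record: (device_id, building, time of association with an access point)
log = [
    ("device-42", "Library",      datetime(2018, 7, 31, 9, 0)),
    ("device-42", "Library",      datetime(2018, 7, 31, 10, 15)),
    ("device-42", "Science Hall", datetime(2018, 7, 31, 10, 20)),
    ("device-42", "Science Hall", datetime(2018, 7, 31, 11, 50)),
]

def dwell_times(records):
    """Sum the minutes each device spends in each building, based on
    consecutive associations with access points in the same building."""
    by_device = defaultdict(list)
    for device, building, ts in sorted(records, key=lambda r: (r[0], r[2])):
        by_device[device].append((building, ts))
    totals = defaultdict(float)
    for device, visits in by_device.items():
        for (b1, t1), (b2, t2) in zip(visits, visits[1:]):
            if b1 == b2:  # still on the same building's access points
                totals[(device, b1)] += (t2 - t1).total_seconds() / 60
    return dict(totals)

print(dwell_times(log))
# {('device-42', 'Library'): 75.0, ('device-42', 'Science Hall'): 90.0}

Even this toy version makes the privacy point: a few timestamps and building names are enough to reconstruct where a student was and for how long.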

Sunday, April 1, 2018

The Tricky Ethics of the NFL's New Open Data Policy; Wired, March 29, 2018

Ian McMahan, Wired; The Tricky Ethics of the NFL's New Open Data Policy

"SINCE 2015, EVERY player in the National Football Leaguehas been part cyborg. Well, kind of: Embedded in their shoulder pads is an RFID chip that can measure speed, distance traveled, acceleration, and deceleration. Those chips broadcast movement information, accurate to within six inches, to electronic receivers in every stadium. Even the balls carry chips.

So far, that data has stayed within the walls of each individual team, helping players and coaches understand offensive and defensive patterns. But this week, the NFL’s competition committee made good on its intention to share data on all 22 players after every game—with all the teams.

That move will give competitors a greater understanding of player movement across the league. But it could also begin to change the essence of the game. Much of the challenge of sports is the ability to quickly process and react to information, an instinctual gift of great coaches and players. By stripping away some of the uncertainty of competition, data will shift who holds that analytical advantage—and introduce some new ethical questions."
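The arithmetic behind the speed and acceleration figures the article mentions is simple once you have timestamped positions. The sketch below is only a hypothetical illustration of that arithmetic; the league's actual tracking pipeline and data format are not described in the excerpt.

# Hypothetical sketch: deriving speed and acceleration from timestamped
# (x, y) position samples for one player's chip. Values are invented.
import math

# (time in seconds, x in yards, y in yards)
samples = [(0.0, 10.0, 5.0), (0.1, 10.8, 5.0), (0.2, 11.8, 5.1), (0.3, 13.0, 5.2)]

def kinematics(points):
    """Return per-interval (time, speed) and (time, acceleration) lists."""
    speeds = []
    for (t0, x0, y0), (t1, x1, y1) in zip(points, points[1:]):
        dist = math.hypot(x1 - x0, y1 - y0)
        speeds.append((t1, dist / (t1 - t0)))       # yards per second
    accels = []
    for (t0, v0), (t1, v1) in zip(speeds, speeds[1:]):
        accels.append((t1, (v1 - v0) / (t1 - t0)))  # yards per second squared
    return speeds, accels

speeds, accels = kinematics(samples)
print(speeds)  # roughly 8, 10, and 12 yd/s over the three intervals
print(accels)  # positive values: the player is accelerating

Sharing this kind of derived data league-wide is exactly what shifts the analytical advantage the article describes.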

Saturday, March 24, 2018

Driverless cars raise so many ethical questions. Here are just a few of them.; San Diego Union-Tribune, March 23, 2018

Lawrence M. Hinman, San Diego Union-Tribune; Driverless cars raise so many ethical questions. Here are just a few of them.

"Even more troubling will be the algorithms themselves, even if the engineering works flawlessly. How are we going to program autonomous vehicles when they are faced with a choice among competing evils? Should they be programmed to harm or kill the smallest number of people, swerving to avoid hitting two people but unavoidably hitting one? (This is the famous “trolley problem” that has vexed philosophers and moral psychologists for over half a century.)

Should your car be programmed to avoid crashing into a group of schoolchildren, even if that means driving you off the side of a cliff? Most of us would opt for maximizing the number of lives saved, except when one of those lives belongs to us or our loved ones.

These are questions that take us to the heart of the moral life in a technological society. They are already part of a lively and nuanced discussion among philosophers, engineers, policy makers and technologists. It is a conversation to which the larger public should be invited.

The ethics of dealing with autonomous systems will be a central issue of the coming decades."
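To see how stark the "smallest number of people" rule looks when written down, here is a deliberately crude, purely hypothetical sketch. No real autonomous-vehicle planner works this way; the classes and numbers are invented only to show how much a bare utilitarian criterion leaves out.

# Hypothetical sketch of the crude "fewest casualties" rule the op-ed questions.
# The maneuvers and figures are invented for illustration.
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    expected_casualties: float  # estimated people harmed if this maneuver is taken
    occupant_risk: float        # probability the car's own occupants are harmed

def choose(maneuvers):
    """Pick the maneuver with the fewest expected casualties, ignoring
    whose casualties they are, which is the simplification the article flags."""
    return min(maneuvers, key=lambda m: m.expected_casualties)

options = [
    Maneuver("swerve toward two pedestrians", expected_casualties=2.0, occupant_risk=0.0),
    Maneuver("swerve toward one pedestrian",  expected_casualties=1.0, occupant_risk=0.0),
    Maneuver("drive off the cliff",           expected_casualties=0.9, occupant_risk=1.0),
]
print(choose(options).name)  # "drive off the cliff": the occupant is not special-cased

The one-line choice function is easy to write; deciding whether it encodes the values we actually hold is the hard part, and that is the conversation the column argues the public should join.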