
Friday, June 14, 2024

Ethical considerations for the age of non-governmental space exploration; Nature, June 11, 2024

Nature; Ethical considerations for the age of non-governmental space exploration

"Abstract

Mounting ambitions and capabilities for public and private, non-government sector crewed space exploration bring with them an increasingly diverse set of space travelers, raising new and nontrivial ethical, legal, and medical policy and practice concerns which are still relatively underexplored. In this piece, we lay out several pressing issues related to ethical considerations for selecting space travelers and conducting human subject research on them, especially in the context of non-governmental and commercial/private space operations."

Saturday, April 6, 2024

Where AI and property law intersect; Arizona State University (ASU) News, April 5, 2024

Dolores Tropiano, Arizona State University (ASU) News; Where AI and property law intersect

"Artificial intelligence is a powerful tool that has the potential to be used to revolutionize education, creativity, everyday life and more.

But as society begins to harness this technology and its many uses — especially in the field of generative AI — there are growing ethical and copyright concerns for both the creative industry and legal sector.

Tyson Winarski is a professor of practice with the Intellectual Property Law program in Arizona State University’s Sandra Day O’Connor College of Law. He teaches an AI and intellectual property module within the course Artificial Intelligence: Law, Ethics and Policy, taught by ASU Law Professor Gary Marchant.

“The course is extremely important for attorneys and law students,” Winarski said. “Generative AI is presenting huge issues in the area of intellectual property rights and copyrights, and we do not have definitive answers as Congress and the courts have not spoken on the issue yet.”"

Monday, May 1, 2023

Generative AI: Ethical, Legal, and Technical Questions; Markkula Center for Applied Ethics, Santa Clara University, Tuesday, May 16, 2023 12 Noon Pacific/3 PM Eastern

  

Join us May 16th at noon for an online panel discussion on ethical, legal, and technical questions related to generative AI.

Generative AI: Ethical, Legal, and Technical Questions

Noon to 1:00 p.m. Pacific
Tuesday, May 16, 2023

"As artists, composers, and other “content creators” and intellectual property owners use generative AI tools or decry their development, many legal and ethical issues arise. In this panel discussion, a copyright law expert, an AI researcher who is also a composer and music performer, and a multi-disciplinary visual artist (all of whom teach at Santa Clara University) will address some of those questions–from training data collection to fair use, impact on creativity and creative labor, the balancing of various rights, and our ability to assess and respond to fast-moving technologies."

Register to Attend the Webinar

Tuesday, December 20, 2022

Age Verification Online: Ethical, Legal, and Technical Considerations; Markkula Center for Applied Ethics, Santa Clara University, December 2022

Irina Raicu, Markkula Center for Applied Ethics, Santa Clara University; Age Verification Online: Ethical, Legal, and Technical Considerations

"On December 1st, as part of the “IT, Ethics, and Law” lecture series (co-sponsored by MCAE and the High Tech Law Institute), I moderated an online panel; the title of our event was “Determining Users’ Ages Online: Ethical, Legal, and Technical Considerations.” The panelists were Eric Goldman (from Santa Clara University’s School of Law), Jennifer King (from the Stanford University Institute for Human-Centered Artificial Intelligence), and Sarah Krehbiel (from SCU’s Computer Science department). You can now watch the recording of that conversation at https://www.youtube.com/watch?v=DPqX2I98eL0"

Wednesday, March 9, 2022

An Upcoming Webinar on AR/VR, Ethics, and Law; Markkula Center for Applied Ethics at Santa Clara University, Thursday, March 10, 2022 12 Noon PST/3 PM EST

Irina Raicu, Markkula Center for Applied Ethics at Santa Clara University; An Upcoming Webinar on AR/VR, Ethics, and Law

Thursday, March 10, 2022 12 Noon PST/3 PM EST

"Irina Raicu is the director of the Internet Ethics program (@IEthics) at the Markkula Center for Applied Ethics. Views are her own.

On Thursday, March 10, we will be hosting an online webinar titled “Virtual Reality, Real Virtues, and Augmented Norms and Laws.” It’s not too late to register and add your questions to this conversation about ethical and legal issues associated with the widening adoption of VR and AR technology!

The panelists presenting will be attorney Brittan Heller—who advises companies on issues such as privacy, content moderation, online harassment, and civic engagement; was the founding director of the Anti-Defamation League’s Center on Technology and Society; and previously worked for the International Criminal Court and the U.S. Department of Justice’s Criminal Division—and Erick Ramirez, who is an associate professor in Santa Clara University’s Philosophy department, a co-creator of VR analogs of well-known philosophical thought experiments, and the author of a recently published textbook titled The Ethics of Virtual and Augmented Reality: Building Worlds.

The event is part of Santa Clara University’s “IT, Ethics, and Law” lecture series, which is co-sponsored by the Markkula Center for Applied Ethics and the High Tech Law Institute. (Please join our Internet/Technology ethics mailing list if you’d like to be notified of future events in the series.)

For a preview of some of the issues likely to be mentioned in our conversation, see Heller’s 2020 report titled “Reimagining Reality: Human Rights and Immersive Technology,” and Ramirez’s “It’s Dangerous to Think Virtual Reality Is an Empathy Machine.”

Whether or not VR and AR can serve as “empathy machines,” constitute “the future of corporate training,” create new “kids' issues,” require new standards of privacy protection, or do all of the above and more, they are bound to require careful analysis and ongoing conversations."

Wednesday, May 12, 2021

Want to Get Along With Robots? Pretend They’re Animals; Wired, April 19, 2021

Wired; Want to Get Along With Robots? Pretend They’re Animals

Robotics ethicist Kate Darling surveys our history with animals—in work, war, and companionship—to show how we might develop similar relationships with robots.


"WIRED: That brings us nicely to the idea of agency. One of my favorite moments in human history was when animals were put on trial—like regularly.

KD: Wait. You liked this?

WIRED: I mean, it's horrifying. But I just think that it's a fascinating period in legal history. So why do we ascribe this agency to animals that have no such thing? And why might we do the same with robots?

KD: It's so bizarre and fascinating—and seems so ridiculous to us now—but for hundreds of years of human history in the Middle Ages, we put animals on trial for the crimes they committed. So whether that was a pig that chewed a child's ear off, or whether that was a plague of locusts or rats that destroyed crops, there were actual trials that progressed the same way that a trial for a human would progress, with defense attorneys and a jury and summoning the animals to court. Some were not found guilty, and some were sentenced to death. It’s this idea that animals should be held accountable, or be expected to abide by our morals or rules. Now we don't believe that that makes any sense, the same way that we wouldn't hold a small child accountable for everything.

In a lot of the early legal conversation around responsibility in robotics, it seems that we're doing something a little bit similar. And, this is a little tongue in cheek—but also not really—because the solutions that people are proposing for robots causing harm are getting a little bit too close to assigning too much agency to the robots. There's this idea that, “Oh, because nobody could anticipate this harm, how are we going to hold people accountable? We have to hold the robot itself accountable.” Whether that's by creating some sort of legal entity, like a corporation, where the robot has its own rights and responsibilities, or whether that's by programming the robot to obey our rules and morals—which we kind of know from the field of machine ethics is not really possible or feasible, at least not anytime soon."

Tuesday, January 19, 2021

Why Ethics Matter For Social Media, Silicon Valley And Every Tech Industry Leader; Forbes, January 14, 2021

Rob Dube, Forbes; Why Ethics Matter For Social Media, Silicon Valley And Every Tech Industry Leader

"At one time, the idea of technology and social media significantly influencing society and politics would’ve sounded crazy. Now, with technology so embedded into the fabric of our lives, it’s a reality that raises legitimate questions about Silicon Valley’s ethical responsibility. 

Should tech companies step in to create and enforce guidelines within their platforms if they believe such policies would help the greater good? Or should leaders allow their technology to evolve organically without filters or manipulation? 

One authority on this fascinating topic is Casey Fiesler—a researcher, assistant professor at the University of Colorado Boulder, and expert on tech ethics. She is also a graduate of Vanderbilt Law School. There, she found a passion for the intersections between law, ethics, and technology."

Thursday, April 18, 2019

Ethics Alone Can’t Fix Big Tech: Ethics can provide blueprints for good tech, but it can’t implement them; Slate, April 17, 2019

Daniel Susser, Slate; Ethics Alone Can’t Fix Big Tech: Ethics can provide blueprints for good tech, but it can’t implement them

"Ethics requires more than rote compliance. And it’s important to remember that industry can reduce any strategy to theater. Simply focusing on law and policy won’t solve these problems, since they are equally (if not more) susceptible to watering down. Many are rightly excited about new proposals for state and federal privacy legislation, and for laws constraining facial recognition technology, but we’re already seeing industry lobbying to strip them of their most meaningful provisions. More importantly, law and policy evolve too slowly to keep up with the latest challenges technology throws at us, as is evident from the fact that most existing federal privacy legislation is older than the internet.

The way forward is to see these strategies as complementary, each offering distinctive and necessary tools for steering new and emerging technologies toward shared ends. The task is fitting them together.

By its very nature ethics is idealistic. The purpose of ethical reflection is to understand how we ought to live—which principles should drive us and which rules should constrain us. However, it is more or less indifferent to the vagaries of market forces and political winds. To oversimplify: Ethics can provide blueprints for good tech, but it can’t implement them. In contrast, law and policy are creatures of the here and now. They aim to shape the future, but they are subject to the brute realities—social, political, economic, historical—from which they emerge. What they lack in idealism, though, is made up for in effectiveness. Unlike ethics, law and policy are backed by the coercive force of the state."

Wednesday, March 13, 2019

College cheating scandal is the tip of the iceberg; CNN, March 12, 2019

David Perry, CNN; College cheating scandal is the tip of the iceberg

"We're not talking about donating a building, we're talking about fraud," said Andrew Lelling, the US Attorney for Massachusetts, as he announced indictments in a massive scheme alleging that celebrities and other wealthy individuals used cheating, bribes, and lies to get their kids into elite colleges.

The behavior described in this alleged fraud should be punished. But on a broader and more basic level, the case also sheds light on deep inequities in our college admissions system. Because if someone can get their kid into Harvard by buying a building, let alone by committing any of the alleged acts emerging from this case, the scandal isn't just what's illegal, but what's legal as well."

Thursday, January 24, 2019

I Found $90 in the Subway. Is It Yours?; The New York Times, January 24, 2019

Niraj Chokshi, The New York Times; I Found $90 in the Subway. Is It Yours?

"As I got off a train in Manhattan on Wednesday, I paid little attention to a flutter out of the corner of my eye on the subway. Then another passenger told me that I had dropped some money.

“That isn’t mine,” I told her as I glanced at what turned out to be $90 on the ground.

I realized the flutter had been the money falling out of the coat of a man standing near me who had just stepped off the train.

The doors were about to close, and no one was acting, so I grabbed the cash and left the train. But I was too late. The man had disappeared into the crowd. I waited a few minutes to see if he would return, but he was long gone. I tried to find a transit employee or police officer, but none were in sight.

I was running late, so I left. But now what? What are you supposed to do with money that isn’t yours?"

Monday, August 22, 2016

The Difference between Copyright Infringement and Plagiarism—and Why It Matters; Library Journal, August 17, 2016

Rick Anderson, Library Journal; The Difference between Copyright Infringement and Plagiarism—and Why It Matters

"TELLING THE DIFFERENCE

If you were to take Alice’s Adventures in Wonderland, change the title and the characters’ names, and pass it off as your original work, that would be plagiarism. However, there would be no copyright infringement, because Alice’s Adventures in Wonderland is in the public domain and therefore no longer subject to copyright.

On the other hand, if you were to take 50 Shades of Grey—a work currently in copyright—change the title and the characters’ names, and pass it off as your original work, that would constitute both plagiarism and copyright infringement. Stealing the author’s work in this way and selling an unauthorized derivative of it would not only be unethical; it would also be illegal.

Under U.S. law, it might be an example of stealing that rises to the level of a felony punishable by imprisonment, depending on its demonstrable financial impact on the legitimate rights holder."