Tuesday, February 6, 2024

Cast as Criminals, America’s Librarians Rally to Their Own Defense; The New York Times, February 3, 2024

 Elizabeth Williamson, The New York Times; Cast as Criminals, America’s Librarians Rally to Their Own Defense

"As America’s libraries have become noisy and sometimes dangerous new battlegrounds in the nation’s culture wars, librarians like Ms. Neujahr and their allies have moved from the stacks to the front lines. People who normally preside over hushed sanctuaries are now battling groups that demand the mass removal of books and seek to control library governance. Last year, more than 150 bills in 35 states aimed to restrict access to library materials, and to punish library workers who do not comply."

Friday, February 2, 2024

European Publishers Praise New EU AI Law; Publishers Weekly, February 2, 2024

Ed Nawotka, Publishers Weekly; European Publishers Praise New EU AI Law

"The Federation of European Publishers (FEP) was quick to praise the passage of new legislation by the European Union that, among its provisions, requires "general purpose AI companies" to respect copyright law and have policies in place to this effect.

FEP officials called the EU Artificial Intelligence (AI) Act, which passed on February 2, the "world’s first concrete regulation of AI," and said that the legislation seeks to "ensure the ethical and human-centric development of this technology and prevent abusive or illegal practices," noting that the law also demands transparency about what data is being used in training the models."

Thursday, February 1, 2024

Read On: We're Distributing 1,500 Banned Books by Black Authors in Philly This February; Visit Philadelphia, January 31, 2024

Visit Philadelphia; Read On: We're Distributing 1,500 Banned Books by Black Authors in Philly This February

"According to Penn America, more than 30 states have banned certain books by Black authors — both fiction and non-fiction — or otherwise deemed them inappropriate.

During Black History Month and beyond, Philadelphia — the birthplace of American democracy — is making these stories accessible and available to both visitors and residents.

Visit Philadelphia has launched the Little Free(dom) Library initiative in partnership with Little Free Library and the Free Library of Philadelphia, providing resources on their site to help protect everyone’s right to read. The effort encourages visitors and residents to explore Black history and engage with Black narratives by borrowing a banned book by a Black author from one of 13 locations throughout the city. Among them: the Philadelphia Museum of Art, the Betsy Ross House, Franklin Square, Eastern State Penitentiary and the Johnson House Historic Site.

The initiative is launching with a dozen titles and 1,500 books in total. The selections include:

  • The 1619 Project: A New Origin Story by Nikole Hannah-Jones
  • All American Boys by Jason Reynolds
  • All Boys Aren’t Blue by George M. Johnson
  • Beloved by Toni Morrison
  • Between the World and Me by Ta-Nehisi Coates
  • The Fire Next Time by James Baldwin
  • Ghost Boys by Jewell Parker Rhodes
  • Hood Feminism: Notes from the Women That a Movement Forgot by Mikki Kendall
  • Roll of Thunder, Hear My Cry by Mildred D. Taylor
  • Stamped: Racism, Antiracism, and You by Jason Reynolds & Ibram X. Kendi
  • Their Eyes Were Watching God by Zora Neale Hurston
  • The Undefeated by Kwame Alexander"

The economy and ethics of AI training data; Marketplace.org, January 31, 2024

Matt Levin, Marketplace.org;  The economy and ethics of AI training data

"Maybe the only industry hotter than artificial intelligence right now? AI litigation. 

Just a sampling: Writer Michael Chabon is suing Meta. Getty Images is suing Stability AI. And both The New York Times and The Authors Guild have filed separate lawsuits against OpenAI and Microsoft. 

At the heart of these cases is the allegation that tech companies illegally used copyrighted works as part of their AI training data. 

For text-focused generative AI, there’s a good chance that some of that training data originated from one massive archive: Common Crawl.

“Common Crawl is the copy of the internet. It’s a 17-year archive of the internet. We make this freely available to researchers, academics and companies,” said Rich Skrenta, who heads the nonprofit Common Crawl Foundation."
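The archive Skrenta describes is reachable through a public index, which is how researchers typically check what a given crawl contains. Below is a minimal sketch, not drawn from the article, of querying that index from Python; the crawl label shown is an assumption and should be swapped for one listed at index.commoncrawl.org.

```python
# Minimal sketch (assumption: illustrative crawl label) of checking whether
# pages from a domain appear in a Common Crawl snapshot via the public CDX index.
import json
import urllib.parse
import urllib.request

CRAWL = "CC-MAIN-2023-50"  # assumption: replace with any published crawl label
INDEX = f"https://index.commoncrawl.org/{CRAWL}-index"

def lookup(url_pattern: str):
    """Return index records whose captured URL matches url_pattern."""
    query = urllib.parse.urlencode({"url": url_pattern, "output": "json"})
    with urllib.request.urlopen(f"{INDEX}?{query}") as resp:
        # The index responds with one JSON object per line.
        return [json.loads(line) for line in resp.read().splitlines() if line]

if __name__ == "__main__":
    for record in lookup("example.com/*")[:5]:
        print(record.get("timestamp"), record.get("url"), record.get("status"))
```

Whether a page appears in the index says nothing about how, or whether, it ended up in any particular model's training set; that is exactly what the lawsuits above are contesting.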

Wednesday, January 31, 2024

Lawyers viewed as more ethical than car salespeople and US lawmakers; ABA Journal, January 30, 2024

Debra Cassens Weiss, ABA Journal; Lawyers viewed as more ethical than car salespeople and US lawmakers

"Only 16% of Americans rate lawyers’ honesty and ethical standards as "high" or "very high," according to a Gallup poll taken in December.

The percentage has decreased since 2022, when 21% of Americans said lawyers had high or very high honesty and ethical standards, and since 2019, when the percentage was 22%, according to a Jan. 22 press release with results of Gallup’s 2023 Honesty and Ethics poll.

Lawyers did better than business executives, insurance salespeople and stockbrokers. Twelve percent of Americans viewed those occupations as having high or very high ethics and honesty. The percentage decreased to 8% for advertising practitioners, car salespeople and senators, and 6% for members of Congress."

California copyright-case leaves tattoo artists in limbo; Fox26 Houston, January 29, 2024

Fox26 Houston; California copyright-case leaves tattoo artists in limbo

"Patent and Copyright expert Joh Rizvi, known at The Patent Professor, says the California case never got to the issue of whether images reproduced in tattoos are fair to use as art and expression. 

"What I find is the more interesting question is, 'Is a tattoo different? Is this free speech?'" he wonders.

Fair Use has been the subject of countless lawsuits, and Rizvi says this one leaves artists in a legal gray area, with no precedent."

Tuesday, January 30, 2024

Florida’s New Advisory Ethics Opinion on Generative AI Hits the Mark; JDSupra, January 29, 2024

Ralph Artigliere , JDSupra; Florida’s New Advisory Ethics Opinion on Generative AI Hits the Mark

"As a former Florida trial lawyer and judge who appreciates emerging technology, I admit that I had more than a little concern when The Florida Bar announced it was working on a new ethics opinion on generative AI. Generative AI promises to provide monumental advantages to lawyers in their workflow, quality of work product, productivity, and time management and more. For clients, use of generative AI by their lawyers can mean better legal services delivered faster and with greater economy. In the area of eDiscovery, generative AI promises to surpass technology assisted review in helping manage the increasingly massive amounts of data.

Generative AI is new to the greater world, and certainly to busy lawyers who are not reading every blog post on AI. The internet and journals are afire with concerns over hallucinations, confidentiality, bias, and the like. I felt a new ethics opinion might throw a wet blanket on generative AI and discourage Florida lawyers from investigating the new technology.

Thankfully, my concerns did not become reality. The Florida Bar took a thorough look at the technology and the existing ethical guidance and law and applied existing guidelines and rules in a thorough and balanced fashion. This article briefly summarizes Opinion 24-1 and highlights some of its important features.

The Opinion

On January 19, 2024, The Florida Bar released Ethics Opinion 24-1 (“Opinion 24-1”) regarding the use of generative artificial intelligence (“AI”) in the practice of law. The Florida Bar and the State Bar of California are leaders in issuing ethical guidance on this issue. Opinion 24-1 draws from a solid background of ethics opinions and guidance in Florida and around the country and provides positive as well as cautionary statements regarding the emerging technologies. Overall, the guidance is well-placed and helpful for lawyers at a time when so many are weighing the use of generative AI technology in their law practices."

Lawyers weigh strength of copyright suit filed against BigLaw firm; Rhode Island Lawyers Weekly, January 29, 2024

Pat Murphy, Rhode Island Lawyers Weekly; Lawyers weigh strength of copyright suit filed against BigLaw firm

"Jerry Cohen, a Boston attorney who teaches IP law at Roger Williams University School of Law, called the suit “not so much a copyright case as it is a matter of professional responsibility and respect.”"

Where's the best place to find a robot cat? The library, of course; ZDNet, January 27, 2024

Chris Matyszczyk, ZDNet; Where's the best place to find a robot cat? The library, of course

"As Oregon Public Broadcasting (OPB) reported, the library's customers are involved in a festival of adoration when it comes to these three black-and-white robot felines...

Here's Manistee County Library in Michigan with a veritable array of robotic pets. Cats, dogs and even a bird...

Let's now drift to the Hastings Public Library, also in Michigan. There, just beneath Botley the Coding Robot is: "Robotic Cat. Coming January 2024."

Now you might be wondering what the rules are for going to your local public library and taking a robot cat home with you.

Helpfully, the Reading Public Library in Massachusetts offers some guidelines...

It seems, then, that America's libraries have become homes for robot cats. They bring peace and companionship to many. And that's a good thing."

Monday, January 29, 2024

From Our Fellows – From Automation to Agency: The Future of AI Ethics Education; Center for Democracy & Technology (CDT), January 29, 2024

Ashley Lee, Berkman Klein Center for Internet and Society Affiliate, Harvard University, and CDT Non-Resident Fellow alum, and Victoria Hsieh, Computer Science Undergraduate, Stanford University; Center for Democracy & Technology (CDT); From Our Fellows – From Automation to Agency: The Future of AI Ethics Education

"Disclaimer: The views expressed by CDT’s Non-Resident Fellows and any coauthors are their own and do not necessarily reflect the policy, position, or views of CDT...

AI ethics education can play a significant role in empowering students to collectively reimagine AI practices and processes, and contribute to a cultural transformation that prioritizes ethical and responsible AI."

Saturday, January 27, 2024

Library Copyright Alliance Principles for Copyright and Artificial Intelligence; Library Copyright Alliance (LCA), American Library Association (ALA), Association of Research Libraries (ARL), July 10, 2023

Library Copyright Alliance (LCA), American Library Association (ALA), Association of Research Libraries (ARL); Library Copyright Alliance Principles for Copyright and Artificial Intelligence

"The existing U.S. Copyright Act, as applied and interpreted by the Copyright Office and the courts, is fully capable at this time to address the intersection of copyright and AI without amendment.

  • Based on well-established precedent, the ingestion of copyrighted works to create large language models or other AI training databases generally is a fair use.

    • Because tens—if not hundreds—of millions of works are ingested to create an LLM, the contribution of any one work to the operation of the LLM is de minimis; accordingly, remuneration for ingestion is neither appropriate nor feasible.

    • Further, copyright owners can employ technical means such as the Robots Exclusion Protocol [see the sketch after this excerpt] to prevent their works from being used to train AIs.

  • If an AI produces a work that is substantially similar in protected expression to a work that was ingested by the AI, that new work infringes the copyright in the original work.

    • If the original work was registered prior to the infringement, the copyright owner of the original work can bring a copyright infringement action for statutory damages against the AI provider and the user who prompted the AI to produce the substantially similar work.

  • Applying traditional principles of human authorship, a work that is generated by an AI might be copyrightable if the prompts provided by the user sufficiently controlled the AI such that the resulting work as a whole constituted an original work of human authorship.

AI has the potential to disrupt many professions, not just individual creators. The response to this disruption (e.g., support for worker retraining through institutions such as community colleges and public libraries) should be developed on an economy-wide basis, and copyright law should not be treated as a means for addressing these broader societal challenges.

AI also has the potential to serve as a powerful tool in the hands of artists, enabling them to express their creativity in new and efficient ways, thereby furthering the objectives of the copyright system."
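The sub-point on the Robots Exclusion Protocol refers to the robots.txt file that well-behaved crawlers consult before fetching pages. The following is a minimal sketch, not part of the LCA document, of how a compliant crawler might honor it using Python's standard library; the site URL is a placeholder, while "GPTBot" and "CCBot" are the crawler tokens published by OpenAI and Common Crawl, respectively.

```python
# Minimal sketch of the Robots Exclusion Protocol from the crawler's side:
# read a site's robots.txt and skip any URL the owner has disallowed.
import urllib.robotparser

SITE = "https://example.com"  # assumption: placeholder site
USER_AGENT = "GPTBot"         # OpenAI's crawler token; "CCBot" is Common Crawl's

rp = urllib.robotparser.RobotFileParser()
rp.set_url(f"{SITE}/robots.txt")
rp.read()  # fetch and parse the site's robots.txt

for path in ("/", "/articles/some-essay.html"):
    allowed = rp.can_fetch(USER_AGENT, f"{SITE}{path}")
    print(f"{USER_AGENT} may fetch {path}: {allowed}")
```

A site owner opts out by adding a stanza such as "User-agent: GPTBot" followed by "Disallow: /" to robots.txt. Honoring those rules is voluntary on the crawler's side; the protocol is a technical opt-out signal, not a legal barrier.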

Training Generative AI Models on Copyrighted Works Is Fair Use; ARL Views, January 23, 2024

Katherine Klosek, Director of Information Policy and Federal Relations, Association of Research Libraries (ARL), and Marjory S. Blumenthal, Senior Policy Fellow, American Library Association (ALA) Office of Public Policy and Advocacy, ARL Views; Training Generative AI Models on Copyrighted Works Is Fair Use

"In a blog post about the case, OpenAI cites the Library Copyright Alliance (LCA) position that “based on well-established precedent, the ingestion of copyrighted works to create large language models or other AI training databases generally is a fair use.” LCA explained this position in our submission to the US Copyright Office notice of inquiry on copyright and AI, and in the LCA Principles for Copyright and AI.

LCA is not involved in any of the AI lawsuits. But as champions of fair use, free speech, and freedom of information, libraries have a stake in maintaining the balance of copyright law so that it is not used to block or restrict access to information. We drafted the principles on AI and copyright in response to efforts to amend copyright law to require licensing schemes for generative AI that could stunt the development of this technology, and undermine its utility to researchers, students, creators, and the public. The LCA principles hold that copyright law as applied and interpreted by the Copyright Office and the courts is flexible and robust enough to address issues of copyright and AI without amendment. The LCA principles also make the careful and critical distinction between input to train an LLM, and output—which could potentially be infringing if it is substantially similar to an original expressive work.

On the question of whether ingesting copyrighted works to train LLMs is fair use, LCA points to the history of courts applying the US Copyright Act to AI."

Richard Prince to Pay Photographers Who Sued Over Copyright; The New York Times, January 26, 2024

Matt Stevens, The New York Times; Richard Prince to Pay Photographers Who Sued Over Copyright

"The artist Richard Prince agreed to pay at least $650,000 to two photographers whose images he had incorporated in his own work, ending a long-running copyright dispute that had been closely monitored by the art world...

Brian Sexton, a lawyer for Prince, said the artist wanted to protect free expression and have copyright law catch up to changing technology...

Marriott said the judgments showed that copyright law still provided meaningful protection to creators and that the internet was not a copying free-for-all.

“There is not a fair use exception to copyright law that applies to the famous and another that applies to everyone else,” he said."

Artificial Intelligence Law - Intellectual Property Protection for your voice?; JDSupra, January 22, 2024

 Steve Vondran, JDSupra ; Artificial Intelligence Law - Intellectual Property Protection for your voice?

"With the advent of AI technology capable of replicating a person's voice and utilizing it for commercial purposes, several key legal issues are likely to emerge under California's right of publicity law. The right of publicity refers to an individual's right to control and profit from their own name, image, likeness, or voice.

Determining the extent of a person’s control over their own voice will likely become a contentious legal matter given the rise of AI technology. In 2024, with a mere prompt and a push of a button, a creator can generate highly accurate voice replicas, potentially allowing companies to utilize a person’s voice without their explicit permission, for example by using an AI-generated song in a video or podcast, or using it as a voice-over for a commercial project. This sounds like fun new technology, until you realize that in states like California, where a "right of publicity" law exists, a person’s voice can be a protectable asset, and one can sue others who wrongfully misuse it for commercial advertising purposes.

This blog will discuss a few new legal issues I see arising in our wonderful new digital age being fueled by the massive onset of Generative AI technology (which really just means you input prompts into an AI tool and it will generate art, text, images, music, etc.)."

Friday, January 26, 2024

‘Who Owns This Sentence?’ Increasingly, Who Knows?; The New York Times, January 24, 2024

 Alexandra Jacobs, The New York Times ; ‘Who Owns This Sentence?’ Increasingly, Who Knows?

"David Bellos and Alexandre Montagu’s surprisingly sprightly history “Who Owns This Sentence?” arrives with uncanny timing...

They sort out the difference between plagiarism, a matter of honor debated since ancient times (and a theme, tellingly, of many recent novels); copyright, a concern of modern law and, crucially, lucre (“the biggest money machine the world has ever seen”); and trademark. If I wanted a picture of Smokey Bear to run with this article, for instance — and I very much do — The New York Times would have to fork up."...

They themselves have a wry way with technical material; this is less Copyright for Dummies, like that endlessly extended, imitated and spoofed series, than for wits. Discouraged by their publisher from naming a chapter title after the Beatles’ “All You Need Is Love,” the authors deftly illustrate this “absurd” circumstance by only describing in close identifiable detail the band and the song."

A Stranger Bought a Set of Highly Personal Letters. Can I Call Him Out?; The Ethicist, The New York Times Magazine, January 25, 2024

Kwame Anthony Appiah, The Ethicist, The New York Times Magazine; A Stranger Bought a Set of Highly Personal Letters. Can I Call Him Out?

"From the Ethicist:

It was thoughtless, I agree, to sell off a cache of letters that included some that were intimate and came from living people. The thought of strangers’ digging through letters written in the spirit of love and friendship can be upsetting. That the person who has acquired these letters has failed to grasp this suggests a certain lack of empathy. But it doesn’t establish that he lacks a moral sense, because you don’t really have any idea what he plans to do with this material. 

And there are constraints on this. When you acquire letters, you don’t thereby acquire the copyright in those letters, and copyright protection typically lasts until 70 years after the author’s death. So he has to deal with the murky issue of what counts as the “fair use” of such intellectual property. There are also a few privacy torts that individuals can try to pursue in the courts (e.g., intrusion upon seclusion; public disclosure of private facts). Even though he isn’t a party to a covenant of confidentiality, as someone in A.A. is, it remains true that, as you imply, exposing details of the intimate lives of private people is generally wrong."

George Carlin Estate Sues Creators of AI-Generated Comedy Special in Key Lawsuit Over Stars’ Likenesses; The Hollywood Reporter, January 25, 2024

 Winston Cho, The Hollywood Reporter ; George Carlin Estate Sues Creators of AI-Generated Comedy Special in Key Lawsuit Over Stars’ Likenesses

"The complaint seeks a court order for immediate removal of the special, as well as unspecified damages. It’s among the first legal actions taken by the estate of a deceased celebrity for unlicensed use of their work and likeness to manufacture a new, AI-generated creation and was filed as Hollywood is sounding the alarm over utilization of AI to impersonate people without consent or compensation...

According to the complaint, the special was created through unauthorized use of Carlin’s copyrighted works.

At the start of the video, it’s explained that the AI program that created the special ingested five decades of Carlin’s original stand-up routines, which are owned by the comedian’s estate, as training materials, “thereby making unauthorized copies” of the copyrighted works...

If signed into law, the proposal, called the No AI Fraud Act, could curb a growing trend of individuals and businesses creating AI-recorded tracks using artists’ voices and deceptive ads in which it appears a performer is endorsing a product. In the absence of a federal right of publicity law, unions and trade groups in Hollywood have been lobbying for legislation requiring individuals’ consent to use their voice and likeness."

The Sleepy Copyright Office in the Middle of a High-Stakes Clash Over A.I.; The New York Times, January 25, 2024

  Cecilia Kang, The New York Times; The Sleepy Copyright Office in the Middle of a High-Stakes Clash Over A.I.

"For decades, the Copyright Office has been a small and sleepy office within the Library of Congress. Each year, the agency’s 450 employees register roughly half a million copyrights, the ownership rights for creative works, based on a two-centuries-old law.

In recent months, however, the office has suddenly found itself in the spotlight. Lobbyists for Microsoft, Google, and the music and news industries have asked to meet with Shira Perlmutter, the register of copyrights, and her staff. Thousands of artists, musicians and tech executives have written to the agency, and hundreds have asked to speak at listening sessions hosted by the office.

The attention stems from a first-of-its-kind review of copyright law that the Copyright Office is conducting in the age of artificial intelligence. The technology — which feeds off creative content — has upended traditional norms around copyright, which gives owners of books, movies and music the exclusive ability to distribute and copy their works.

The agency plans to put out three reports this year revealing its position on copyright law in relation to A.I. The reports are set to be hugely consequential, weighing heavily in courts as well as with lawmakers and regulators."

Wednesday, January 24, 2024

Ethics Ratings of Nearly All Professions Down in U.S.; Gallup, January 22, 2024

Megan Brenan and Jeffrey M. Jones, Gallup; Ethics Ratings of Nearly All Professions Down in U.S.

"Americans’ ratings of nearly all 23 professions measured in Gallup’s 2023 Honesty and Ethics poll are lower than they have been in recent years. Only one profession -- labor union leaders -- has not declined since 2019, yet a relatively low 25% rate their honesty and ethics as “very high” or “high.”

Nurses remain the most trusted profession, with 78% of U.S. adults currently believing nurses have high honesty and ethical standards. However, that is down seven percentage points from 2019 and 11 points from its peak in 2020.

At the other end of the spectrum, members of Congress, senators, car salespeople and advertising practitioners are viewed as the least ethical, with ratings in the single digits that have worsened or remained flat."

Is A.I. the Death of I.P.?; The New Yorker, January 15, 2024

 Louis Menand, The New Yorker ; Is A.I. the Death of I.P.?

"Intellectual property accounts for some or all of the wealth of at least half of the world’s fifty richest people, and it has been estimated to account for fifty-two per cent of the value of U.S. merchandise exports. I.P. is the new oil. Nations sitting on a lot of it are making money selling it to nations that have relatively little. It’s therefore in a country’s interest to protect the intellectual property of its businesses.

But every right is also a prohibition. My right of ownership of some piece of intellectual property bars everyone else from using that property without my consent. I.P. rights have an economic value but a social cost. Is that cost too high?

I.P. ownership comes in several legal varieties: copyrights, patents, design rights, publicity rights, and trademarks."

Wednesday, January 10, 2024

Addressing equity and ethics in artificial intelligence; American Psychological Association, January 8, 2024

 Zara Abrams, American Psychological Association; Addressing equity and ethics in artificial intelligence

"As artificial intelligence (AI) rapidly permeates our world, researchers and policymakers are scrambling to stay one step ahead. What are the potential harms of these new tools—and how can they be avoided?

“With any new technology, we always need to be thinking about what’s coming next. But AI is moving so fast that it’s difficult to grasp how significantly it’s going to change things,” said David Luxton, PhD, a clinical psychologist and an affiliate professor at the University of Washington’s School of Medicine who is part of a session at the upcoming 2024 Consumer Electronics Show (CES) on Harnessing the Power of AI Ethically.

Luxton and his colleagues dubbed recent AI advances “super-disruptive technology” because of their potential to profoundly alter society in unexpected ways. In addition to concerns about job displacement and manipulation, AI tools can cause unintended harm to individuals, relationships, and groups. Biased algorithms can promote discrimination or other forms of inaccurate decision-making that can cause systematic and potentially harmful errors; unequal access to AI can exacerbate inequality (Proceedings of the Stanford Existential Risk Conference 2023, 60–74). On the flip side, AI may also hold the potential to reduce unfairness in today’s world—if people can agree on what “fairness” means.

“There’s a lot of pushback against AI because it can promote bias, but humans have been promoting biases for a really long time,” said psychologist Rhoda Au, PhD, a professor of anatomy and neurobiology at the Boston University Chobanian & Avedisian School of Medicine who is also speaking at CES on harnessing AI ethically. “We can’t just be dismissive and say: ‘AI is good’ or ‘AI is bad.’ We need to embrace its complexity and understand that it’s going to be both.”"

"Stories Are Just Something That Can Be Eaten by an AI": Marvel Lashes Out at AI Content with a Mind-Blowing X-Men Twist; ScreenRant, January 9, 2024

Tristan Benns, ScreenRant; "Stories Are Just Something That Can Be Eaten by an AI": Marvel Lashes Out at AI Content with a Mind-Blowing X-Men Twist

"Realizing the folly of her actions, Righteous laments her weakness against Enigma as a creature of stories, saying that “Stories are just something that can be eaten by an A.I. to make it more powerful. The only good story is a story that has been entirely and totally consumed and exploited.”.

While this isn’t the mutants’ first battle against artificial intelligence, this pointed statement has some sobering real-world applications. Since the Krakoan Age began, it’s been clear mutantkind's greatest battle would be against the concept of artificial intelligence as the final evolution of “life” in the Marvel Universe. With entities like Nimrod and the Omega Sentinel steering the forces of Orchis and other enemies of the X-Men against the mutant nation, this conflict has been painted as the ultimate fight for survival for mutants. However, with Enigma’s ultimate triumph over even the power of storytelling, it is clear that the X-Men aren’t just facing a comic’s interpretation of artificial intelligence – they’re battling the death of imagination.

In this way, the X-Men’s ultimate battle parallels a very real-world problem that both fans and creators must confront: the act of true creation versus the effects of generative artificial intelligence."

Lifecycle of Copyright: 1928 Works in the Public Domain; Library of Congress Blogs: Copyright Creativity at Work, January 8, 2024

Alison Hall, Library of Congress Blogs: Copyright Creativity at Work; Lifecycle of Copyright: 1928 Works in the Public Domain

"This blog also includes contributions from Jessica Chinnadurai, attorney-advisor, and Rafael Franco, writer-editor intern in the Copyright Office.

Over the last several years, we have witnessed a new class of creative works entering the public domain in the United States each January 1. This year, a variety of works published in 1928, ranging from motion pictures to music to books, joined others in the public domain. The public domain has important historical and cultural benefits in the lifecycle of copyright...

Below are just a few of the historical and cultural works that entered the public domain in 2024."

Tuesday, January 9, 2024

How John Deere Hijacked Copyright Law To Keep You From Tinkering With Your Tractor; Reason, January 8, 2024

Reason; How John Deere Hijacked Copyright Law To Keep You From Tinkering With Your Tractor

"For nearly 25 years, Section 1201 has been hanging over the developers and distributors of tools that give users more control over the products they own. The ways in which John Deere and other corporations have used the copyright system is a glaring example of regulatory capture in action, highlighting the absurdity of a system where owning a product doesn't necessarily convey the right to fully control it. There are certainly circumstances where the manufacturers are justified in protecting their products from tampering, but such cases should be handled through warranty nullification and contract law, not through exorbitant fines and lengthy prison sentences."

Saturday, January 6, 2024

Addressing the epidemic of high drug prices; Harvard Law Today, January 5, 2024

Jeff Neal, Harvard Law Today; Addressing the epidemic of high drug prices

"The Biden administration is once again targeting high drug prices paid by Americans. This time, officials are focused on prescription medications developed with federal tax dollars. The United States government, through the National Institutes of Health (NIH), awards billions of dollars of research grants to university scientists each year to fund biomedical research, which is often patented. The universities in turn grant exclusive licenses to companies to produce and sell the resulting drugs to patients in need. But what happens if a drug company fails to make a medication available, or sets its price so high that it is out of reach for a significant percentage of patients?

To tackle this problem, the Biden administration recently released a “proposed framework” that specifies when and how the NIH can “march in” and award the rights to produce a patented drug to a third party if the patent licensee does not make it available to the public on “reasonable terms.” The plan is based on a provision included in the Bayh-Dole Act, a 1980 federal law which was designed to stimulate innovation by encouraging universities to obtain and license patents for inventions resulting from federally funded research.

According to Harvard Law School intellectual property expert Ruth Okediji LL.M. ’91, S.J.D. ’96, although the Biden administration’s proposed framework for using government march-in rights to lower drug costs is an important development, whether it will be successfully implemented and result in meaningful drug price reductions remains to be seen. Harvard Law Today recently spoke to Okediji, the Jeremiah Smith, Jr. Professor of Law and faculty director of Global Access in Action (GAiA) at the Berkman Klein Center, about the new proposal and the legal challenges it might face."

AI’s future could hinge on one thorny legal question; The Washington Post, January 4, 2024

The Washington Post; AI’s future could hinge on one thorny legal question

"Because the AI cases represent new terrain in copyright law, it is not clear how judges and juries will ultimately rule, several legal experts agreed...

“Anyone who’s predicting the outcome is taking a big risk here,” Gervais said...

Cornell’s Grimmelmann said AI copyright cases might ultimately hinge on the stories each side tells about how to weigh the technology’s harms and benefits.

“Look at all the lawsuits, and they’re trying to tell stories about how these are just plagiarism machines ripping off artists,” he said. “Look at the [AI firms’ responses], and they’re trying to tell stories about all the really interesting things these AIs can do that are genuinely new and exciting.”"

Wednesday, January 3, 2024