Thursday, July 20, 2023

Llama 2: why is Meta releasing open-source AI model and are there any risks?; The Guardian, July 20, 2023

The Guardian; Llama 2: why is Meta releasing open-source AI model and are there any risks?

"Are there concerns about open-source AI?

Tech professionals including Elon Musk, a co-founder of OpenAI, have expressed concerns about an AI arms race. Open-sourcing makes a powerful tool in this technology available to all.

Dame Wendy Hall, regius professor of computer science at the University of Southampton, told the Today programme there were questions over whether the tech industry could be trusted to self-regulate LLMs, with the problem looming even larger for open-source models. “It’s a bit like giving people a template to build a nuclear bomb,” she said.

Dr Andrew Rogoyski, of the Institute for People-Centred AI at the University of Surrey, said open-source models were difficult to regulate. “You can’t really regulate open source. You can regulate the repositories, like Github or Hugging Face, under local legislation,” he said.

“You can issue licence terms on the software that, if abused, could make the abusing company liable under various forms of legal redress. However, being open source means anyone can get their hands on it, so it doesn’t stop the wrong people grabbing the software, nor does it stop anyone from misusing it.”"

Editorial: The Supreme Court has ignored ethics oversight. Time for Congress to act; Los Angeles Times, July 20, 2023

THE TIMES EDITORIAL BOARD, Los Angeles Times; Editorial: The Supreme Court has ignored ethics oversight. Time for Congress to act

"Congress must begin the process of enacting reforms that the court itself has refused to undertake."

Wednesday, July 19, 2023

Stanford president to resign over concerns about integrity of his research; The Guardian, July 19, 2023

Guardian staff and agency, The Guardian; Stanford president to resign over concerns about integrity of his research

"The president of Stanford University, Marc Tessier-Lavigne, has announced he will resign after concerns about the integrity of his research."

‘It was as if my father were actually texting me’: grief in the age of AI; The Guardian, July 18, 2023

 Aimee Pearcy, The Guardian; ‘It was as if my father were actually texting me’: grief in the age of AI

"For all the advances in medicine and technology in recent centuries, the finality of death has never been in dispute. But over the past few months, there has been a surge in the number of people sharing their stories of using ChatGPT to help say goodbye to loved ones. They raise serious questions about the rights of the deceased, and what it means to die. Is Henle’s AI mother a version of the real person? Do we have the right to prevent AI from approximating our personalities after we’re gone? If the living feel comforted by the words of an AI bot impersonation – is that person in some way still alive?"

Tuesday, July 18, 2023

I Am Being Pushed Out of One of the Last Public Squares, the Library; The New York Times, July 17, 2023

 Emily St. James, The New York Times; I Am Being Pushed Out of One of the Last Public Squares, the Library

"The library’s predominant role in our culture is to provide a place to find the information or art you are seeking. But it has another role that’s equally important: a place where everyone is welcome.

When you step inside a library, you are confronted with a wealth of information some other people have curated for you, just by being there."

The Biden administration is without a confirmed ethics czar; Government Executive, July 18, 2023

Eric Katz, Government Executive; The Biden administration is without a confirmed ethics czar

"The lack of a confirmed director should not hinder the Office of Government Ethics' daily operations, but Biden would be smart to pick a new permanent leader soon to signal he is serious about ethics, former agency officials said. 

Emory Rounds, a President Trump appointee whose term carried over into Biden’s tenure, stepped down July 12 when his term expired. Shelley Finlayson, chief of staff and program counsel at the ethics agency, will fill in on an acting basis. 

The agency is responsible for overseeing the ethics plans at each executive branch department and collecting and approving disclosure documents from the federal officials who are required to submit them. It occasionally issues new ethics regulations to update policies and provides guidance and reminders to employees across government."

The Copyright Office Hears from Stakeholders on Important Issues with AI and Copyright; Library of Congress, Copyright: Creativity at Work, July 18, 2023

Nora Scheland, Library of Congress, Copyright: Creativity at Work; The Copyright Office Hears from Stakeholders on Important Issues with AI and Copyright

"Over the past two months, the Copyright Office hosted four public, virtual listening sessions on the use of artificial intelligence to generate creative works. The listening sessions focused on literary works, including print journalism and software; visual arts; audiovisual works, including video games; and music and sound recordings. Artists, creators, AI developers, researchers, lawyers, academics, and more shared their goals, concerns, and experiences related to the use and impact of generative AI.

As we look back on four vibrant sessions, we wanted to share some highlights for those who were not able to join live.

Register of Copyrights Shira Perlmutter opened the first listening session, on literary works, with a set of guiding questions for all four sessions: “How does current law apply? Should it be changed? [H]ow will the copyright community, from creators to users, be impacted?” She also reminded the audience that the Copyright Office plays a role “both in addressing practical concerns and in advising on policy.”

Following Register Perlmutter’s introductory remarks, participants spoke on two consecutive panels in which they articulated a wide-ranging set of perspectives.

The remaining listening sessions followed a similar format, and some included additional opportunities for comments without further discussion. Copyright Office staff moderated the listening sessions, and Associate Register of Copyrights and Director of Policy and International Affairs Maria Strong and General Counsel and Associate Register of Copyrights Suzy Wilson each made remarks.

At the final listening session, Register Perlmutter observed some of the themes of the series, including that

  • there is disagreement about whether, or under what circumstances, training generative AI on copyrighted works could be considered fair use;
  • there is considerable interest in developing methods to enhance transparency and education regarding how generative AI produces works, including the possibility of tracking relationships between ingested works and outputs, and understanding how assistive AI is used as a tool in the creation process; and
  • many stakeholders still have questions about the Office’s registration guidance for works containing AI-generated material and would like more details and more examples of how the Office will approach applications for such works.

Throughout all four listening sessions, the Office heard from a broad and diverse group of stakeholders, experts, and creatives, including some who do not typically participate in Office roundtables. Among the speakers were a professor of computer and information science; several Academy Award-nominated artists; attorneys for major private actors, including tech companies and music streaming platforms; representatives from various unions, guilds, and trade groups; and independent visual artists, filmmakers, and composers.

The listening sessions broke registration and attendance records for Copyright Office events. Over 4,100 people tuned in over the course of the four sessions. In her final remarks, Register Perlmutter thanked all the panelists for sharing their insights and the public for tuning in. “The Office appreciates the high level of public engagement with these listening sessions,” said Register Perlmutter. “This interest is of course a reflection of the astonishing potential of artificial intelligence, and the impact that it’s already having in our lives and on society as a whole.”

The feedback and comments provided to the Office during the listening sessions will help guide the next steps in the Office’s AI initiative. The Office is drafting a notice of inquiry, to be published in the Federal Register later this summer, which will solicit written comments from the public on a wide range of issues involving AI and copyright. Issues raised during the sessions will directly inform the questions asked in the notice.

If you missed the listening sessions initially, or want to listen back again, you can review all the materials from each of the sessions on our website, including the agenda, transcript, and full video recording.

Follow copyright.gov/ai on our website for updates, events, and to sign up for email notifications, including to learn when the notice of inquiry is published."

Monday, July 17, 2023

A former judge explains how to fix the Supreme Court’s ethics problem; The Washington Post, July 17, 2023

The Washington Post; A former judge explains how to fix the Supreme Court’s ethics problem

"Jeremy Fogel, the executive director of the Berkeley Judicial Institute, was a California state judge before being nominated by President Bill Clinton in 1998 to the U.S. District Court in San Francisco. As a judge, Fogel served for seven years on the judicial committee that reviews judges’ financial disclosure forms, and, from 2011 until 2018, as director of the Federal Judicial Center, where he helped oversee judicial education about ethics and disclosure rules. He spoke last week with columnist Ruth Marcus."...

"Let’s get to the Fogel solution. What is to be done?

I think it’s best if this comes from the court. You avoid a lot of separation-of-powers issues. I think it would be meaningful if the court could, on its own, say, we’re going to adopt the code of conduct, we’re going to do our best to follow that code of conduct. And then just as important as having the code is having some way to get impartial referees in terms of what you do. I think there needs to be some place they can go to get an impartial opinion. And it can’t be somebody who works for them. This is why I like the idea of retired judges. I could put a list of a dozen people together who would be on that list. They all have really deep experience in this area. Then [the justices] can say, “Okay, here’s our code of conduct, which we’re going to endeavor to follow. And in case of a complaint or in real doubts of our own about what to do, we’re going to refer to this group and get their opinion.” It’s not binding, it’s still up to you to do the right thing. But you have somebody to ask, and you’re not just dependent on your own counsel and your own view of what’s okay. It’s not a perfect solution, but I think it could move the ball."

AI learned from their work. Now they want compensation.; The Washington Post, July 16, 2023

The Washington Post; AI learned from their work. Now they want compensation.

"Artists say the livelihoods of millions of creative workers are at stake, especially because AI tools are already being used to replace some human-made work. Mass scraping of art, writing and movies from the web for AI training is a practice creators say they never considered or consented to.

But in public appearances and in responses to lawsuits, the AI companies have argued that the use of copyrighted works to train AI falls under fair use — a concept in copyright law that creates an exception if the material is changed in a “transformative” way."

Thousands of authors urge AI companies to stop using work without permission; Morning Edition, NPR, July 17, 2023

Morning Edition, NPR; Thousands of authors urge AI companies to stop using work without permission

"Thousands of writers including Nora Roberts, Viet Thanh Nguyen, Michael Chabon and Margaret Atwood have signed a letter asking artificial intelligence companies like OpenAI and Meta to stop using their work without permission or compensation."

As pandemic raged, global south lacked vaccines. Never again, researchers vow.; The Washington Post, July 16, 2023

Amy Maxmen, The Washington Post; As pandemic raged, global south lacked vaccines. Never again, researchers vow.

"Once it became clear that wealthy nations would help themselves to coronavirus vaccines long before poorer nations had access, researchers across Africa, Asia and South America banded together with the World Health Organization. Never again, they vowed, would they allow themselves to be at the mercy of the Western world while a deadly pathogen tore through their regions...

Called the mRNA vaccine technology transfer hub, a mouthful meant to reflect their intention to share mRNA technology, the initiative is distinct from the typical, competitive mode of drug development in which companies keep discoveries secret."

Arkansas Supreme Court closing Office of Ethics Counsel; Arkansas Democrat-Gazette, July 14, 2023

Will Langhorne, Arkansas Democrat-Gazette; Arkansas Supreme Court closing Office of Ethics Counsel

"The Arkansas Supreme Court on Friday announced plans to close an office it established two years ago to provide attorneys with guidance and responses to ethical questions.

In a succinct order, justices said the high court's Office of Ethics Counsel, which is supported by annual law license fees paid by attorneys, would no longer be funded as of Aug. 1 because of budgetary constraints within the Supreme Court's Bar of Arkansas account."

Saturday, July 15, 2023

‘Not for Machines to Harvest’: Data Revolts Break Out Against A.I.; The New York Times, July 15, 2023

Sheera Frenkel, The New York Times; ‘Not for Machines to Harvest’: Data Revolts Break Out Against A.I.

"At the heart of the rebellions is a newfound understanding that online information — stories, artwork, news articles, message board posts and photos — may have significant untapped value.

The new wave of A.I. — known as “generative A.I.” for the text, images and other content it generates — is built atop complex systems such as large language models, which are capable of producing humanlike prose. These models are trained on hoards of all kinds of data so they can answer people’s questions, mimic writing styles or churn out comedy and poetry...

“What’s happening here is a fundamental realignment of the value of data,” said Brandon Duderstadt, the founder and chief executive of Nomic, an A.I. company...

“The data rebellion that we’re seeing across the country is society’s way of pushing back against this idea that Big Tech is simply entitled to take any and all information from any source whatsoever, and make it their own,” said Ryan Clarkson, the founder of Clarkson...

Eric Goldman, a professor at Santa Clara University School of Law, said the lawsuit’s arguments were expansive and unlikely to be accepted by the court. But the wave of litigation is just beginning, he said, with a “second and third wave” coming that would define A.I.’s future."

Surprise, you just signed a contract! How hidden contracts took over the internet; Planet Money, NPR, July 14, 2023

Planet Money, NPR; Surprise, you just signed a contract! How hidden contracts took over the internet

"When you make an account online or install an app, you are probably entering into a legally enforceable contract. Even if you never signed anything. These days, we enter into these contracts so often, it can feel like no big deal."

Friday, July 14, 2023

A Federal Judge Asks: Does the Supreme Court Realize How Bad It Smells?; The New York Times, July 14, 2023

Michael Ponsor, The New York Times; A Federal Judge Asks: Does the Supreme Court Realize How Bad It Smells?

"All my judicial colleagues, whoever has appointed them, run into situations like these regularly, and I expect they have responded in just the same way. You don’t just stay inside the lines; you stay well inside the lines. This is not a matter of politics or judicial philosophy. It is ethics in the trenches.

The recent descriptions of the behavior of some of our justices and particularly their attempts to defend their conduct have not just raised my eyebrows; they’ve raised the whole top of my head. Lavish, no-cost vacations? Hypertechnical arguments about how a free private airplane flight is a kind of facility? A justice’s spouse prominently involved in advocating on issues before the court without the justice’s recusal? Repeated omissions in mandatory financial disclosure statements brushed under the rug as inadvertent? A justice’s taxpayer-financed staff reportedly helping to promote her books? Private school tuition for a justice’s family member covered by a wealthy benefactor? Wow.

Although the exact numbers fluctuate because of vacancies, the core of our federal judiciary comprises roughly 540 magistrate judges, 670 district judges, 180 appeals court judges and nine Supreme Court justices — fewer than 1,500 men and women in a country of more than 330 million people and 3.8 million square miles. Much depends on this small cohort’s acute sense of smell, its instinctive, uncompromising integrity and its appearance of integrity. If reports are true, some of our justices are, sadly, letting us down."

"Shadow libraries" are at the heart of the mounting copyright lawsuits against OpenAI; Quartz, July 10, 2023

 Michelle Cheng, Quartz; "Shadow libraries" are at the heart of the mounting copyright lawsuits against OpenAI

"However, there are clues about these two data sets. “Books1” is linked to Project Gutenberg (an online e-book library with over 60,000 titles), a popular dataset for AI researchers to train their data on due to the lack of copyright, the filing states. “Books2” is estimated to contain about 294,000 titles, it notes.

Most of the “internet-based books corpora” is likely to come from shadow library websites such as Library Genesis, Z-Library, Sci-Hub, and Bibliotik. The books aggregated by these sites are available in bulk via torrent websites, which are known for hosting copyrighted materials.

What exactly are shadow libraries?

Shadow libraries are online databases that provide access to millions of books and articles that are out of print, hard to obtain, and paywalled. Many of these databases, which began appearing online around 2008, originated in Russia, which has a long tradition of sharing forbidden books, according to the magazine Reason.

Soon enough, these libraries became popular with cash-strapped academics around the world thanks to the high cost of accessing scholarly journals—with some reportedly going for as much as $500 for an entirely open-access article.

These shadow libraries are also called “pirate libraries” because they often infringe on copyrighted work and cut into the publishing industry’s profits. A 2017 Nielsen and Digimarc study (pdf) found that pirated books were “depressing legitimate book sales by as much as 14%.”"

Inside the AP’s investigation into the ethics practices of the Supreme Court justices; AP, July 11, 2023

ERIC TUCKER AND BRIAN SLODYSKO, AP; Inside the AP’s investigation into the ethics practices of the Supreme Court justices

"An Associated Press examination of the ethics practices of the U.S. Supreme Court relied on documents obtained from more than 100 public records requests to public colleges, universities and other institutions that have hosted the justices over the past decade.

Here’s a look at how the reporting was done:...

Some institutions were less forthcoming. The AP went to the Illinois state attorney general to get a binding opinion directing the Chicago Public Library to produce documents related to a visit by Justice Sonia Sotomayor. Other schools, including the University of Arizona, have said their search for records remained ongoing after more than six months.

The AP did pay some schools for documents, including $350 to the University of Utah; $140 to Michigan State University; $159.24 to the University of Minnesota; and roughly $150 to the University of Mississippi.

But some schools responded to records requests with fee demands that the AP deemed unreasonable. The initial fee cited by the University of Georgia for processing two requests was $18,800.50, though it was later reduced after the AP narrowed its request."

Don’t downplay Sonia Sotomayor’s poor conduct. Fix it.; MSNBC, July 13, 2023

MSNBC; Don’t downplay Sonia Sotomayor’s poor conduct. Fix it.

"The response from some liberal commentators has been to downplay the matter. They correctly point out that Sotomayor’s impropriety is minor in comparison to recent ethics scandals involving fellow Justices Clarence Thomas and Samuel Alito, who have been lavished with vacations and gifts from billionaire GOP activists. But a purely comparative lens distracts from the problem. Once again we’re seeing that the Supreme Court has no guardrails against exploitation of power, whether large or small, liberal justice or conservative. And that makes ethics reform at the court even more necessary...

On yet another occasion, an aide said the number of books a library had purchased in advance of an event was “definitely not enough,” prompting library staff members to push back by saying it was a book publisher and bookseller matter...

Sotomayor’s recently revealed conduct isn’t even close to the worst of the things we’ve learned about how justices have inappropriately used their power. But an error is an error. That it’s a liberal Supreme Court justice doesn’t make me more inclined to dismiss it — it makes me less so: I expect more from people whose ideology should make them more vigilant against misuse of power. It’s a reminder that we need rules rather than blind trust to protect the public. The solution here is not to point the finger elsewhere, but to call with even more urgency for Supreme Court ethics reform through Congress."

Kavanaugh calls SCOTUS "government at its finest" amid ethics concerns; Axios, July 13, 2023

 Jacob Knutson, Axios; Kavanaugh calls SCOTUS "government at its finest" amid ethics concerns

"Justice Brett Kavanaugh lauded the operations of the current Supreme Court as "government at its finest" in public remarks on Thursday, despite ethical and political controversies facing the court, the Washington Post reports.

Why it matters: Kavanaugh's remarks at the annual 8th Circuit Judicial Conference in Bloomington, Minnesota, were the first public appearance of a justice since the court ended its latest highly charged session."

Thursday, July 13, 2023

RFK Jr. is building a presidential campaign around conspiracy theories; NPR, July 13, 2023

NPR; RFK Jr. is building a presidential campaign around conspiracy theories

"What's not up for debate for scientists, researchers and public health officials is Kennedy's long track record of undermining science and spreading dubious claims.

"He has an enormous platform. He is going to, over the next many months, do a series of town hall meetings where he will continue to put bad information out there that will cause people to make bad decisions for themselves and their families, again putting children at risk and causing children to suffer," Offit said. "Because it's always the most vulnerable among us who suffer our ignorance.""

‘You can do both’: experts seek ‘good AI’ while attempting to avoid the bad; The Guardian, July 7, 2023

The Guardian; ‘You can do both’: experts seek ‘good AI’ while attempting to avoid the bad

"“We know how to make AI that people want, but we don’t know how to make AI that people can trust,” said Marcus.

The question of how to imbue AI with human values is sometimes referred to as “the alignment problem”, although it is not a neatly defined computational puzzle that can be resolved and implemented in law. This means that the question of how to regulate AI is a massive, open-ended scientific question – on top of significant commercial, social and political interests that need to be navigated...

“Mass discrimination, the black box problem, data protection violations, large-scale unemployment and environmental harms – these are the actual existential risks,” said Prof Sandra Wachter of the University of Oxford, one of the speakers at the summit. “We need to focus on these issues right now and not get distracted by hypothetical risks. This is a disservice to the people who are already suffering under the impact of AI.”"

Are We Going Too Far By Allowing Generative AI To Control Robots, Worriedly Asks AI Ethics And AI Law; Forbes, July 10, 2023

Dr. Lance B. Eliot, Forbes; Are We Going Too Far By Allowing Generative AI To Control Robots, Worriedly Asks AI Ethics And AI Law

"What amount of due diligence is needed or required on the part of the user when it comes to generative AI and robots?

Nobody can as yet say for sure. Until we end up with legal cases and issues involving presumed harm, this is a gray area. For lawyers that want to get involved in AI and law, these are going to be an exciting and emerging set of legal challenges and legal puzzles that will undoubtedly arise as the use of generative AI becomes further ubiquitous and the advent of robots becomes affordable and practical in our daily lives.

You might also find of interest that some of the AI makers have contractual or licensing clauses that if you are using their generative AI and they get sued for something you did as a result of using their generative AI, you indemnify the AI maker and pledge to pay for their costs and expenses to fight the lawsuit, see my analysis at the link here. This could be daunting for you. Suppose that the house you were cooking in burns to the ground. The insurer sues the AI maker claiming that their generative AI was at fault. But, you agreed whether you know it or not to the indemnification clause, thus the AI maker comes to you and says you need to pay for their defense.

Ouch."

A.I. Could Solve Some of Humanity's Hardest Problems. It Already Has.; The New York Times, July 11, 2023

The Ezra Klein Show, The New York Times; A.I. Could Solve Some of Humanity's Hardest Problems. It Already Has.

"Since the release of ChatGPT, huge amounts of attention and funding have been directed toward chatbots. These A.I. systems are trained on copious amounts of human-generated data and designed to predict the next word in a given sentence. They are hilarious and eerie and at times dangerous.

But what if, instead of building A.I. systems that mimic humans, we built those systems to solve some of the most vexing problems facing humanity?"

Wednesday, July 12, 2023

Does Section 230 cover artificial intelligence? Experts are not sure; ABC7, July 11, 2023

GRAYCE MCCORMICK, ABC7; Does Section 230 cover artificial intelligence? Experts are not sure

"Burk said the decision over whether Section 230 covers generative AI boils down to whether the product is an informational product or a product with manufacturer liability...

It’s hard for us to imagine what social media would be like today without the protection of Section 230, or if it would have even been possible to develop without the risk of lawsuits.

On the other hand, there could be benefits to dictating how it’s developed, considering certain social media platforms’ documented harms.

“If you can identify discrete problems and you have an idea of the outcome that you would like to have or the outcomes you’re worried about, you can actually shape the development of technology into a socially desirable path,” Burk said. He cited products like automobiles and pharmaceuticals, which are now manufactured to be as safe as possible."

Three things to know about how the US Congress might regulate AI; MIT Technology Review, July 3, 2023

MIT Technology Review; Three things to know about how the US Congress might regulate AI

"Here are three key themes in all this chatter that you should know to help you understand where US AI legislation could be going."

Google hit with class-action lawsuit over AI data scraping; Reuters, July 11, 2023

Reuters; Google hit with class-action lawsuit over AI data scraping

"Alphabet's Google (GOOGL.O) was accused in a proposed class action lawsuit on Tuesday of misusing vast amounts of personal information and copyrighted material to train its artificial intelligence systems.

The complaint, filed in San Francisco federal court by eight individuals seeking to represent millions of internet users and copyright holders, said Google's unauthorized scraping of data from websites violated their privacy and property rights."

Inside the White-Hot Center of A.I. Doomerism; The New York Times, July 11, 2023

 Kevin Roose, The New York Times; Inside the White-Hot Center of A.I. Doomerism

"But the difference is that Anthropic’s employees aren’t just worried that their app will break, or that users won’t like it. They’re scared — at a deep, existential level — about the very idea of what they’re doing: building powerful A.I. models and releasing them into the hands of people, who might use them to do terrible and destructive things.

Many of them believe that A.I. models are rapidly approaching a level where they might be considered artificial general intelligence, or “A.G.I.,” the industry term for human-level machine intelligence. And they fear that if they’re not carefully controlled, these systems could take over and destroy us...

And lastly, he made a moral case for Anthropic’s decision to create powerful A.I. systems, in the form of a thought experiment.

“Imagine if everyone of good conscience said, ‘I don’t want to be involved in building A.I. systems at all,’” he said. “Then the only people who would be involved would be the people who ignored that dictum — who are just, like, ‘I’m just going to do whatever I want.’ That wouldn’t be good.”"

Tuesday, July 11, 2023

EU AI Act: first regulation on artificial intelligence; European Parliament News, June 14, 2023

European Parliament News; EU AI Act: first regulation on artificial intelligence

"As part of its digital strategy, the EU wants to regulate artificial intelligence (AI) to ensure better conditions for the development and use of this innovative technology. AI can create many benefits, such as better healthcare; safer and cleaner transport; more efficient manufacturing; and cheaper and more sustainable energy.

In April 2021, the European Commission proposed the first EU regulatory framework for AI. It says that AI systems that can be used in different applications are analysed and classified according to the risk they pose to users. The different risk levels will mean more or less regulation. Once approved, these will be the world’s first rules on AI."

Supreme Court stands by its guidelines after report raises new ethics questions; CNN, July 11, 2023

CNN; Supreme Court stands by its guidelines after report raises new ethics questions

"Gabe Roth, executive director of a group called Fix the Court, which advocates for more transparency at the court, said in a statement that the rules for Sotomayor appear different than rules that for other branches of government.

“A member of Congress would never be able to use her office to hawk her book,” Roth said. He noted that the AP story, unlike other recent reports focused on Justices Clarence Thomas and Samuel Alito, focuses on a liberal justice demonstrating that the court’s “ethical lapses stretch across partisan lines and that conservative and liberal justices alike need to clean up their act.”"

You can say no to a TSA face scan. But even a senator had trouble.; The Washington Post, July 11, 2023

The Washington Post; You can say no to a TSA face scan. But even a senator had trouble.

"Let’s discuss two topics:

  • TSA’s face scanning is supposed to be optional for us. Is it, really?
  • What are the potential benefits and drawbacks of the TSA’s use of facial recognition software?"