Saturday, October 5, 2019

An Open Source License That Requires Users to Do No Harm; Wired, October 4, 2019

Klint Finley, Wired; An Open Source License That Requires Users to Do No Harm

Open source software can generally be freely copied and reused. One developer wants to impose ethical constraints on the practice.

"Increasingly, some developers are calling on their employers and the government to stop using their work in ways they believe are unethical...

Coraline Ada Ehmke wants to give her fellow developers more control over how their software is used. Software released under her new "Hippocratic License" can be shared and modified for almost any purpose, with one big exception: it may not be used by "individuals, corporations, governments, or other groups for systems or activities that actively and knowingly endanger, harm, or otherwise threaten the physical, mental, economic, or general well-being of individuals or groups in violation of the United Nations Universal Declaration of Human Rights."

Defining what it means to do harm is inherently contentious, but Ehmke hopes that tying the license to existing international standards will reduce the uncertainty. The declaration of human rights "is a document that's 70 years old and is pretty well established and accepted for its definition of harm and what violating human rights really means," she says."

Friday, October 4, 2019

Gatekeeping Is Not The Same As Censorship; Forbes, August 22, 2019

Kalev Leetaru, Forbes; Gatekeeping Is Not The Same As Censorship

"With each new effort by social media companies to reign in the deluge of digital falsehoods, accusations pour forth that such efforts represent censorship. In reality, the two represent very different concepts, with censorship referring to the repression of ideas in alignment to political, social or moral views, while gatekeeping in its broadest sense refers to efforts to maintain the quality of information published in a given venue. A censor prohibits discussion of topics with which they disagree. A gatekeeper is viewpoint-neutral, ensuring only that the information has been thoroughly vetted and verified...

In the end, both social platforms and society at large must recognize the clear distinction between the dangers of censorship and the benefits of gatekeeping."

The Authoritarian’s Worst Fear? A Book; The New York Times, October 3, 2019

The New York Times; The Authoritarian’s Worst Fear? A Book

""Regimes are expending so much energy attacking books because their supposed limitations have begun to look like strengths: With online surveillance, digital reading carries with it great risks and semi-permanent footprints; a physical book, however, cannot monitor what you are reading and when, cannot track which words you mark or highlight, does not secretly scan your face, and cannot know when you are sharing it with others."

Thursday, October 3, 2019

E.U.’s Top Court Rules Against Facebook in Global Takedown Case; The New York Times, October 3, 2019

The New York Times; E.U.’s Top Court Rules Against Facebook in Global Takedown Case

"The case has been closely watched because of its potential ripple effects for regulating internet content. The enforcement of defamation, libel and privacy laws varies from country to country, with language and behavior that is allowed in one nation prohibited in another. The court’s decision highlights the difficulty of creating uniform standards to govern an inherently borderless web and then enforcing them.

Facebook and other critics have warned that letting a single nation force an internet platform to delete material elsewhere would hurt free expression...

Last week, the European Court of Justice limited the reach of the European privacy law known as the “right to be forgotten,” which allows European citizens to demand Google remove links to sensitive personal data from search results. The court said Google could not be ordered to remove links to websites globally, except in certain circumstances when weighed against the rights to free expression and the public’s right to information."

Wednesday, October 2, 2019

Congress and Trump Agreed They Want a National Privacy Law. It Is Nowhere in Sight.; The New York Times, October 1, 2019

David McCabe, The New York Times; Congress and Trump Agreed They Want a National Privacy Law. It Is Nowhere in Sight.

"But after months of talks, a national privacy law is nowhere in sight...

The struggle to regulate consumer data shows how lawmakers have largely been unable to turn rage at Silicon Valley’s practices into concrete action... 

But the fervor to crack down on Silicon Valley has produced only a single new law, a bill to prevent sex trafficking online...

The United States has some laws that protect consumers’ privacy, like medical information collected by a doctor. But Congress has never set an overarching national standard for how most companies gather and use data. Regulators in Europe, in contrast, put strict new privacy rules into effect last year. 

Many tech companies built lucrative businesses off their users’ personal information, often by offering a “free” product in return."

The Last Hope for Net Neutrality; Slate, October 1, 2019

April Glaser, Slate; The Last Hope for Net Neutrality

A federal appeals court upheld the FCC’s repeal of the open-internet rules. But it allowed for states to save them. 

"It’s confirmed: Net neutrality is legally dead. On Tuesday morning, a federal appeals court reaffirmed the Federal Communications Commission’s repeal of Obama-era net neutrality rules that prohibited internet providers from blocking, slowing down, or speeding up access to websites. In a 200-page decision, the judges on the U.S. Court of Appeals for the D.C. Circuit agreed with FCC Chairman Ajit Pai, who in 2017 vowed to “fire up a weed whacker” and destroy the regulations, which had only been on the books for about two years at the time.

While it’s been legal for internet providers to block access to websites since June 2018, when the FCC’s net neutrality repeal hit the books, advocates and website owners who depend on unfettered consumer access to the web were hopeful that the court would invalidate the repeal. Now, internet providers like Comcast, Verizon, and AT&T can do whatever they want with their customers’ connections and web access as long as they state that they reserve the right to do so in their terms of service. That doesn’t mean the internet is going to change tomorrow, or that Comcast will start throttling with abandon anytime soon. But by allowing telecom companies to sell faster speeds to the websites that can afford it, the deregulation threatens the ideal of the open web—a level playing field that allows anyone to build a website that can reach anyone. 

There is a significant silver lining in Tuesday’s ruling, however: The court struck down the part of the FCC’s 2017 rules that attempted to preempt state net neutrality rules. That reaffirms legislation and executive orders across the country that seek to preserve the pre-2017 status quo in which companies could not mess with websites’ and customers’ access to the internet. Nine states—including Hawaii, Montana, New York, New Jersey, Washington, Rhode Island, California, and Vermont—have passed their own net neutrality rules. Another 27 states have seen legislation proposed to protect net neutrality. More than 100 mayors of cities across the country likewise have pledged not to sign contracts with internet providers that violate net-neutrality principles."

Tuesday, October 1, 2019

Roboethics: The Human Ethics Applied to Robots; Interesting Engineering, September 22, 2019

Interesting Engineering; Roboethics: The Human Ethics Applied to Robots

Who or what is going to be held responsible when or if an autonomous system malfunctions or harms humans?

"On ethics and roboethics

Ethics is the branch of philosophy which studies human conduct, moral assessments, the concepts of good and evil, right and wrong, justice and injustice. The concept of roboethics brings up a fundamental ethical reflection that is related to particular issues and moral dilemmas generated by the development of robotic applications. 

Roboethics --also called machine ethics-- deals with the code of conduct that robotic designer engineers must implement in the Artificial Intelligence of a robot. Through this kind of artificial ethics, roboticists must guarantee that autonomous systems are going to be able to exhibit ethically acceptable behavior in situations where robots or any other autonomous systems such as autonomous vehicles interact with humans.

Ethical issues are going to continue to be on the rise as long as more advanced robotics come into the picture. In The Ethical Landscape of Robotics (PDF) by Pawel Lichocki et al., published by IEEE Robotics and Automation Magazine, the researchers list various ethical issues emerging in two sets of robotic applications: Service robots and lethal robots."

Metro’s ethics changes are welcome. But they’re only a start.; The Washington Post, September 29, 2019

Editorial Board, The Washington Post; Metro’s ethics changes are welcome. But they’re only a start.

"THE REPUTATION of former Metro chairman Jack Evans wasn’t the only thing that was tarnished amid the swirl of allegations that he used his public office to advance his private interests. Public trust in the Metro board was also badly shaken after it completely botched its handling of the allegations. It’s encouraging, then, that the board has taken a first step in its own rehabilitation by amending its code of ethics.
 
“The reforms will improve transparency, accountability and fairness of all parties,” board chairman Paul C. Smedberg said of revisions to the ethics policy that were approved on Thursday. The changes include a clearer definition of conflicts of interest, putting the transit agency’s inspector general in charge of investigations and opening the process to the public with requirements for written reports and discussions held in public."

A Teenager Killed Himself After Being Outed as Bisexual. His Family Wants Justice.; The New York Times, September 30, 2019

The New York Times; A Teenager Killed Himself After Being Outed as Bisexual. His Family Wants Justice.

The family and classmates of Channing Smith, a high school junior, said his death was a result of “social media bullying” and called for a thorough investigation.


"Channing’s death underscores the challenges of combating cyberbullying, which has proliferated in recent years. According to a report last year from the Pew Research Center, 59 percent of teenagers said they had been bullied or harassed online — and many of them thought teachers, social media companies and politicians were failing to help.

In schools across the country, L.G.B.T. students are more likely to be bullied and experience depression than their straight peers, studies have found."

Monday, September 30, 2019

For Vulnerable Populations, the Thorny Ethics of Genetic Data Collection; Undark, September 30, 2019

Undark; For Vulnerable Populations, the Thorny Ethics of Genetic Data Collection

"Research to capture these snapshots, called genome-wide association studies, can only draw conclusions about the data that’s been collected. Without studies that look at each underrepresented population, genetic tests and therapies can’t be tailored to everyone. Still, projects intended as correctives, like All of Us and the International HapMap Project, face an ethical conundrum: Collecting that data could exploit the very people the programs intend to help."

Wednesday, September 25, 2019

‘Nerd,’ ‘Nonsmoker,’ ‘Wrongdoer’: How Might A.I. Label You?; The New York Times, September 20, 2019

The New York Times; ‘Nerd,’ ‘Nonsmoker,’ ‘Wrongdoer’: How Might A.I. Label You?

ImageNet Roulette, a digital art project and viral selfie app, exposes how biases have crept into the artificial-intelligence technologies changing our lives.

"But for Mr. Paglen, a larger issue looms. The fundamental truth is that A.I. learns from humans — and humans are biased creatures. “The way we classify images is a product of our worldview,” he said. “Any kind of classification system is always going to reflect the values of the person doing the classifying.”"

Tuesday, September 24, 2019

For authoritarians, stifling the arts is of a piece with demonising minorities; The Guardian, September 22, 2019

Elif Shafak, The Guardian; For authoritarians, stifling the arts is of a piece with demonising minorities

"Art and literature are – and will be increasingly – at the centre of the new culture wars. What might seem like sporadic, disconnected incidents here and there are in truth manifestations of a similar mindset, a growing wave of bigotry...

In country after country, we have seen enough examples to understand that a divisive and aggressive rhetoric against LGBT and women’s rights is intrinsic to the rise of populist nationalism and populist authoritarianism.

Top-down censorship and the control of art and literature are inseparable components of the hatred and discrimination against sexual minorities, as well as against immigrants and intellectuals. Artists and writers cannot afford not to know what is happening to colleagues in other parts of the world. We cannot afford to be silent."

Kona Stories Book Store to Celebrate Banned Book Week; Big Island Now, September 21, 2019

Big Island Now; Kona Stories Book Store to Celebrate Banned Book Week

"At its heart, Banned Books Week is a celebration of the freedom to access ideas, a fundamental right that belongs to everyone and over which no one person or small group of people should hold sway.

Banned Books Week is a celebration of books, comics, plays, art, journalism and much more. 

At Kona Stories Book Store, books have been wrapped in paper bags to disguise the title. Books are decorated with red “I read banned books” stickers and a brief description of why they are on the list.

Customers are encouraged to buy the books without knowing the titles."

Harry Potter and the Poorly-Read Exorcists; The New York Times, September 23, 2019

The New York Times; Harry Potter and the Poorly-Read Exorcists

"Little surprise, then, that two decades of efforts to protect children from imaginary spells have made no difference at all. Harry Potter titles have sold more the 500 million copies worldwide.

As it happens, this is Banned Books Week in the United States, so the timing of Father Rehill’s ban is richly ironic, but Harry and his friends are no longer the chief targets of book-banning adults, presumably because most adults are now aware that attempting to keep children from reading Harry Potter is about as effective as banning air."


Monday, September 23, 2019

M.I.T. Media Lab, Already Rattled by the Epstein Scandal, Has a New Worry; The New York Times, September 22, 2019

The New York Times; M.I.T. Media Lab, Already Rattled by the Epstein Scandal, Has a New Worry

"Four researchers who worked on OpenAg said in interviews with The New York Times that Mr. Harper had made exaggerated or false claims about the project to its corporate sponsors, a group that included the retail giant Target, as well as in interviews with the news media."

Manifesto Promotes ‘Ethics, Equity, and Empathy’; STREETSBLOGUSA, September 20, 2019

STREETSBLOGUSA; Manifesto Promotes ‘Ethics, Equity, and Empathy’

A design firm publishes a new credo for engineers, policymakers, and planners.

"Maryland-based design firm is seeking to revolutionize the century-old credo that shapes how policymakers and engineers plan communities — in order to force planners to prioritize human beings over automobiles and think deeply about how their decisions affect road safety. 

Toole Design, which has 17 offices in the United States and Canada, last week released a manifesto that seeks to substitute new concepts for the traditional “three Es” — education, enforcement, and engineering — that have guided transportation professionals as they have built the infrastructure of our towns and cities.

The new “three Es” that Toole proposes — “ethics, equity, and empathy”  — replace the object- and rule-centered approach that dominates the discipline with a moral one centered on people."



Saturday, September 21, 2019

'The Personification Of Human Decency': Nina Totenberg Remembers Cokie Roberts; NPR, September 17, 2019

Nina Totenberg, NPR; 'The Personification Of Human Decency': Nina Totenberg Remembers Cokie Roberts

"To know Cokie was to see the personification of human decency. There is a reason she was asked to speak at so many funerals. People felt such a deep connection to her because she touched their lives. Casual friends would find Cokie visiting them in the hospital. People in terrible financial straits would find her bailing them out, hiring them for work that perhaps she did not need, but work that left them with their dignity...

On a larger scale, she was always the voice of people with less power, and the voice of what is right. I remember one day many years ago, when we were in negotiations with NPR management over a labor contract. Management didn't want to extend health care coverage to one group, and we were at an impasse.

Then Cokie, who was working on a piece of embroidery, looked up at the management team and said, "You know, the position you are taking isn't immoral, it is simply amoral." The room got very quiet, and soon the impasse was over."

Banned Books Week Celebrates 'Freedom To Read'; WESA, Pittsburgh's NPR News Station, September 18, 2019

WESA, Pittsburgh's NPR News Station; Banned Books Week Celebrates 'Freedom To Read'

"Christy Bostardi is a member of Book Fairies of Pennsylvania, which hides books around the city for strangers to find and read. She says banning books also has educational implications. 

"We read books for personal reasons... to explore the outside world and learn." Removing books from shelves, she says, prohibits us from being an informed society, but it can have an inverse effect, especially for children, who get excited by the idea of reading a banned book.

The ACLU is also partnering with CMU and the Carnegie Library system to celebrate the "freedom to read" at the Carnegie Lecture Hall in Oakland on Tuesday."

Public libraries across the country face a different type of censorship; Tennessean, September 20, 2019

Kent Oliver, Tennessean; Public libraries across the country face a different type of censorship

"Book censorship impedes access to literature and information. For a public library such as Nashville Public Library, unfettered, undiscriminating access to reading is the core of our work – it is in our library DNA. Access is the key word. Librarians work to ensure access to ideas, popular and unpopular, within financial constraints.

The disturbing issue confronting us for this year’s Banned Books Week, Sep. 22-28, is the restrictions publishers are placing on public libraries making it more difficult to buy e-books and e-audiobooks. In particular, libraries are concerned about a new e-book embargo from Macmillan, one of the biggest book publishers in the industry, set to launch Nov. 1.

Under this new policy, libraries will be limited to purchasing (closer to leasing, really) one copy of a new e-book for eight weeks after release, when demand is at its peak."

Americans’ perceptions about unethical behavior shape how they think about people in powerful roles; Pew Research Center, September 19, 2019

Claire Gecewicz and Lee Rainie, Pew Research Center; Americans’ perceptions about unethical behavior shape how they think about people in powerful roles

"Americans have mixed views of the job performance of those who hold positions of power and responsibility in eight major U.S. groups and institutions. A key element in shaping these views is their sense of whether members of these groups act ethically and hold themselves accountable for their mistakes, according to a new Pew Research Center survey.

The groups studied included members of Congress, local elected officials, leaders of technology companies, journalists, religious leaders, police officers, principals at public K-12 schools and military leaders."

Friday, September 20, 2019

People Are Looking At Your LinkedIn Profile. They Might Be Chinese Spies; NPR, September 19, 2019

Ryan Lucas, NPR; People Are Looking At Your LinkedIn Profile. They Might Be Chinese Spies

"Demers took over leadership of the National Security Division in February 2018 after being confirmed by the Senate. Since taking the helm, he has spent a considerable amount of time on China and what he calls its prolific espionage efforts against the United States.

They're vast in scale, he said, and they span the spectrum from traditional espionage targeting government secrets to economic espionage going after intellectual property and American trade secrets...

It's a play that has also been used to target folks in the business world and academia, where China is hungry for cutting-edge technology and trade secrets. For years, the Chinese intelligence services have hacked into U.S. companies and made off with intellectual property.

Now, U.S. officials say China's spies are increasingly turning to what is known as "nontraditional collectors" — students, researchers and business insiders — to scoop up secrets."

Thursday, September 19, 2019

Time to Build a National Data Broker Registry; The New York Times, September 13, 2019

Jordan Abbott, The New York Times; Time to Build a National Data Broker Registry

A federal clearing house would help separate the good and bad actors who deal in data.

"It’s time for a national data privacy law, one that gives consumers meaningful rights — to know who has their data, how it is used and how to opt out. It’s in our country’s best interest to have a national standard that, done thoughtfully, benefits both consumers and businesses by providing transparency, uniformity and certainty without deterring innovation and competition...

There is much work to do to ensure the ethical use of information in our economy. With a concerted effort to engage in mutual understanding, we can address consumer privacy on the one hand, while supporting the inventive, valuable and responsible uses of data on the other.

Jordan Abbott is the chief data ethics officer for Acxiom."

AI - ethics within data protections legal framework; Lexology, September 4, 2019

Lexology; AI - ethics within data protections legal framework

"Privacy and ethics: hand in hand

AI relies on huge volumes of data and often that data contains personal data. But processing such large volumes of data could be at odds with the legal framework contained in the General Data Protection Regulation and the Data Protection Act 2018, which is based on ethical principles. As Elizabeth Denham, the Information Commissioner, stated: "Ethics is at the root of privacy and is the future of data protection".

Recent headlines draw out some of the main ethical and data protection challenges raised by AI. Interesting examples include the babysitter vetting app, Predictim, designed to create a risk rating based on a potential babysitter's social media and web presence (without his or her knowledge). China's social credit scoring system is another example of how significant decisions are being made automatically about people without any human input and without transparency, explanation or recourse."

UVA receives approval to form School of Data Science with $120M gift; WDBJ7, September 19, 2019

WDBJ7; UVA receives approval to form School of Data Science with $120M gift

"Philip E. Bourne, Professor and Data Science Institute Director, was appointed dean of the School of Data Science by Provost Magill immediately after the vote conferred official status upon the school...

“We envision the new School of Data Science at UVA as a ‘school without walls,’” Bourne said. “In its very structure and foundation, we will build collaborative and interdisciplinary opportunities through partnerships and physical spaces for shared research and education programs. The new school will combine a focus on research, education, and service to build bridges across academic, geographic, commercial and cultural boundaries using responsible, open data science.”

The school also will focus on ethics and the practice of responsible data science, building upon the Data Science Institute’s existing Center for Data Ethics and Justice.

“Data science offers incredible, revolutionary opportunities to understand and make an impact on our world and our future,” Bourne said. “Now it is more important than ever that everyone using those skills and tools – from students just beginning to learn statistics and programming, to leaders working at the cutting edge of the field – understands the importance of using data ethically and responsibly, and putting their skills to work to make a positive impact on society and our world.”"

Wednesday, September 18, 2019

University Launches Ethics-Forward Data Science Major; Washington Square News, NYU's Independent Student Newspaper, September 16, 2019

Akiva Thalheim, Washington Square News, NYU's Independent Student Newspaper; University Launches Ethics-Forward Data Science Major

"The new major seeks to specifically address and prevent these issues through a required course in the ethics of data science, [Center for Data Science Director Julia] Kempe explained. She added that the course was developed with the assistance of a National Science Foundation grant.

“We are hoping to educate young people to be data savvy and also data critical, because nowadays, everything is about data but often it’s done in a very uncritical way,” Kempe said. “We have to understand where the biases are [and] how to use data ethically — it’s something that we want to impart on every student, if we can.""

Tuesday, September 17, 2019

There’s a reason we don’t know much about AI; Politico, September 16, 2019

Arthur Allen, Politico; There’s a reason we don’t know much about AI

The U.S. government used to study new technologies to flag ethical or social problems. Not anymore. 

"Though many university professors and tech-funded think tanks are examining the ethical, social and legal implications of technologies like Big Data and machine learning, “it’s definitely happening outside the policy infrastructure,” said John Wilbanks, a philosopher and technologist at the Sage Bionetworks research group.

Yet these technologies could have profound effects on our future, and they pose enormous questions for society...

As it happens, there is a good precedent for the federal government stepping up to examine the ethical and legal issues around an important new technology. Starting in 1990, the National Institutes of Health set aside 5 percent of the funding for its Human Genome Project for a program known as ELSI—which stood for the ethical, legal and social implications of genetics research.

The ELSI program, which started 30 years ago, “was a symbol that NIH thought the ethical issues were so important in genomics that they’d spend a lot of money on them,” says Isaac Kohane, chief of the Harvard Medical School’s Department of Biomedical Informatics. “It gave other genetics researchers a heads-up—police your ethics, we care about them.”

ELSI’s premise was to have smart social scientists weigh the pros and cons of genetic technology before they emerged, instead of, “Oops, we let the genie out of the bottle,” said Larry Brody, director of Genomics and Society program at the National Human Genome Research Institute."

Real-Time Surveillance Will Test the British Tolerance for Cameras; The New York Times, September 15, 2019

The New York Times; Real-Time Surveillance Will Test the British Tolerance for Cameras

Facial recognition technology is drawing scrutiny in a country more accustomed to surveillance than any other Western democracy. 

"“Technology is driving forward, and legislation and regulation follows ever so slowly behind,” said Tony Porter, Britain’s surveillance camera commissioner, who oversees compliance with the country’s surveillance camera code of practice. “It would be wrong for me to suggest the balance is right.”

Britain’s experience mirrors debates about the technology in the United States and elsewhere in Europe. Critics say the technology is an intrusion of privacy, akin to constant identification checks of an unsuspecting public, and has questionable accuracy, particularly at identifying people who aren’t white men."

Steal This Book? There’s a Price; The New York Times, September 15, 2019

The New York Times; Steal This Book? There’s a Price

I have about 400 offers to buy illegal copies of my own work. Something is very wrong. 

"Maybe, though, it’s too narrow to focus on the way our society has discounted its authors. No doubt musicians, and local retailers, and hometown newspapers, and schoolteachers, and factory workers all feel discounted in much the same way. We have surrendered our lives to technocrat billionaires who once upon a time set out to do no harm and have instead ended up destroying the world as we knew it. Convenience and the lowest possible price, or no price at all, have become our defining values. We have severed ourselves from our communities and from the mutual give-and-take that was once our ordinary daily life. Now we sit alone in our rooms, restlessly scrolling for something free to read."

AI and ethics: The debate that needs to be had; ZDNet, September 16, 2019

ZDNet; AI and ethics: The debate that needs to be had

Like anything, frameworks and boundaries need to be set -- and artificial intelligence should be no different.

"Building ethical AI with diversity

Part of the solution to help overcome these systemic biases that are built into existing AI systems, according to Lazar, is to have open conversations about ethics -- with input from diverse views in terms of culture, gender, age, and socio-economic background -- and how it could be applied to AI.

"What we need to do is figure out how to develop systems that incorporate democratic values and we need to start the discussion within Australian society about what we want those values to be," he said."

TikTok’s Beijing roots fuel censorship suspicion as it builds a huge U.S. audience; The Washington Post, September 15, 2019

Drew Harwell and Tony Romm, The Washington Post; TikTok’s Beijing roots fuel censorship suspicion as it builds a huge U.S. audience

"TikTok’s surging popularity spotlights the tension between the Web’s global powers: the United States, where free speech and competing ideologies are held as (sometimes messy) societal bedrocks, and China, where political criticism is forbidden as troublemaking."

Artificial intelligence in medicine raises legal and ethical concerns; The Conversation, September 4, 2019

The Conversation; Artificial intelligence in medicine raises legal and ethical concerns

"The use of artificial intelligence in medicine is generating great excitement and hope for treatment advances.

AI generally refers to computers’ ability to mimic human intelligence and to learn. For example, by using machine learning, scientists are working to develop algorithms that will help them make decisions about cancer treatment. They hope that computers will be able to analyze radiological images and discern which cancerous tumors will respond well to chemotherapy and which will not.

But AI in medicine also raises significant legal and ethical challenges. Several of these are concerns about privacy, discrimination, psychological harm and the physician-patient relationship. In a forthcoming article, I argue that policymakers should establish a number of safeguards around AI, much as they did when genetic testing became commonplace."