Monday, September 30, 2019

For Vulnerable Populations, the Thorny Ethics of Genetic Data Collection; Undark, September 30, 2019

Undark; For Vulnerable Populations, the Thorny Ethics of Genetic Data Collection

"Research to capture these snapshots, called genome-wide association studies, can only draw conclusions about the data that’s been collected. Without studies that look at each underrepresented population, genetic tests and therapies can’t be tailored to everyone. Still, projects intended as correctives, like All of Us and the International HapMap Project, face an ethical conundrum: Collecting that data could exploit the very people the programs intend to help."

Wednesday, September 25, 2019

‘Nerd,’ ‘Nonsmoker,’ ‘Wrongdoer’: How Might A.I. Label You?; The New York Times, September 20, 2019

The New York Times; ‘Nerd,’ ‘Nonsmoker,’ ‘Wrongdoer’: How Might A.I. Label You?

ImageNet Roulette, a digital art project and viral selfie app, exposes how biases have crept into the artificial-intelligence technologies changing our lives.

"But for Mr. Paglen, a larger issue looms. The fundamental truth is that A.I. learns from humans — and humans are biased creatures. “The way we classify images is a product of our worldview,” he said. “Any kind of classification system is always going to reflect the values of the person doing the classifying.”"

Tuesday, September 24, 2019

For authoritarians, stifling the arts is of a piece with demonising minorities; The Guardian, September 22, 2019

Elif Shafak, The Guardian; For authoritarians, stifling the arts is of a piece with demonising minorities

"Art and literature are – and will be increasingly – at the centre of the new culture wars. What might seem like sporadic, disconnected incidents here and there are in truth manifestations of a similar mindset, a growing wave of bigotry...

In country after country, we have seen enough examples to understand that a divisive and aggressive rhetoric against LGBT and women’s rights is intrinsic to the rise of populist nationalism and populist authoritarianism.

Top-down censorship and the control of art and literature are inseparable components of the hatred and discrimination against sexual minorities, as well as against immigrants and intellectuals. Artists and writers cannot afford not to know what is happening to colleagues in other parts of the world. We cannot afford to be silent."

Kona Stories Book Store to Celebrate Banned Book Week; Big Island Now, September 21, 2019

Big Island Now; Kona Stories Book Store to Celebrate Banned Book Week

"At its heart, Banned Books Week is a celebration of the freedom to access ideas, a fundamental right that belongs to everyone and over which no one person or small group of people should hold sway.

Banned Books Week is a celebration of books, comics, plays, art, journalism and much more. 

At Kona Stories Book Store, books have been wrapped in paper bags to disguise the title. Books are decorated with red “I read banned books” stickers and a brief description of why they are on the list.

Customers are encouraged to buy the books without knowing the titles."

Harry Potter and the Poorly-Read Exorcists; The New York Times, September 23, 2019

The New York Times; Harry Potter and the Poorly-Read Exorcists

"Little surprise, then, that two decades of efforts to protect children from imaginary spells have made no difference at all. Harry Potter titles have sold more the 500 million copies worldwide.

As it happens, this is Banned Books Week in the United States, so the timing of Father Rehill’s ban is richly ironic, but Harry and his friends are no longer the chief targets of book-banning adults, presumably because most adults are now aware that attempting to keep children from reading Harry Potter is about as effective as banning air."


Monday, September 23, 2019

M.I.T. Media Lab, Already Rattled by the Epstein Scandal, Has a New Worry; The New York Times, September 22, 2019

The New York Times; M.I.T. Media Lab, Already Rattled by the Epstein Scandal, Has a New Worry

"Four researchers who worked on OpenAg said in interviews with The New York Times that Mr. Harper had made exaggerated or false claims about the project to its corporate sponsors, a group that included the retail giant Target, as well as in interviews with the news media."

Manifesto Promotes ‘Ethics, Equity, and Empathy’; Streetsblog USA, September 20, 2019

Streetsblog USA; Manifesto Promotes ‘Ethics, Equity, and Empathy’

A design firm publishes a new credo for engineers, policymakers, and planners.

"Maryland-based design firm is seeking to revolutionize the century-old credo that shapes how policymakers and engineers plan communities — in order to force planners to prioritize human beings over automobiles and think deeply about how their decisions affect road safety. 

Toole Design, which has 17 offices in the United States and Canada, last week released a manifesto that seeks to substitute new concepts for the traditional “three Es” — education, enforcement, and engineering — that have guided transportation professionals as they have built the infrastructure of our towns and cities.

The new “three Es” that Toole proposes — “ethics, equity, and empathy”  — replace the object- and rule-centered approach that dominates the discipline with a moral one centered on people."



Saturday, September 21, 2019

'The Personification Of Human Decency': Nina Totenberg Remembers Cokie Roberts; NPR, September 17, 2019

Nina Totenberg, NPR; 'The Personification Of Human Decency': Nina Totenberg Remembers Cokie Roberts

"To know Cokie was to see the personification of human decency. There is a reason she was asked to speak at so many funerals. People felt such a deep connection to her because she touched their lives. Casual friends would find Cokie visiting them in the hospital. People in terrible financial straits would find her bailing them out, hiring them for work that perhaps she did not need, but work that left them with their dignity...

On a larger scale, she was always the voice of people with less power, and the voice of what is right. I remember one day many years ago, when we were in negotiations with NPR management over a labor contract. Management didn't want to extend health care coverage to one group, and we were at an impasse.

Then Cokie, who was working on a piece of embroidery, looked up at the management team and said, "You know, the position you are taking isn't immoral, it is simply amoral." The room got very quiet, and soon the impasse was over."

Banned Books Week Celebrates 'Freedom To Read'; WESA, Pittsburgh's NPR News Station, September 18, 2019

WESA, Pittsburgh's NPR News Station; Banned Books Week Celebrates 'Freedom To Read'

"Christy Bostardi is a member of Book Fairies of Pennsylvania, which hides books around the city for strangers to find and read. She says banning books also has educational implications. 

"We read books for personal reasons... to explore the outside world and learn." Removing books from shelves, she says, prohibits us from being an informed society, but it can have an inverse effect, especially for children, who get excited by the idea of reading a banned book.

The ACLU is also partnering with CMU and the Carnegie Library system to celebrate the "freedom to read" at the Carnegie Lecture Hall in Oakland on Tuesday."

Public libraries across the country face a different type of censorship; Tennessean, September 20, 2019

Kent Oliver, Tennessean; Public libraries across the country face a different type of censorship

"Book censorship impedes access to literature and information. For a public library such as Nashville Public Library, unfettered, undiscriminating access to reading is the core of our work – it is in our library DNA. Access is the key word. Librarians work to ensure access to ideas, popular and unpopular, within financial constraints.

The disturbing issue confronting us for this year’s Banned Books Week, Sep. 22-28, is the restrictions publishers are placing on public libraries making it more difficult to buy e-books and e-audiobooks. In particular, libraries are concerned about a new e-book embargo from Macmillan, one of the biggest book publishers in the industry, set to launch Nov. 1.

Under this new policy, libraries will be limited to purchasing (closer to leasing, really) one copy of a new e-book for eight weeks after release, when demand is at its peak."
 

Americans’ perceptions about unethical behavior shape how they think about people in powerful roles; Pew Research Center, September 19, 2019

Claire Gecewicz and Lee Rainie, Pew Research Center; Americans’ perceptions about unethical behavior shape how they think about people in powerful roles

"Americans have mixed views of the job performance of those who hold positions of power and responsibility in eight major U.S. groups and institutions. A key element in shaping these views is their sense of whether members of these groups act ethically and hold themselves accountable for their mistakes, according to a new Pew Research Center survey.

The groups studied included members of Congress, local elected officials, leaders of technology companies, journalists, religious leaders, police officers, principals at public K-12 schools and military leaders."

Friday, September 20, 2019

People Are Looking At Your LinkedIn Profile. They Might Be Chinese Spies; NPR, September 19, 2019

Ryan Lucas, NPR; People Are Looking At Your LinkedIn Profile. They Might Be Chinese Spies

"Demers took over leadership of the National Security Division in February 2018 after being confirmed by the Senate. Since taking the helm, he has spent a considerable amount of time on China and what he calls its prolific espionage efforts against the United States.

They're vast in scale, he said, and they span the spectrum from traditional espionage targeting government secrets to economic espionage going after intellectual property and American trade secrets...

It's a play that has also been used to target folks in the business world and academia, where China is hungry for cutting-edge technology and trade secrets. For years, the Chinese intelligence services have hacked into U.S. companies and made off with intellectual property.

Now, U.S. officials say China's spies are increasingly turning to what is known as "nontraditional collectors" — students, researchers and business insiders — to scoop up secrets."

Thursday, September 19, 2019

Time to Build a National Data Broker Registry; The New York Times, September 13, 2019

Jordan Abbott, The New York Times; Time to Build a National Data Broker Registry

A federal clearing house would help separate the good and bad actors who deal in data.

"It’s time for a national data privacy law, one that gives consumers meaningful rights — to know who has their data, how it is used and how to opt out. It’s in our country’s best interest to have a national standard that, done thoughtfully, benefits both consumers and businesses by providing transparency, uniformity and certainty without deterring innovation and competition...

There is much work to do to ensure the ethical use of information in our economy. With a concerted effort to engage in mutual understanding, we can address consumer privacy on the one hand, while supporting the inventive, valuable and responsible uses of data on the other.

Jordan Abbott is the chief data ethics officer for Acxiom."

AI - ethics within data protection's legal framework; Lexology, September 4, 2019

Lexology; AI - ethics within data protection's legal framework

"Privacy and ethics: hand in hand

AI relies on huge volumes of data and often that data contains personal data. But processing such large volumes of data could be at odds with the legal framework contained in the General Data Protection Regulation and the Data Protection Act 2018, which is based on ethical principles. As Elizabeth Denham, the Information Commissioner, stated: "Ethics is at the root of privacy and is the future of data protection".

Recent headlines draw out some of the main ethical and data protection challenges raised by AI. Interesting examples include the babysitter vetting app, Predictim, designed to create a risk rating based on a potential babysitter's social media and web presence (without his or her knowledge). China's social credit scoring system is another example of how significant decisions are being made automatically about people without any human input and without transparency, explanation or recourse."
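
To make the transparency concern concrete, here is a minimal, hypothetical Python sketch of an automated "risk rating" of the kind described above. The feature names, weights, and the risk_score function are invented for illustration only and are not Predictim's actual model.

```python
# Hypothetical sketch only -- not Predictim's real model. A "risk rating" is
# computed automatically from assumed social-media features; the person being
# scored never sees the weights and has no way to contest the result, which is
# the transparency and recourse problem the article points to.
def risk_score(profile):
    weights = {
        "profanity_rate": 4.0,     # invented feature names and weights
        "late_night_posts": 2.5,
        "flagged_images": 3.5,
    }
    score = sum(weights[k] * profile.get(k, 0.0) for k in weights)
    return min(10.0, score)        # clamp to a 0-10 rating

candidate = {"profanity_rate": 0.3, "late_night_posts": 0.8, "flagged_images": 0.1}
print(f"Automated rating: {risk_score(candidate):.1f} / 10")  # no explanation, no appeal
```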

UVA receives approval to form School of Data Science with $120M gift; WDBJ7, September 19, 2019

WDBJ7; UVA receives approval to form School of Data Science with $120M gift

"Philip E. Bourne, Professor and Data Science Institute Director, was appointed dean of the School of Data Science by Provost Magill immediately after the vote conferred official status upon the school...

“We envision the new School of Data Science at UVA as a ‘school without walls,’” Bourne said. “In its very structure and foundation, we will build collaborative and interdisciplinary opportunities through partnerships and physical spaces for shared research and education programs. The new school will combine a focus on research, education, and service to build bridges across academic, geographic, commercial and cultural boundaries using responsible, open data science.”

The school also will focus on ethics and the practice of responsible data science, building upon the Data Science Institute’s existing Center for Data Ethics and Justice.

“Data science offers incredible, revolutionary opportunities to understand and make an impact on our world and our future,” Bourne said. “Now it is more important than ever that everyone using those skills and tools – from students just beginning to learn statistics and programming, to leaders working at the cutting edge of the field – understands the importance of using data ethically and responsibly, and putting their skills to work to make a positive impact on society and our world.”"

Wednesday, September 18, 2019

University Launches Ethics-Forward Data Science Major; Washington Square News, NYU's Independent Student Newspaper, September 16, 2019

Akiva Thalheim, Washington Square News, NYU's Independent Student Newspaper; University Launches Ethics-Forward Data Science Major

"The new major seeks to specifically address and prevent these issues through a required course in the ethics of data science, [Center for Data Science Director Julia] Kempe explained. She added that the course was developed with the assistance of a National Science Foundation grant.

“We are hoping to educate young people to be data savvy and also data critical, because nowadays, everything is about data but often it’s done in a very uncritical way,” Kempe said. “We have to understand where the biases are [and] how to use data ethically — it’s something that we want to impart on every student, if we can.""

Tuesday, September 17, 2019

There’s a reason we don’t know much about AI; Politico, September 16, 2019

Arthur Allen, Politico; There’s a reason we don’t know much about AI

The U.S. government used to study new technologies to flag ethical or social problems. Not anymore. 

"Though many university professors and tech-funded think tanks are examining the ethical, social and legal implications of technologies like Big Data and machine learning, “it’s definitely happening outside the policy infrastructure,” said John Wilbanks, a philosopher and technologist at the Sage Bionetworks research group.

Yet these technologies could have profound effects on our future, and they pose enormous questions for society...

As it happens, there is a good precedent for the federal government stepping up to examine the ethical and legal issues around an important new technology. Starting in 1990, the National Institutes of Health set aside 5 percent of the funding for its Human Genome Project for a program known as ELSI—which stood for the ethical, legal and social implications of genetics research.

The ELSI program, which started 30 years ago, “was a symbol that NIH thought the ethical issues were so important in genomics that they’d spend a lot of money on them,” says Isaac Kohane, chief of the Harvard Medical School’s Department of Biomedical Informatics. “It gave other genetics researchers a heads-up—police your ethics, we care about them.”

ELSI’s premise was to have smart social scientists weigh the pros and cons of genetic technology before they emerged, instead of, “Oops, we let the genie out of the bottle,” said Larry Brody, director of Genomics and Society program at the National Human Genome Research Institute."

Real-Time Surveillance Will Test the British Tolerance for Cameras; The New York Times, September 15, 2019

The New York Times; Real-Time Surveillance Will Test the British Tolerance for Cameras

Facial recognition technology is drawing scrutiny in a country more accustomed to surveillance than any other Western democracy. 

"“Technology is driving forward, and legislation and regulation follows ever so slowly behind,” said Tony Porter, Britain’s surveillance camera commissioner, who oversees compliance with the country’s surveillance camera code of practice. “It would be wrong for me to suggest the balance is right.”

Britain’s experience mirrors debates about the technology in the United States and elsewhere in Europe. Critics say the technology is an intrusion of privacy, akin to constant identification checks of an unsuspecting public, and has questionable accuracy, particularly at identifying people who aren’t white men."
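
The accuracy criticism above is usually surfaced by disaggregating results by demographic group rather than reporting a single overall figure. Below is a minimal Python sketch of that kind of audit; the group labels and match results are invented placeholders, not data from any real system.

```python
# Minimal sketch of a disaggregated accuracy audit (hypothetical data):
# compute match accuracy per demographic group instead of one overall number,
# which is how uneven performance across groups is typically exposed.
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, predicted_id, true_id) tuples."""
    hits, totals = defaultdict(int), defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        hits[group] += int(predicted == actual)
    return {g: hits[g] / totals[g] for g in totals}

sample = [  # invented results for illustration
    ("group_a", "id1", "id1"), ("group_a", "id2", "id2"),
    ("group_b", "id3", "id9"), ("group_b", "id4", "id4"),
]
for group, acc in accuracy_by_group(sample).items():
    print(f"{group}: {acc:.0%} of matches correct")
```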

Steal This Book? There’s a Price; The New York Times, September 15, 2019

The New York Times; Steal This Book? There’s a Price

I have about 400 offers to buy illegal copies of my own work. Something is very wrong. 

"Maybe, though, it’s too narrow to focus on the way our society has discounted its authors. No doubt musicians, and local retailers, and hometown newspapers, and schoolteachers, and factory workers all feel discounted in much the same way. We have surrendered our lives to technocrat billionaires who once upon a time set out to do no harm and have instead ended up destroying the world as we knew it. Convenience and the lowest possible price, or no price at all, have become our defining values. We have severed ourselves from our communities and from the mutual give-and-take that was once our ordinary daily life. Now we sit alone in our rooms, restlessly scrolling for something free to read."

AI and ethics: The debate that needs to be had; ZDNet, September 16, 2019

ZDNet; AI and ethics: The debate that needs to be had

Like anything, frameworks and boundaries need to be set -- and artificial intelligence should be no different.

"Building ethical AI with diversity

Part of the solution to help overcome these systemic biases that are built into existing AI systems, according to Lazar, is to have open conversations about ethics -- with input from diverse views in terms of culture, gender, age, and socio-economic background -- and how it could be applied to AI.

"What we need to do is figure out how to develop systems that incorporate democratic values and we need to start the discussion within Australian society about what we want those values to be," he said."

TikTok’s Beijing roots fuel censorship suspicion as it builds a huge U.S. audience; The Washington Post, September 15, 2019

Drew Harwell and Tony Romm, The Washington Post; TikTok’s Beijing roots fuel censorship suspicion as it builds a huge U.S. audience

"TikTok’s surging popularity spotlights the tension between the Web’s global powers: the United States, where free speech and competing ideologies are held as (sometimes messy) societal bedrocks, and China, where political criticism is forbidden as troublemaking."

Artificial intelligence in medicine raises legal and ethical concerns; The Conversation, September 4, 2019

The Conversation; Artificial intelligence in medicine raises legal and ethical concerns

"The use of artificial intelligence in medicine is generating great excitement and hope for treatment advances.

AI generally refers to computers’ ability to mimic human intelligence and to learn. For example, by using machine learning, scientists are working to develop algorithms that will help them make decisions about cancer treatment. They hope that computers will be able to analyze radiological images and discern which cancerous tumors will respond well to chemotherapy and which will not.

But AI in medicine also raises significant legal and ethical challenges. Several of these are concerns about privacy, discrimination, psychological harm and the physician-patient relationship. In a forthcoming article, I argue that policymakers should establish a number of safeguards around AI, much as they did when genetic testing became commonplace."

Monday, September 16, 2019

Maths and tech specialists need Hippocratic oath, says academic; The Guardian, August 16, 2019

Ian Sample, The Guardian; Maths and tech specialists need Hippocratic oath, says academic

"“We need a Hippocratic oath in the same way it exists for medicine,” Fry said. “In medicine, you learn about ethics from day one. In mathematics, it’s a bolt-on at best. It has to be there from day one and at the forefront of your mind in every step you take.”...

The genetics testing firm 23andMe was a case in point, she said.

“We literally hand over our most private data, our DNA, but we’re not just consenting for ourselves, we are consenting for our children, and our children’s children. Maybe we don’t live in a world where people are genetically discriminated against now, but who’s to say in 100 years that we won’t? And we are paying to add our DNA to that dataset.”"

Sunday, September 15, 2019

Northeastern researchers team up with Accenture to offer a road map for artificial intelligence ethics oversight; Northeastern University, August 29, 2019

Khalida Sarwari, Northeastern University; Northeastern researchers team up with Accenture to offer a road map for artificial intelligence ethics oversight

"Now, Northeastern professors John Basl and Ron Sandler are offering organizations guidance for how to create a well-designed and effective committee based on similar models used in biomedical research. 

Maintaining that an ethics committee that is adequately resourced and thoughtfully designed can play an important role in mitigating digital risks and maintaining trust between an organization and the public, the researchers provide a framework for such a system in a new report produced in collaboration with global professional services company Accenture...

“If you want to build a committee that works effectively and if you really want to build ethical capacity within an organization, it’s a significant undertaking where you can’t just throw together a few people with ethical expertise,” says Sandler.

Added Basl: “We lay out the kinds of experts an organization will need—someone who knows local laws, someone who knows ethics, a variety of technical experts, and members of an affected community. Who those individuals are, or what their particular expertise is, depends on the kind of technology being developed and deployed.”"

Building data and AI ethics committees; Accenture.com, August 20, 2019

Accenture.com; Building data and AI ethics committees

"In brief
  • Organizations face a difficult challenge when it comes to ethically-informed data collection, sharing and use.
  • There is growing demand for incorporating ethical considerations into products and services involving big data, AI and machine learning.
  • Outside of mere legal compliance, there is little guidance on how to incorporate this ethical consideration.
  • To fill this gap, Northeastern University and Accenture explore the development of effective and well-functioning data and AI ethics committees."

Saturday, September 14, 2019

How to Build an AI Ethics Committee; The Wall Street Journal, August 30, 2019

Jared Council, The Wall Street Journal; How to Build an AI Ethics Committee

New road map provides guidelines for starting an ethics committee for artificial-intelligence and data concerns


"A new guidebook aims to help organizations set up data and artificial-intelligence ethics committees and better deal with the ethical issues associated with the technology."

CS department hires Ethics TAs; The Brown Daily Herald, September 5, 2019

Sarah Wang, The Brown Daily Herald; CS department hires Ethics TAs

Newly-hired teaching assistants to integrate ethics into computer science

"Last spring, the Department of Computer Science announced the inaugural hiring of 10 Ethics Teaching Assistants, who will develop and deliver curricula around ethics and society in five of the department’s largest courses. 

The department created the ETA program to acknowledge the impact that the products and services created by computer scientists have on society, said Professor of Computer Science and Department Chair Ugur Cetintemel. Cetintemel, who helped spearhead the program, said that it was important for concentrators to think critically about the usage and possible misuse of the solutions they build. “We want our concentrators to think about the ethical and societal implications of what they do, not as an afterthought but as another fundamental dimension they should consider as they develop their work,” he said...

The CS department is already incorporating ethics into its curriculum through multiple courses such as CSCI 1951I: “CS for Social Change,” but the department hopes ETAs will encourage students to view the topic as a more fundamental aspect of CS."


Daily Nous; Philosophers Win €17.9 Million Grant for Study of the Ethics of Disruptive Technologies


"A project on the ethics of socially disruptive technologies, led by Philip Brey, professor of philosophy of technology at the Department of Philosophy at the University of Twente and scientific director of the 4TU.Centre for Ethics and Technology, has received a €17.9 million (approximately $19.6 million) grant from the Dutch Ministry of Education, Culture and Science’s Gravitation program.

The other researchers involved in the 10-year project are Ibo van de Poel (TU Delft), Ingrid Robeyns (Utrecht University), Sabine Roeser (TU Delft), Peter-Paul Verbeek (University of Twente), and Wijnand IJsselsteijn (TU Eindhoven).

Socially disruptive technologies include artificial intelligence, robotics, nanomedicine, molecular biology, neurotechnology, and climate technology, among other things. A press release from the University of Twente describes what the researchers will be working on:

They will be developing new methods needed not only to better understand the development and implementation of the new generation of disruptive technologies, but also to evaluate them from a moral perspective and to intervene in the way technology continues to develop. This includes the development of an approach to ethical and philosophical aspects of a disruptive technology that is widely applicable. Another important aspect of the program is the cooperation between ethicists, philosophers and technical scientists aimed at finding better methods for responsible and sustainable innovation. One objective of the programme is to innovate ethics and philosophy in the broadest sense by researching how classical ethical values and philosophical concepts are being challenged by modern technology."

Orwellabama? Crimson Tide Track Locations to Keep Students at Games; The New York Times, September 12, 2019

The New York Times; Orwellabama? Crimson Tide Track Locations to Keep Students at Games

Coach Nick Saban gets peeved at students leaving routs early. An app ties sticking around to playoff tickets, but also prompts concern from students and privacy watchdogs.

"Greg Byrne, Alabama’s athletic director, said privacy concerns rarely came up when the program was being discussed with other departments and student groups. Students who download the Tide Loyalty Points app will be tracked only inside the stadium, he said, and they can close the app — or delete it — once they leave the stadium. “If anybody has a phone, unless you’re in airplane mode or have it off, the cellular companies know where you are,” he said.

Thursday, September 12, 2019

The misinformation age; Axios, September 12, 2019

Scott Rosenberg, David Nather, Axios; The misinformation age


"Hostile powers undermining elections. Deepfake video and audio. Bots and trolls, phishing and fake news — plus of course old-fashioned spin and lies. 

Why it matters: The sheer volume of assaults on fact and truth is undermining trust not just in politics and government, but also in business, tech, science and health care as well.
  • Beginning with this article, Axios is launching a series to help you navigate this new avalanche of misinformation, and illuminate its impact on America and the globe, through 2020 and beyond.
Our culture now broadly distrusts most claims to truth. Majorities of Americans say they've lost trust in the federal government and each other — and think that lack of trust gets in the way of solving problems, according to a Pew Research Center survey."

'Ethics slam' packs pizzeria; The Herald Journal, September 10, 2019

 Ashtyn Asay, The Herald Journal; 'Ethics slam' packs pizzeria

"The ethics slam was sponsored by the Weber State University Richard Richards Institute for Ethics, the USU Philosophy Club, and the Society for Women in Philosophy. It was organized by Robison-Greene and her husband, Richard Greene, a professor of philosophy and director of the Richard Richards Institute for Ethics.

This is the seventh ethics slam put together by Greene and Robison-Greene, whose collective goal is to encourage civil discourse and generate rich conversations within a respectful community.

This goal appeared to be met on Monday evening, as ethics slam participants engaged in polite conversation and debate for almost two hours. Opinions were challenged and controversial points were made, but Greene and Robison-Greene kept the conversation on track...

"The next ethics slam will be at 7 p.m. Sept. 23 at the Pleasant Valley Branch of the Weber County Library. The topic of discussion will be: “Is censorship ever appropriate?”"

Māori anger as Air New Zealand seeks to trademark 'Kia Ora' logo; The Guardian, September 12, 2019

Eleanor Ainge Roy, The Guardian; Māori anger as Air New Zealand seeks to trademark 'Kia Ora' logo

"New Zealand’s national carrier, Air New Zealand, has offended the country’s Māori people by attempting to trademark an image of the words “kia ora”; the greeting for hello."

Wednesday, September 11, 2019

How an Élite University Research Center Concealed Its Relationship with Jeffrey Epstein; The New Yorker, September 6, 2019

Ronan Farrow, The New Yorker; How an Élite University Research Center Concealed Its Relationship with Jeffrey Epstein

New documents show that the M.I.T. Media Lab was aware of Epstein’s status as a convicted sex offender, and that Epstein directed contributions to the lab far exceeding the amounts M.I.T. has publicly admitted.

"Current and former faculty and staff of the media lab described a pattern of concealing Epstein’s involvement with the institution. Signe Swenson, a former development associate and alumni coordinator at the lab, told me that she resigned in 2016 in part because of her discomfort about the lab’s work with Epstein. She said that the lab’s leadership made it explicit, even in her earliest conversations with them, that Epstein’s donations had to be kept secret...

Swenson said that, even though she resigned over the lab’s relationship with Epstein, her participation in what she took to be a coverup of his contributions has weighed heavily on her since. Her feelings of guilt were revived when she learned of recent statements from Ito and M.I.T. leadership that she believed to be lies. “I was a participant in covering up for Epstein in 2014,” she told me. “Listening to what comments are coming out of the lab or M.I.T. about the relationship—I just see exactly the same thing happening again.”"

He Who Must Not Be Tolerated; The New York Times, September 8, 2019

Kara Swisher, The New York Times; He Who Must Not Be Tolerated

Joi Ito’s fall from grace for his relationship with Jeffrey Epstein was much deserved. But his style of corner-cutting ethics is all too common in tech. 

"Voldemort? 

Of all the terrible details of the gross fraud that the former head of the M.I.T. Media Lab, Joichi Ito, and his minions perpetrated in trying to cover up donations by Jeffrey Epstein to the high-profile tech research lab, perhaps giving a pedophile a nickname of a character in a book aimed at children was the most awful.

“The effort to conceal the lab’s contact with Epstein was so widely known that some staff in the office of the lab’s director, Joi Ito, referred to Epstein as Voldemort or ‘he who must not be named,’ ” wrote Ronan Farrow in The New Yorker, in his eviscerating account of the moral and leadership failings of one of the digital industry’s top figures."

The Moral Rot of the MIT Media Lab; Slate, September 8, 2019

Justin Peters, Slate; The Moral Rot of the MIT Media Lab

"Over the course of the past century, MIT became one of the best brands in the world, a name that confers instant credibility and stature on all who are associated with it. Rather than protect the inherent specialness of this brand, the Media Lab soiled it again and again by selling its prestige to banks, drug companies, petroleum companies, carmakers, multinational retailers, at least one serial sexual predator, and others who hoped to camouflage their avarice with the sheen of innovation. There is a big difference between taking money from someone like Epstein and taking it from Nike or the Department of Defense, but the latter choices pave the way for the former."

Thursday, September 5, 2019

AI Ethics Guidelines Every CIO Should Read; Information Week, August 7, 2019

John McClurg, Information Week; AI Ethics Guidelines Every CIO Should Read

"Technology experts predict the rate of adoption of artificial intelligence and machine learning will skyrocket in the next two years. These advanced technologies will spark unprecedented business gains, but along the way enterprise leaders will be called to quickly grapple with a smorgasbord of new ethical dilemmas. These include everything from AI algorithmic bias and data privacy issues to public safety concerns from autonomous machines running on AI.

Because AI technology and use cases are changing so rapidly, chief information officers and other executives are going to find it difficult to keep ahead of these ethical concerns without a roadmap. To guide both deep thinking and rapid decision-making about emerging AI technologies, organizations should consider developing an internal AI ethics framework."

Does the data industry need a code of ethics?; The Scotsman, August 29, 2019

David Lee, The Scotsman; Does the data industry need a code of ethics?

"Docherty says the whole area of data ethics is still emerging: “It’s where all the hype is now – it used to be big data that everyone talked about, now it’s data ethics. It’s fundamental, and embedding it across an organisation will give competitive advantage.”

So what is The Data Lab, set up in 2015, doing itself in this ethical space? “We’re ensuring data ethics training is baked in to the core technology training of all Masters students, so they are asking all the right questions,” says Docherty."

Teaching ethics in computer science the right way with Georgia Tech's Charles Isbell; TechCrunch, September 5, 2019

Greg Epstein, TechCrunch; Teaching ethics in computer science the right way with Georgia Tech's Charles Isbell

"The new fall semester is upon us, and at elite private colleges and universities, it’s hard to find a trendier major than Computer Science. It’s also becoming more common for such institutions to prioritize integrating ethics into their CS studies, so students don’t just learn about how to build software, but whether or not they should build it in the first place. Of course, this begs questions about how much the ethics lessons such prestigious schools are teaching are actually making a positive impression on students.

But at a time when demand for qualified computer scientists is skyrocketing around the world and far exceeds supply, another kind of question might be even more important: Can computer science be transformed from a field largely led by elites into a profession that empowers vastly more working people, and one that trains them in a way that promotes ethics and an awareness of their impact on the world around them?

Enter Charles Isbell of Georgia Tech, a humble and unassuming star of inclusive and ethical computer science. Isbell, a longtime CS professor at Georgia Tech, enters this fall as the new Dean and John P. Imlay Chair of Georgia Tech’s rapidly expanding College of Computing."

His Cat’s Death Left Him Heartbroken. So He Cloned It.; The New York Times, September 4, 2019

The New York Times; His Cat’s Death Left Him Heartbroken. So He Cloned It.

"China’s genetics know-how is growing rapidly. Ever since Chinese scientists cloned a female goat in 2000, they have succeeded in producing the world’s first primate clones, editing the embryos of monkeys to insert genes associated with autism and mental illness, and creating superstrong dogs by tinkering with their genes. Last year, the country stunned the world after a Chinese scientist announced that he had created the world’s first genetically edited babies.

Pet cloning is largely unregulated and controversial where it is done, but in China the barriers are especially low. Many Chinese people do not think that using animals for medical research or cosmetics testing is cruel, or that pet cloning is potentially problematic. There are also no laws against animal cruelty."

Wednesday, September 4, 2019

What's The Difference Between Compliance And Ethics?; Forbes, May 9, 2019

Bruce Weinstein, Forbes; What's The Difference Between Compliance And Ethics?

 "As important as both compliance and ethics are, ethics holds us to a higher standard, in my view. It's crucial to respect your institution's rules and policies, as well as the relevant laws and regulations, but your duties don't stop there.

The Ethics of Hiding Your Data From the Machines; Wired, August 22, 2019

Molly Wood, Wired; The Ethics of Hiding Your Data From the Machines

"In the case of the company I met with, the data collection they’re doing is all good. They want every participant in their longitudinal labor study to opt in, and to be fully informed about what’s going to happen with the data about this most precious and scary and personal time in their lives.

But when I ask what’s going to happen if their company is ever sold, they go a little quiet."

Regulators Fine Google $170 Million for Violating Children’s Privacy on YouTube; The New York Times, September 4, 2019

Natasha Singer and , The New York Times; Regulators Fine Google $170 Million for Violating Children’s Privacy on YouTube

"Google on Wednesday agreed to pay a record $170 million fine and to make changes to protect children’s privacy on YouTube, as regulators said the video site had knowingly and illegally harvested personal information from youngsters and used that data to profit by targeting them with ads.

The measures were part of a settlement with the Federal Trade Commission and New York’s attorney general. They said YouTube had violated a federal children’s privacy law known as the Children’s Online Privacy Protection Act, or COPPA."

'Sense of urgency', as top tech players seek AI ethical rules; techxplore.com, September 2, 2019

techxplore.com; 'Sense of urgency', as top tech players seek AI ethical rules

"Some two dozen high-ranking representatives of the global and Swiss economies, as well as scientists and academics, met in Geneva for the first Swiss Global Digital Summit aimed at seeking agreement on ethical guidelines to steer ...

Microsoft president Brad Smith insisted on the importance that "technology be guided by values, and that those values be translated into principles and that those principles be pursued by concrete steps."

"We are the first generation of people who have the power to build machines with the capability to make decisions that have in the past only been made by people," he told reporters.

He stressed the need for "transparency" and "accountability ... to ensure that the people who create technology, including at companies like the one I work for remain accountable to the public at large."

"We need to start taking steps (towards ethical standards) with a sense of urgency," he said."

MIT developed a course to teach tweens about the ethics of AI; Quartz, September 4, 2019

Jenny Anderson, Quartz; MIT developed a course to teach tweens about the ethics of AI

"This summer, Blakeley Payne, a graduate student at MIT, ran a week-long course on ethics in artificial intelligence for 10-14 year olds. In one exercise, she asked the group what they thought YouTube’s recommendation algorithm was used for.

“To get us to see more ads,” one student replied.

“These kids know way more than we give them credit for,” Payne said.

Payne created an open source, middle-school AI ethics curriculum to make kids aware of how AI systems mediate their everyday lives, from YouTube and Amazon’s Alexa to Google search and social media. By starting early, she hopes the kids will become more conscious of how AI is designed and how it can manipulate them. These lessons also help prepare them for the jobs of the future, and potentially become AI designers rather than just consumers."