Wednesday, October 2, 2019

Congress and Trump Agreed They Want a National Privacy Law. It Is Nowhere in Sight.; The New York Times, October 1, 2019

David McCabe, The New York Times; Congress and Trump Agreed They Want a National Privacy Law. It Is Nowhere in Sight.




"But after months of talks, a national privacy law is nowhere in sight...

The struggle to regulate consumer data shows how lawmakers have largely been unable to turn rage at Silicon Valley’s practices into concrete action... 

But the fervor to crack down on Silicon Valley has produced only a single new law, a bill to prevent sex trafficking online...

The United States has some laws that protect consumers’ privacy, like medical information collected by a doctor. But Congress has never set an overarching national standard for how most companies gather and use data. Regulators in Europe, in contrast, put strict new privacy rules into effect last year. 

Many tech companies built lucrative businesses off their users’ personal information, often by offering a “free” product in return.”

The Last Hope for Net Neutrality; Slate, October 1, 2019

April Glaser, Slate; The Last Hope for Net Neutrality

A federal appeals court upheld the FCC’s repeal of the open-internet rules. But it allowed for states to save them. 

 

"It’s confirmed: Net neutrality is legally dead. On Tuesday morning, a federal appeals court reaffirmed the Federal Communications Commission’s repeal of Obama-era net neutrality rules that prohibited internet providers from blocking, slowing down, or speeding up access to websites. In a 200-page decision, the judges on the U.S. Court of Appeals for the D.C. Circuit agreed with FCC Chairman Ajit Pai, who in 2017 vowed to “fire up a weed whacker” and destroy the regulations, which had only been on the books for about two years at the time.

 

While it’s been legal for internet providers to block access to websites since June 2018, when the FCC’s net neutrality repeal hit the books, advocates and website owners who depend on unfettered consumer access to the web were hopeful that the court would invalidate the repeal. Now, internet providers like Comcast, Verizon, and AT&T can do whatever they want with their customers’ connections and web access as long as they state that they reserve the right to do so in their terms of service. That doesn’t mean the internet is going to change tomorrow, or that Comcast will start throttling with abandon anytime soon. But by allowing telecom companies to sell faster speeds to the websites that can afford it, the deregulation threatens the ideal of the open web—a level playing field that allows anyone to build a website that can reach anyone. 

 

There is a significant silver lining in Tuesday’s ruling, however: The court struck down the part of the FCC’s 2017 rules that attempted to preempt state net neutrality rules. That reaffirms legislation and executive orders across the country that seek to preserve the pre-2017 status quo in which companies could not mess with websites’ and customers’ access to the internet. Nine states—Hawaii, Montana, New York, New Jersey, Washington, Rhode Island, California, Oregon, and Vermont—have passed their own net neutrality rules. Another 27 states have seen legislation proposed to protect net neutrality. More than 100 mayors of cities across the country likewise have pledged not to sign contracts with internet providers that violate net-neutrality principles."

Tuesday, October 1, 2019

Roboethics: The Human Ethics Applied to Robots; Interesting Engineering, September 22, 2019

Interesting Engineering; Roboethics: The Human Ethics Applied to Robots

Who or what is going to be held responsible when or if an autonomous system malfunctions or harms humans?
"On ethics and roboethics 

Ethics is the branch of philosophy which studies human conduct, moral assessments, the concepts of good and evil, right and wrong, justice and injustice. The concept of roboethics brings up a fundamental ethical reflection that is related to particular issues and moral dilemmas generated by the development of robotic applications. 

Roboethics --also called machine ethics-- deals with the code of conduct that robotic designer engineers must implement in the Artificial Intelligence of a robot. Through this kind of artificial ethics, roboticists must guarantee that autonomous systems are going to be able to exhibit ethically acceptable behavior in situations where robots or any other autonomous systems such as autonomous vehicles interact with humans.

Ethical issues are going to continue to be on the rise as long as more advanced robotics come into the picture. In The Ethical Landscape of Robotics (PDF) by Pawel Lichocki et al., published by IEEE Robotics and Automation Magazine, the researchers list various ethical issues emerging in two sets of robotic applications: Service robots and lethal robots."

Metro’s ethics changes are welcome. But they’re only a start.; The Washington Post, September 29, 2019

Editorial Board, The Washington Post; Metro’s ethics changes are welcome. But they’re only a start.

"THE REPUTATION of former Metro chairman Jack Evans wasn’t the only thing that was tarnished amid the swirl of allegations that he used his public office to advance his private interests. Public trust in the Metro board was also badly shaken after it completely botched its handling of the allegations. It’s encouraging, then, that the board has taken a first step in its own rehabilitation by amending its code of ethics.
 
“The reforms will improve transparency, accountability and fairness of all parties,” board chairman Paul C. Smedberg said of revisions to the ethics policy that were approved on Thursday. The changes include a clearer definition of conflicts of interests, putting the transit agency’s inspector general in charge of investigations and opening the process to the public with requirements for written reports and discussions held in public."

A Teenager Killed Himself After Being Outed as Bisexual. His Family Wants Justice.; The New York Times, September 30, 2019

The New York Times; A Teenager Killed Himself After Being Outed as Bisexual. His Family Wants Justice.

The family and classmates of Channing Smith, a high school junior, said his death was a result of “social media bullying” and called for a thorough investigation.


"Channing’s death underscores the challenges of combating cyberbullying, which has proliferated in recent years. According to a report last year from the Pew Research Center, 59 percent of teenagers said they had been bullied or harassed online — and many of them thought teachers, social media companies and politicians were failing to help.

In schools across the country, L.G.B.T. students are more likely to be bullied and experience depression than their straight peers, studies have found."

Monday, September 30, 2019

For Vulnerable Populations, the Thorny Ethics of Genetic Data Collection; Undark, September 30, 2019

Undark; For Vulnerable Populations, the Thorny Ethics of Genetic Data Collection

"Research to capture these snapshots, called genome-wide association studies, can only draw conclusions about the data that’s been collected. Without studies that look at each underrepresented population, genetic tests and therapies can’t be tailored to everyone. Still, projects intended as correctives, like All of Us and the International HapMap Project, face an ethical conundrum: Collecting that data could exploit the very people the programs intend to help."

Wednesday, September 25, 2019

‘Nerd,’ ‘Nonsmoker,’ ‘Wrongdoer’: How Might A.I. Label You?; The New York Times, September 20, 2019

The New York Times; ‘Nerd,’ ‘Nonsmoker,’ ‘Wrongdoer’: How Might A.I. Label You?

ImageNet Roulette, a digital art project and viral selfie app, exposes how biases have crept into the artificial-intelligence technologies changing our lives.

"But for Mr. Paglen, a larger issue looms. The fundamental truth is that A.I. learns from humans — and humans are biased creatures. “The way we classify images is a product of our worldview,” he said. “Any kind of classification system is always going to reflect the values of the person doing the classifying.”"

Tuesday, September 24, 2019

For authoritarians, stifling the arts is of a piece with demonising minorities; The Guardian, September 22, 2019

Elif Shafak, The Guardian; For authoritarians, stifling the arts is of a piece with demonising minorities

"Art and literature are – and will be increasingly – at the centre of the new culture wars. What might seem like sporadic, disconnected incidents here and there are in truth manifestations of a similar mindset, a growing wave of bigotry...

In country after country, we have seen enough examples to understand that a divisive and aggressive rhetoric against LGBT and women’s rights is intrinsic to the rise of populist nationalism and populist authoritarianism.

Top-down censorship and the control of art and literature are inseparable components of the hatred and discrimination against sexual minorities, as well as against immigrants and intellectuals. Artists and writers cannot afford not to know what is happening to colleagues in other parts of the world. We cannot afford to be silent."

Kona Stories Book Store to Celebrate Banned Book Week; Big Island Now, September 21, 2019

Big Island Now; Kona Stories Book Store to Celebrate Banned Book Week

"At its heart, Banned Books Week is a celebration of the freedom to access ideas, a fundamental right that belongs to everyone and over which no one person or small group of people should hold sway.

Banned Books Week is a celebration of books, comics, plays, art, journalism and much more. 

At Kona Stories Book Store, books have been wrapped in paper bags to disguise the title. Books are decorated with red “I read banned books” stickers and a brief description of why they are on the list.

Customers are encouraged to buy the books without knowing the titles."

Harry Potter and the Poorly-Read Exorcists; The New York Times, September 23, 2019

The New York Times; Harry Potter and the Poorly-Read Exorcists

"Little surprise, then, that two decades of efforts to protect children from imaginary spells have made no difference at all. Harry Potter titles have sold more the 500 million copies worldwide.

As it happens, this is Banned Books Week in the United States, so the timing of Father Rehill’s ban is richly ironic, but Harry and his friends are no longer the chief targets of book-banning adults, presumably because most adults are now aware that attempting to keep children from reading Harry Potter is about as effective as banning air."


Monday, September 23, 2019

M.I.T. Media Lab, Already Rattled by the Epstein Scandal, Has a New Worry; The New York Times, September 22, 2019

The New York Times; M.I.T. Media Lab, Already Rattled by the Epstein Scandal, Has a New Worry

"Four researchers who worked on OpenAg said in interviews with The New York Times that Mr. Harper had made exaggerated or false claims about the project to its corporate sponsors, a group that included the retail giant Target, as well as in interviews with the news media."

Manifesto Promotes ‘Ethics, Equity, and Empathy’; STREETSBLOGUSA, September 20, 2019



STREETSBLOGUSA; Manifesto Promotes ‘Ethics, Equity, and Empathy’


A design firm publishes a new credo for engineers, policymakers, and planners.

"Maryland-based design firm is seeking to revolutionize the century-old credo that shapes how policymakers and engineers plan communities — in order to force planners to prioritize human beings over automobiles and think deeply about how their decisions affect road safety. 

Toole Design, which has 17 offices in the United States and Canada, last week released a manifesto that seeks to substitute new concepts for the traditional “three Es” — education, enforcement, and engineering — that have guided transportation professionals as they have built the infrastructure of our towns and cities.

The new “three Es” that Toole proposes — “ethics, equity, and empathy”  — replace the object- and rule-centered approach that dominates the discipline with a moral one centered on people."



Saturday, September 21, 2019

'The Personification Of Human Decency': Nina Totenberg Remembers Cokie Roberts; NPR, September 17, 2019

Nina Totenberg, NPR; 'The Personification Of Human Decency': Nina Totenberg Remembers Cokie Roberts

"To know Cokie was to see the personification of human decency. There is a reason she was asked to speak at so many funerals. People felt such a deep connection to her because she touched their lives. Casual friends would find Cokie visiting them in the hospital. People in terrible financial straits would find her bailing them out, hiring them for work that perhaps she did not need, but work that left them with their dignity...

On a larger scale, she was always the voice of people with less power, and the voice of what is right. I remember one day many years ago, when we were in negotiations with NPR management over a labor contract. Management didn't want to extend health care coverage to one group, and we were at an impasse.

Then Cokie, who was working on a piece of embroidery, looked up at the management team and said, "You know, the position you are taking isn't immoral, it is simply amoral." The room got very quiet, and soon the impasse was over."

Banned Books Week Celebrates 'Freedom To Read'; WESA, Pittsburgh's NPR News Station, September 18, 2019

WESA, Pittsburgh's NPR News Station; Banned Books Week Celebrates 'Freedom To Read'

"Christy Bostardi is a member of Book Fairies of Pennsylvania, which hides books around the city for strangers to find and read. She says banning books also has educational implications. 

"We read books for personal reasons... to explore the outside world and learn." Removing books from shelves, she says, prohibits us from being an informed society, but it can have an inverse effect, especially for children, who get excited by the idea of reading a banned book.

The ACLU is also partnering with CMU and the Carnegie Library system to celebrate the "freedom to read" at the Carnegie Lecture Hall in Oakland on Tuesday."

Public libraries across the country face a different type of censorship; Tennessean, September 20, 2019

Kent Oliver, Tennessean; Public libraries across the country face a different type of censorship

"Book censorship impedes access to literature and information. For a public library such as Nashville Public Library, unfettered, undiscriminating access to reading is the core of our work – it is in our library DNA. Access is the key word. Librarians work to ensure access to ideas, popular and unpopular, within financial constraints.

The disturbing issue confronting us for this year’s Banned Books Week, Sep. 22-28, is the restrictions publishers are placing on public libraries making it more difficult to buy e-books and e-audiobooks. In particular, libraries are concerned about a new e-book embargo from Macmillan, one of the biggest book publishers in the industry, set to launch Nov. 1.

Under this new policy, libraries will be limited to purchasing (closer to leasing, really) one copy of a new e-book for eight weeks after release, when demand is at its peak."
 

Americans’ perceptions about unethical behavior shape how they think about people in powerful roles; Pew Research Center, September 19, 2019

Claire Gecewicz and Lee Rainie, Pew Research Center; Americans’ perceptions about unethical behavior shape how they think about people in powerful roles

"Americans have mixed views of the job performance of those who hold positions of power and responsibility in eight major U.S. groups and institutions. A key element in shaping these views is their sense of whether members of these groups act ethically and hold themselves accountable for their mistakes, according to a new Pew Research Center survey.

The groups studied included members of Congress, local elected officials, leaders of technology companies, journalists, religious leaders, police officers, principals at public K-12 schools and military leaders."

Friday, September 20, 2019

People Are Looking At Your LinkedIn Profile. They Might Be Chinese Spies; NPR, September 19, 2019

Ryan Lucas, NPR; People Are Looking At Your LinkedIn Profile. They Might Be Chinese Spies

"Demers took over leadership of the National Security Division in February 2018 after being confirmed by the Senate. Since taking the helm, he has spent a considerable amount of time on China and what he calls its prolific espionage efforts against the United States.

They're vast in scale, he said, and they span the spectrum from traditional espionage targeting government secrets to economic espionage going after intellectual property and American trade secrets...

It's a play that has also been used to target folks in the business world and academia, where China is hungry for cutting-edge technology and trade secrets. For years, the Chinese intelligence services have hacked into U.S. companies and made off with intellectual property.

Now, U.S. officials say China's spies are increasingly turning to what is known as "nontraditional collectors" — students, researchers and business insiders — to scoop up secrets."

Thursday, September 19, 2019

Time to Build a National Data Broker Registry; The New York Times, September 13, 2019

The New York Times; Time to Build a National Data Broker Registry

A federal clearing house would help separate the good and bad actors who deal in data.

"It’s time for a national data privacy law, one that gives consumers meaningful rights — to know who has their data, how it is used and how to opt out. It’s in our country’s best interest to have a national standard that, done thoughtfully, benefits both consumers and businesses by providing transparency, uniformity and certainty without deterring innovation and competition...

There is much work to do to ensure the ethical use of information in our economy. With a concerted effort to engage in mutual understanding, we can address consumer privacy on the one hand, while supporting the inventive, valuable and responsible uses of data on the other.

Jordan Abbott is the chief data ethics officer for Acxiom."

AI - ethics within data protections legal framework; Lexology, September 4, 2019


"Privacy and ethics: hand in hand

AI relies on huge volumes of data and often that data contains personal data. But processing such large volumes of data could be at odds with the legal framework contained in the General Data Protection Regulation and the Data Protection Act 2018, which is based on ethical principles. As Elizabeth Denham, the Information Commissioner, stated: "Ethics is at the root of privacy and is the future of data protection".

Recent headlines draw out some of the main ethical and data protection challenges raised by AI. Interesting examples include the babysitter vetting app, Predictim, designed to create a risk rating based on a potential babysitter's social media and web presence (without his or her knowledge). China's social credit scoring system is another example of how significant decisions are being made automatically about people without any human input and without transparency, explanation or recourse."

UVA receives approval to form School of Data Science with $120M gift; WDBJ7, September 19, 2019

WDBJ7; UVA receives approval to form School of Data Science with $120M gift

"Philip E. Bourne, Professor and Data Science Institute Director, was appointed dean of the School of Data Science by Provost Magill immediately after the vote conferred official status upon the school...

“We envision the new School of Data Science at UVA as a ‘school without walls,’” Bourne said. “In its very structure and foundation, we will build collaborative and interdisciplinary opportunities through partnerships and physical spaces for shared research and education programs. The new school will combine a focus on research, education, and service to build bridges across academic, geographic, commercial and cultural boundaries using responsible, open data science.”

The school also will focus on ethics and the practice of responsible data science, building upon the Data Science Institute’s existing Center for Data Ethics and Justice.

“Data science offers incredible, revolutionary opportunities to understand and make an impact on our world and our future,” Bourne said. “Now it is more important than ever that everyone using those skills and tools – from students just beginning to learn statistics and programming, to leaders working at the cutting edge of the field – understands the importance of using data ethically and responsibly, and putting their skills to work to make a positive impact on society and our world.”"

Wednesday, September 18, 2019

University Launches Ethics-Forward Data Science Major; Washington Square News, NYU's Independent Student Newspaper, September 16, 2019

Akiva Thalheim, Washington Square News, NYU's Independent Student Newspaper; University Launches Ethics-Forward Data Science Major

"The new major seeks to specifically address and prevent these issues through a required course in the ethics of data science, [Center for Data Science Director Julia] Kempe explained. She added that the course was developed with the assistance of a National Science Foundation grant.

“We are hoping to educate young people to be data savvy and also data critical, because nowadays, everything is about data but often it’s done in a very uncritical way,” Kempe said. “We have to understand where the biases are [and] how to use data ethically — it’s something that we want to impart on every student, if we can.""

Tuesday, September 17, 2019

There’s a reason we don’t know much about AI; Politico, September 16, 2019

Arthur Allen, Politico; There’s a reason we don’t know much about AI

The U.S. government used to study new technologies to flag ethical or social problems. Not anymore. 

"Though many university professors and tech-funded think tanks are examining the ethical, social and legal implications of technologies like Big Data and machine learning, “it’s definitely happening outside the policy infrastructure,” said John Wilbanks, a philosopher and technologist at the Sage Bionetworks research group.

Yet these technologies could have profound effects on our future, and they pose enormous questions for society...

As it happens, there is a good precedent for the federal government stepping up to examine the ethical and legal issues around an important new technology. Starting in 1990, the National Institutes of Health set aside 5 percent of the funding for its Human Genome Project for a program known as ELSI—which stood for the ethical, legal and social implications of genetics research.

The ELSI program, which started 30 years ago, “was a symbol that NIH thought the ethical issues were so important in genomics that they’d spend a lot of money on them,” says Isaac Kohane, chief of the Harvard Medical School’s Department of Biomedical Informatics. “It gave other genetics researchers a heads-up—police your ethics, we care about them.”

ELSI’s premise was to have smart social scientists weigh the pros and cons of genetic technology before they emerged, instead of, “Oops, we let the genie out of the bottle,” said Larry Brody, director of Genomics and Society program at the National Human Genome Research Institute."

Real-Time Surveillance Will Test the British Tolerance for Cameras; The New York Times, September 15, 2019

The New York Times; Real-Time Surveillance Will Test the British Tolerance for Cameras

Facial recognition technology is drawing scrutiny in a country more accustomed to surveillance than any other Western democracy. 

"“Technology is driving forward, and legislation and regulation follows ever so slowly behind,” said Tony Porter, Britain’s surveillance camera commissioner, who oversees compliance with the country’s surveillance camera code of practice. “It would be wrong for me to suggest the balance is right.”

Britain’s experience mirrors debates about the technology in the United States and elsewhere in Europe. Critics say the technology is an intrusion of privacy, akin to constant identification checks of an unsuspecting public, and has questionable accuracy, particularly at identifying people who aren’t white men."

Steal This Book? There’s a Price; The New York Times, September 15, 2019

The New York Times; Steal This Book? There’s a Price

I have about 400 offers to buy illegal copies of my own work. Something is very wrong. 

"Maybe, though, it’s too narrow to focus on the way our society has discounted its authors. No doubt musicians, and local retailers, and hometown newspapers, and schoolteachers, and factory workers all feel discounted in much the same way. We have surrendered our lives to technocrat billionaires who once upon a time set out to do no harm and have instead ended up destroying the world as we knew it. Convenience and the lowest possible price, or no price at all, have become our defining values. We have severed ourselves from our communities and from the mutual give-and-take that was once our ordinary daily life. Now we sit alone in our rooms, restlessly scrolling for something free to read."

AI and ethics: The debate that needs to be had; ZDNet, September 16, 2019

ZDNet; AI and ethics: The debate that needs to be had

Like anything, frameworks and boundaries need to be set -- and artificial intelligence should be no different.

"Building ethical AI with diversity

Part of the solution to help overcome these systemic biases that are built into existing AI systems, according to Lazar, is to have open conversations about ethics -- with input from diverse views in terms of culture, gender, age, and socio-economic background -- and how it could be applied to AI.

"What we need to do is figure out how to develop systems that incorporate democratic values and we need to start the discussion within Australian society about what we want those values to be," he said."

TikTok’s Beijing roots fuel censorship suspicion as it builds a huge U.S. audience; The Washington Post, September 15, 2019

Drew Harwell and Tony Romm, The Washington Post; TikTok’s Beijing roots fuel censorship suspicion as it builds a huge U.S. audience

"TikTok’s surging popularity spotlights the tension between the Web’s global powers: the United States, where free speech and competing ideologies are held as (sometimes messy) societal bedrocks, and China, where political criticism is forbidden as troublemaking."

Artificial intelligence in medicine raises legal and ethical concerns; The Conversation, September 4, 2019

The Conversation; Artificial intelligence in medicine raises legal and ethical concerns

"The use of artificial intelligence in medicine is generating great excitement and hope for treatment advances.

AI generally refers to computers’ ability to mimic human intelligence and to learn. For example, by using machine learning, scientists are working to develop algorithms that will help them make decisions about cancer treatment. They hope that computers will be able to analyze radiological images and discern which cancerous tumors will respond well to chemotherapy and which will not.

But AI in medicine also raises significant legal and ethical challenges. Several of these are concerns about privacy, discrimination, psychological harm and the physician-patient relationship. In a forthcoming article, I argue that policymakers should establish a number of safeguards around AI, much as they did when genetic testing became commonplace."

Monday, September 16, 2019

Maths and tech specialists need Hippocratic oath, says academic; The Guardian, August 16, 2019

Ian Sample, The Guardian; Maths and tech specialists need Hippocratic oath, says academic

"“We need a Hippocratic oath in the same way it exists for medicine,” Fry said. “In medicine, you learn about ethics from day one. In mathematics, it’s a bolt-on at best. It has to be there from day one and at the forefront of your mind in every step you take.”...

The genetics testing firm 23andMe was a case in point, she said.

“We literally hand over our most private data, our DNA, but we’re not just consenting for ourselves, we are consenting for our children, and our children’s children. Maybe we don’t live in a world where people are genetically discriminated against now, but who’s to say in 100 years that we won’t? And we are paying to add our DNA to that dataset.”"

Sunday, September 15, 2019

Northeastern researchers team up with Accenture to offer a road map for artificial intelligence ethics oversight; Northeastern University, August 29, 2019

Khalida Sarwari, Northeastern University; Northeastern researchers team up with Accenture to offer a road map for artificial intelligence ethics oversight

"Now, Northeastern professors John Basl and Ron Sandler are offering organizations guidance for how to create a well-designed and effective committee based on similar models used in biomedical research. 

Maintaining that an ethics committee that is adequately resourced and thoughtfully designed can play an important role in mitigating digital risks and maintaining trust between an organization and the public, the researchers provide a framework for such a system in a new report produced in collaboration with global professional services company Accenture...

“If you want to build a committee that works effectively and if you really want to build ethical capacity within an organization, it’s a significant undertaking where you can’t just throw together a few people with ethical expertise,” says Sandler.

Added Basl: “We lay out the kinds of experts an organization will need—someone who knows local laws, someone who knows ethics, a variety of technical experts, and members of an affected community. Who those individuals are, or what their particular expertise is, depends on the kind of technology being developed and deployed.”"

Building data and AI ethics committees; Accenture.com, August 20, 2019

Accenture.com; Building data and AI ethics committees

"In brief
  • Organizations face a difficult challenge when it comes to ethically-informed data collection, sharing and use.
  • There is growing demand for incorporating ethical considerations into products and services involving big data, AI and machine learning.
  • Outside of mere legal compliance, there is little guidance on how to incorporate this ethical consideration.
  • To fill this gap, Northeastern University and Accenture explore the development of effective and well-functioning data and AI ethics committees."

Saturday, September 14, 2019

How to Build an AI Ethics Committee; The Wall Street Journal, August 30, 2019

Jared Council, The Wall Street Journal; How to Build an AI Ethics Committee

New road map provides guidelines for starting an ethics committee for artificial-intelligence and data concerns


"A new guidebook aims to help organizations set up data and artificial-intelligence ethics committees and better deal with the ethical issues associated with the technology."

CS department hires Ethics TAs; The Brown Daily Herald, September 5, 2019

Sarah Wang, The Brown Daily Herald; CS department hires Ethics TAs

Newly-hired teaching assistants to integrate ethics into computer science 

 

"Last spring, the Department of Computer Science announced the inaugural hiring of 10 Ethics Teaching Assistants, who will develop and deliver curricula around ethics and society in five of the department’s largest courses. 

The department created the ETA program to acknowledge the impact that the products and services created by computer scientists have on society, said Professor of Computer Science and Department Chair Ugur Cetintemel. Cetintemel, who helped spearhead the program, said that it was important for concentrators to think critically about the usage and possible misuse of the solutions they build. “We want our concentrators to think about the ethical and societal implications of what they do, not as an afterthought but as another fundamental dimension they should consider as they develop their work,” he said...

The CS department is already incorporating ethics into its curriculum through multiple courses such as CSCI 1951I: “CS for Social Change,” but the department hopes ETAs will encourage students to view the topic as a more fundamental aspect of CS."