Thursday, November 14, 2019

AI and gene-editing pioneers to discuss ethics at Stanford; Stanford News, November 12, 2019

Ker Than, Stanford News; AI and gene-editing pioneers to discuss ethics at Stanford

"Upon meeting for the first time at a dinner at Stanford earlier this year, Fei-Fei Li and Jennifer Doudna couldn’t help but note the remarkable parallels in their experiences as scientists.

Both women helped kickstart twin revolutions that are profoundly reshaping society in the 21st century – Li in the field of artificial intelligence (AI) and Doudna in the life sciences. Both revolutions can be traced back to 2012, the year that computer scientists collectively recognized the power of Li’s approach to training computer vision algorithms and that Doudna drew attention to a new gene-editing tool known as CRISPR-Cas9 (“CRISPR” for short). Both pioneering scientists are also driven by a growing urgency to raise awareness about the ethical dangers of the technologies they helped create."

Why Businesses Should Adopt an AI Code of Ethics -- Now; InformationWeek, November 14, 2019

Gary Grossman, InformationWeek; Why Businesses Should Adopt an AI Code of Ethics -- Now

"Adherence to AI ethics breeds trust

According to Angel Gurria, Secretary-General of the Organization for Economic Co-Operation and Development (OECD): “To realize the full potential of [AI] technology, we need one critical ingredient. That critical ingredient is trust. And to build trust we need human-centered artificial intelligence that fosters sustainable development and inclusive human progress.” To achieve this, he adds that there must be an ethical dimension to AI use. This all underscores the urgency for companies to create and live by a responsible AI code of ethics to govern decisions about AI development and deployment.

The EU has developed principles for ethical AI, as has the IEEE, Google, Microsoft, Intel, Tencent and other countries and corporations. As these have appeared in only the last couple of years, AI ethics is very much an evolving field. There is an opportunity and critical need for businesses to lead by creating their own set of principles embodied in an AI code of ethics to govern their AI research and development to both further the technology while also helping to create a better tomorrow."

The Ethical Dilemma at the Heart of Big Tech Companies; Harvard Business Review, November 14, 2019


"The central challenge ethics owners are grappling with is negotiating between external pressures to respond to ethical crises at the same time that they must be responsive to the internal logics of their companies and the industry. On the one hand, external criticisms push them toward challenging core business practices and priorities. On the other hand, the logics of Silicon Valley, and of business more generally, create pressures to establish or restore predictable processes and outcomes that still serve the bottom line.

We identified three distinct logics that characterize this tension between internal and external pressures..."

I'm the Google whistleblower. The medical data of millions of Americans is at risk; The Guardian, November 14, 2019

Anonymous, The Guardian; I'm the Google whistleblower. The medical data of millions of Americans is at risk

"After a while I reached a point that I suspect is familiar to most whistleblowers, where what I was witnessing was too important for me to remain silent. Two simple questions kept hounding me: did patients know about the transfer of their data to the tech giant? Should they be informed and given a chance to opt in or out?

The answer to the first question quickly became apparent: no. The answer to the second I became increasingly convinced about: yes. Put the two together, and how could I say nothing?

So much is at stake. Data security is important in any field, but when that data relates to the personal details of an individual’s health, it is of the utmost importance as this is the last frontier of data privacy.

With a deal as sensitive as the transfer of the personal data of more than 50 million Americans to Google the oversight should be extensive. Every aspect needed to be pored over to ensure that it complied with federal rules controlling the confidential handling of protected health information under the 1996 HIPAA legislation."

Wednesday, November 13, 2019

GQ, Nat Geo and Cosmo are banned in Arizona prisons. A judge said the rules need to explain why.; The Washington Post, November 12, 2019

Lateshia Beachum, The Washington Post; GQ, Nat Geo and Cosmo are banned in Arizona prisons. A judge said the rules need to explain why.

"The Arizona Department of Corrections must define clear rules about what prisoners can read, according to a district judge...

The judge’s decision underscores the problem of censoring inmate reading material and the indeterminate manner in which jails and prisons prohibit or grant what incarcerated people can read, prisoner rights advocates say."

Senior Trump official embellished résumé, had face on fake Time cover; NBC News, November 12, 2019

Dan De Luce, Laura Strickler and Ari Sen, NBC News; Senior Trump official embellished résumé, had face on fake Time cover

"A senior Trump administration official has embellished her résumé with misleading claims about her professional background — even creating a fake Time magazine cover with her face on it — raising questions about her qualifications to hold a top position at the State Department. 

An NBC News investigation found that Mina Chang, the deputy assistant secretary in the State Department's Bureau of Conflict and Stability Operations, has inflated her educational achievements and exaggerated the scope of her nonprofit's work.

She was being considered for an even bigger government job, one with a budget of more than $1 billion, until Congress started asking questions about her résumé."

Engineers need a required course in ethics; Quartz, November 8, 2019

Kush Saxena, Quartz; Engineers need a required course in ethics

"The higher education sector cannot ignore its role in preparing students for the future of work as quite literally the purpose it serves. That includes integrating ethics into comprehensive computer science curricula.

Universities like MIT are leading the way by creating research collaborations across disciplines such as law and government, finding ways to embed topics around the societal impact of computing into the technical curriculum.

This type of rigorous education shouldn’t be accessible only to students who can get into elite universities. As more jobs require engineering skills, all institutions—from coding boot camps to community college courses to advanced state-funded PhD programs—need to follow suit."

Tuesday, November 12, 2019

Whodunit in the Library: Someone Keeps Hiding the Anti-Trump Books; The New York Times, November 10, 2019

The New York Times; Whodunit in the Library: Someone Keeps Hiding the Anti-Trump Books

"Ms. Ammon said she asked the caller to provide a list of books that should be in the stacks, and while the person failed to provide one, she suspects the library already has whatever might be on it.

“We serve the entire community,” Ms. Ammon said...

Through it all, Ms. Ammon said, the library has managed to maintain the diversity of its shelves. In the nonfiction stacks, a book by Al Franken, the former Democratic senator, sits right next to one by Newt Gingrich, the former Republican congressman.

“The Dewey decimal system is a great equalizer,” Ms. Ammon said."

Citrus County, Fla., leaders don’t like the New York Times. That doesn’t justify keeping it out of their libraries.; The Washington Post, November 9, 2019

Anthony Marx, The Washington Post; Citrus County, Fla., leaders don’t like the New York Times. That doesn’t justify keeping it out of their libraries.

"Libraries have always been at the foundation of our democracy, existing to arm the public with the information people need to make informed decisions and reach logical conclusions. Libraries are trusted sources of this information precisely because they do not judge, reject or accept the information based on politics or opinion. They provide it, and as technology has changed over the decades, they adapt to provide it the way patrons want it.

Despite all of this, the commissioners of Citrus County chose to reject the library system’s request. I am well aware of the financial difficulties public libraries face...

This is a misguided decision. Public libraries are not in place to further the political agenda of any party or position. Whether those holding the purse strings favor any particular journal or not should be irrelevant. At the New York Public Library, we offer hundreds of periodicals from all sides of the political aisle. I will tell you that I do not agree with the political slant of some of the papers we make available. And that is exactly how it should be.

Libraries give members of the public the tools they need to be fully informed participants in civic society."

Thursday, November 7, 2019

Hunters Point Library Confronts Accessibility Issues; Library Journal, November 4, 2019

Lisa Peet, Library Journal; Hunters Point Library Confronts Accessibility Issues

"QPL is assessing the situation with the Department of Design and Construction and Steven Holl Architects, de Bourbon said. (As of press time, Steven Holl Architects had not responded to LJ’s request for comment.) “As we move forward with new projects,” she said, “we will be even more proactive in addressing the needs and circumstances of every single customer.”

“I hope that libraries who are working on inclusiveness can see this as a cautionary tale,” said Machones. “There clearly needs to be more oversight in all stages of planning to ensure nothing like this happens again. There needs to be opportunities for staff and the community to analyze and respond to plans at every stage. If there are members of your community that are not able to participate in input sessions, then go to them and ask them for their input. Your library will better serve the community if your plans reflect everyone in it.”

Such inclusive input might be positioned as a mandate in all aspects of service for the library, Machones suggested. “I would have regular community conversations to learn about what ways the library could improve. I also would recommend the library undergo an inclusive services assessment,” such as the Inclusive Services Assessment and Guide developed for Wisconsin Public Libraries."

NNS Spotlight: Nonprofit uses data research to spur change in communities; Milwaukee Neighborhood News Service (NNS), November 6, 2019

Milwaukee Neighborhood News Service (NNS); NNS Spotlight: Nonprofit uses data research to spur change in communities

"Numbers can tell only part of a story.

They mean nothing without context.

And that’s where Data You Can Use steps in. The nonprofit works to provide useful local data so organizations can create change on a community level.

“In some of these neighborhoods, people have a fear of research because they’ve always been the subject, but they never see the results. That can be very damaging,” said Katie Pritchard, executive director and president of Data You Can Use. “If you’re only telling one part of the story, it doesn’t help anyone.”...

“We wanted to find a better way to measure the impact of what we do,” [Barb] Wesson [the outcomes manager] said. “One of the things Data You Can Use does really well that I don’t do at all is qualitative data analysis, and that’s what we needed.”"

Commissioners call New York Times ‘fake news,’ deny library funding for digital subscriptions; Tampa Bay Times, November 4, 2019

Tampa Bay Times; Commissioners call New York Times ‘fake news,’ deny library funding for digital subscriptions

"Sandy Price, the advisory board chairman for the county’s libraries, told the Citrus County Chronicle she was disappointed with the commissioners’ decision to block the funding. She also said she was concerned with the reason behind the blocking, specifically citing Carnahan’s comments about the New York Times being ‘fake news.’

“Someone’s personal political view does not have a place in deciding what library resources are available for the entire county,” Price told the Chronicle on Monday. “Libraries have to ensure all points of view are represented.”

Despite all five commissioners railing against the request during the commission meeting, feedback that he received in the days after caused Commissioner Brian Coleman to loosen his stance on the request.

Coleman originally said of the funding request: “I support President Trump. I would say they put stuff in there that’s not necessarily verified.”

Two days after the meeting, however, Coleman told The Chronicle he wanted to re-address the topic at a future meeting.

“Our decision should have been impartial, instead of having it become a personal thing,” Coleman told the Chronicle."

What if "Sesame Street" Were Open Access?; Electronic Frontier Foundation (EFF), October 25, 2019

Elliot Harmon, Electronic Frontier Foundation (EFF); What if "Sesame Street" Were Open Access?

"The news of iconic children’s television show “Sesame Street”’s new arrangement with the HBO MAX streaming service has sent ripples around the Internet. Starting this year, episodes of “Sesame Street” will debut on HBO and on the HBO MAX service, with new episodes being made available to PBS “at some point.” Parents Television Council’s Tim Winter recently told New York Times that “HBO is holding hostage underprivileged families” who can no longer afford to watch new “Sesame Street” episodes.

The move is particularly galling because the show is partially paid for with public funding. Let's imagine an alternative: what if “Sesame Street” were open access? What if the show’s funding had come with a requirement that it be made available to the public?"

Backcountry.com breaks its silence amid trademark lawsuit controversy to apologize and say “we made a mistake”; The Colorado Sun, November 6, 2019

Jason Blevins, The Colorado Sun; Backcountry.com breaks its silence amid trademark lawsuit controversy to apologize and say “we made a mistake”

"“To be fair, this is not about Marquette Backcountry Skis. It’s about the small nonprofits, it’s about the guides and the small businesses they targeted. This has all been about the lawsuits filed against the people in front of me and the ones coming for the people behind me,” [David] Ollila said. “What we’ve witnessed here is that it takes 25 years to build a business and a reputation and it can be lost very quickly with these poor decisions. I wonder how the market will react to this. I wonder if they can be forgiven.”...

“This boycott isn’t about a word,” [Jon Miller] said. “What is happening is that a corporation has a stranglehold over our culture in a battle over a word they literally don’t even own.”"

Trump administration sues drugmaker Gilead Sciences over patent on Truvada for HIV prevention; The Washington Post, November 7, 2019

Christopher Rowland, The Washington Post; Trump administration sues drugmaker Gilead Sciences over patent on Truvada for HIV prevention

"The Trump administration took the rare step Wednesday of filing a patent infringement lawsuit against pharmaceutical manufacturer Gilead Sciences over sales of Truvada for HIV prevention, a crucial therapy invented and patented by Centers for Disease Control researchers."

Wednesday, November 6, 2019

Rights group files federal complaint against AI-hiring firm HireVue, citing ‘unfair and deceptive’ practices; The Washington Post, November 6, 2019

Drew Harwell, The Washington Post; Rights group files federal complaint against AI-hiring firm HireVue, citing ‘unfair and deceptive’ practices

"The Electronic Privacy Information Center, known as EPIC, on Wednesday filed an official complaint calling on the FTC to investigate HireVue’s business practices, saying the company’s use of unproven artificial intelligence systems that scan people’s faces and voices constituted a wide-scale threat to American workers."

After food-delivery robots were benched, Pitt tests putting them back on the sidewalks; The Pittsburgh Post-Gazette, November 1, 2019

Bill Schackner, The Pittsburgh Post-Gazette; After food-delivery robots were benched, Pitt tests putting them back on the sidewalks

"Last week, the San Francisco-based company and dining services vendor Sodexo resumed testing. Earlier, the gizmos were pulled from the streets after a doctoral student using a wheelchair said one blocked her access to a sidewalk on Forbes Avenue.

Pitt spokesman Kevin Zwick said officials are still hoping to formally debut the robots this fall. A handful of other universities nationwide are using the Starship robots...

The 2-foot tall, six-wheeled devices that resemble rolling coolers are programmed to navigate the campus and can be operated remotely as they cross streets around campus. The plans have drawn a range of reactions from those who depend on takeout or find themselves navigating crowded streets with the bots."

For Vulnerable Populations, the Thorny Ethics of Genetic Data Collection; Undark, September 30, 2019

Adrian Pecotic, Undark; For Vulnerable Populations, the Thorny Ethics of Genetic Data Collection

To be equitable, genetics research needs more diverse samples. But collecting that data could present ethical issues.

"“When we do genetic studies, trying to understand the genetic basis of common and complex diseases, we’re getting a biased snapshot,” said Alicia Martin, a geneticist at the Massachusetts General Hospital and the Broad Institute, a biomedical and genomics research center affiliated with Harvard and MIT.

Research to capture these snapshots, called genome-wide association studies, can only draw conclusions about the data that’s been collected. Without studies that look at each underrepresented population, genetic tests and therapies can’t be tailored to everyone. Still, projects intended as correctives, like All of Us and the International HapMap Project, face an ethical conundrum: Collecting that data could exploit the very people the programs intend to help."

How Machine Learning Pushes Us to Define Fairness; Harvard Business Review, November 6, 2019

David Weinberger, Harvard Business Review; How Machine Learning Pushes Us to Define Fairness

"Even with the greatest of care, an ML system might find biased patterns so subtle and complex that they hide from the best-intentioned human attention. Hence the necessary current focus among computer scientists, policy makers, and anyone concerned with social justice on how to keep bias out of AI. 

Yet machine learning’s very nature may also be bringing us to think about fairness in new and productive ways. Our encounters with machine learning (ML) are beginning to give us concepts, a vocabulary, and tools that enable us to address questions of bias and fairness more directly and precisely than before."
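Weinberger's argument that ML pushes us toward precise, measurable definitions of fairness can be made concrete. The sketch below is an editorial illustration rather than anything from the article: it computes two widely used group-fairness measures, demographic parity difference and equal-opportunity difference, on invented toy data. The function names and data are assumptions for the example.

```python
# Editorial illustration only: two common group-fairness metrics.
# Assumes binary labels/predictions (0/1) and a binary protected attribute.

def demographic_parity_difference(y_pred, group):
    """Gap in positive-prediction rates between the two groups."""
    def rate(g):
        preds = [p for p, a in zip(y_pred, group) if a == g]
        return sum(preds) / max(1, len(preds))
    return abs(rate(0) - rate(1))

def equal_opportunity_difference(y_true, y_pred, group):
    """Gap in true-positive rates (recall) between the two groups."""
    def tpr(g):
        preds = [p for t, p, a in zip(y_true, y_pred, group) if a == g and t == 1]
        return sum(preds) / max(1, len(preds))
    return abs(tpr(0) - tpr(1))

if __name__ == "__main__":
    # Toy data, invented for the example.
    y_true = [1, 0, 1, 1, 0, 1, 0, 1]
    y_pred = [1, 0, 0, 1, 0, 1, 1, 1]
    group  = [0, 0, 0, 0, 1, 1, 1, 1]
    print("demographic parity difference:", demographic_parity_difference(y_pred, group))
    print("equal opportunity difference: ", equal_opportunity_difference(y_true, y_pred, group))
```

Deciding which of these numbers must be small, and how small, is precisely the kind of explicit fairness choice the article says machine learning now forces us to make.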

Elisa Celis and the fight for fairness in artificial intelligence; Yale News, November 6, 2019

Jim Shelton, Yale News; Elisa Celis and the fight for fairness in artificial intelligence

"What can you tell us about the new undergraduate course you’re teaching at Yale?

It’s called “Data Science Ethics.” I came in with an idea of what I wanted to do, but I also wanted to incorporate a lot of feedback from students. The first week was spent asking: “What is normative ethics? How do we even go about thinking in terms of ethical decisions in this context?” With that foundation, we began talking about different areas where ethical questions come out, throughout the entire data science pipeline. Everything from how you collect data to the algorithms themselves and how they end up encoding these biases, and how the results of biased algorithms directly affect people. The goal is to introduce students to all the things they should have in their mind when talking about ethics in the technical sphere.

The class doesn’t require coding or technical background, because that allows students from other departments to participate. We have students from anthropology, sociology, and economics, and other departments, which broadens the discussion. That’s very valuable when grappling with these inherently interdisciplinary problems."

A library wanted a New York Times subscription. Officials refused, citing Trump and ‘fake news.’; The Washington Post, November 5, 2019

Antonia Noori Farzan, The Washington Post; A library wanted a New York Times subscription. Officials refused, citing Trump and ‘fake news.’

"“Someone’s personal political view does not have a place in deciding what library resources are available for the entire county,” Sandy Price, the chairwoman for the library’s advisory board, told the Chronicle. “Libraries have to ensure all points of view are represented.”"

Monday, November 4, 2019

Scientists With Links to China May Be Stealing Biomedical Research, U.S. Says; The New York Times, November 4, 2019

The New York Times; Scientists With Links to China May Be Stealing Biomedical Research, U.S. Says

"The investigations have fanned fears that China is exploiting the relative openness of the American scientific system to engage in wholesale economic espionage. At the same time, the scale of the dragnet has sent a tremor through the ranks of biomedical researchers, some of whom say ethnic Chinese scientists are being unfairly targeted for scrutiny as Washington’s geopolitical competition with Beijing intensifies...

The alleged theft involves not military secrets, but scientific ideas, designs, devices, data and methods that may lead to profitable new treatments or diagnostic tools.

Some researchers under investigation have obtained patents in China on work funded by the United States government and owned by American institutions, the N.I.H. said. Others are suspected of setting up labs in China that secretly duplicated American research, according to government officials and university administrators...

The real question, [Dr. Michael Lauer, ] added, is how to preserve the open exchange of scientific ideas in the face of growing security concerns. At M.D. Anderson, administrators are tightening controls to make data less freely available."

An unseemly meeting at the US Supreme Court raises ethics questions; Quartz, November 2, 2019

Ephrat Livni, Quartz; An unseemly meeting at the US Supreme Court raises ethics questions

"“A case isn’t finished until the opinion is out,” Roth noted. So, any meeting between a justice and an advocate who has expressed positions on a matter is problematic because it undermines public trust in the judge’s ability to be fair. He calls these engagements failures of a “basic ethics test” and is concerned about how commonly these failures occur...

Roth believes that everyone, whatever their political party or ideological tendencies, should be concerned about these kinds of engagements by the justices. And he doesn’t think it’s too much to ask that members of the bench not interact with the people and institutions who’ve broadcast their views in amicus briefs while those cases are open, if only to maintain that precious appearance of neutrality."

Facebook and Twitter spread Trump’s lies, so we must break them up; The Guardian, November 3, 2019

Robert Reich, The Guardian; Facebook and Twitter spread Trump’s lies, so we must break them up 

"The reason 45% of Americans rely on Facebook for news and Trump’s tweets reach 66 million is because these platforms are near monopolies, dominating the information marketplace. No TV network, cable giant or newspaper even comes close. Fox News’ viewership rarely exceeds 3 million. The New York Times has 4.7 million subscribers.

Facebook and Twitter aren’t just participants in the information marketplace. They’re quickly becoming the information marketplace."

Amber Heard: Are We All Celebrities Now? Only a federal law can stop "revenge porn"; The New York Times, November 4, 2019

Amber Heard, The New York Times; Amber Heard: Are We All Celebrities Now? Only a federal law can stop "revenge porn"

"This is precisely why “revenge porn,” the term often used to describe this abuse, is the wrong name: It is focused on intent rather than consent. What matters is not why the perpetrator disclosed the images; it is that the victim did not consent to the disclosure.

That is why laws against nonconsensual pornography should look like laws against other privacy violations, like the laws that prohibit the unauthorized disclosure of a broad range of private information, such as medical records and Social Security numbers.

Because the patchwork of state laws fails to truly protect intimate privacy, it is vital that Congress pass legislation that does. And that is why in May, I spoke at the news conference for the introduction of the Stopping Harmful Image Exploitation and Limiting Distribution (SHIELD) Act, a bipartisan federal bill introduced by Representatives Jackie Speier of California and John Katko of New York.

Every person, from the most famous to the most obscure, from the privileged to the poor, deserves privacy."

Monday, October 28, 2019

A.I. Regulation Is Coming Soon. Here’s What the Future May Hold; Fortune, October 24, 2019

David Meyer, Fortune; A.I. Regulation Is Coming Soon. Here’s What the Future May Hold

"Last year Angela Merkel’s government tasked a new Data Ethics Commission with producing recommendations for rules around algorithms and A.I. The group’s report landed Wednesday, packed with ideas for guiding the development of this new technology in a way that protects people from exploitation.

History tells us that German ideas around data tend to make their way onto the international stage...

So, what do those recommendations look like? In a word: tough."

The biggest lie tech people tell themselves — and the rest of us; Vox, October 8, 2019

Vox; The biggest lie tech people tell themselves — and the rest of us

They see facial recognition, smart diapers, and surveillance devices as inevitable evolutions. They’re not.

"With great power comes great responsibility

Often consumers don’t have much power of selection at all. Those who run small businesses find it nearly impossible to walk away from Facebook, Instagram, Yelp, Etsy, even Amazon. Employers often mandate that their workers use certain apps or systems like Zoom, Slack, and Google Docs. “It is only the hyper-privileged who are now saying, ‘I’m not going to give my kids this,’ or, ‘I’m not on social media,’” says Rumman Chowdhury, a data scientist at Accenture. “You actually have to be so comfortable in your privilege that you can opt out of things.” 

And so we’re left with a tech world claiming to be driven by our desires when those decisions aren’t ones that most consumers feel good about. There’s a growing chasm between how everyday users feel about the technology around them and how companies decide what to make. And yet, these companies say they have our best interests in mind. We can’t go back, they say. We can’t stop the “natural evolution of technology.” But the “natural evolution of technology” was never a thing to begin with, and it’s time to question what “progress” actually means."

Sunday, October 27, 2019

Google Maps Just Introduced a Controversial New Feature That Drivers Will Probably Love but Police Will Utterly Hate; Inc., October 20, 2019

Bill Murphy Jr., Inc.; Google Maps Just Introduced a Controversial New Feature That Drivers Will Probably Love but Police Will Utterly Hate

"This week, however, Google announced the next best thing: Starting immediately, drivers will be able to report hazards, slowdowns, and speed traps right on Google Maps...

But one group that will likely not be happy is the police. In recent years, police have asked -- or even demanded -- that Waze drop the police-locating feature."

Thursday, October 24, 2019

‘Don’t leave campus’: Parents are now using tracking apps to watch their kids at college; The Washington Post, October 22, 2019

Abby Ohlheiser, The Washington Post; ‘Don’t leave campus’: Parents are now using tracking apps to watch their kids at college

"Many parents install tracking apps with good intentions, said Stacey Steinberg, a law professor at the University of Florida who has studied how technology impacts raising families and privacy. “We don’t want our kids to screw up,” she said. “We don’t want them to get hurt. Technology offers us new ways to feel like we are protecting them — both from others and from themselves.

“But kids need autonomy from their parents, especially when they reach adulthood,” Steinberg added. “If we want our kids to trust us, if we want our kids to believe they are capable of making wise decisions, then our actions need to show it. Valuing their privacy is one way to do so.”"

The Black-and-White World of Big Tech; The New York Times, October 24, 2019

The New York Times; The Black-and-White World of Big Tech

Mark Zuckerberg presented us with an either-or choice of free speech — either we have it or we’re China. Tech leaders have a duty to admit it’s much more complicated.

"Mr. Zuckerberg presented us with an either-or choice of free speech — either we have free speech or we’re China. “Whether you like Facebook or not, I think we need to come together and stand for voice and free expression,” he said with an isn’t-this-obvious tone.

But, as anyone who has lived in the real world knows, it’s much more complex. And that was the main problem with his speech — and it’s also what is at the crux of the myriad concerns we have with tech these days: Big Tech’s leaders frame the debate in binary terms. 

Mr. Zuckerberg missed an opportunity to recognize that there has been a hidden and high price of all the dazzling tech of the last decade. In offering us a binary view for considering the impact of their inventions, many digital leaders avoid thinking harder about the costs of technological progress."

Wednesday, October 23, 2019

Food delivery robots from Starship Technologies are coming to Pitt’s Oakland campus; Nextpittsburgh, September 3, 2019

"Stakeholders got their first look at the project last week when the Oakland Planning and Development Corporation (OPDC) held a public meeting where Starship gave a presentation on the project.

The university has confirmed to us that Starship’s service is due to launch later this fall, but the company declined to offer further specifics about the project to NEXTpittsburgh. According to the minutes of the meeting, they plan to begin a staged rollout in mid-September. The fleet will eventually have 25 autonomous rovers carting goods (presumably to hungry students) from campus food vendors such as Forbes Street Market...

The food delivery service poses obvious practical challenges for the flow of traffic and people throughout the bustling neighborhood. According to the minutes of the public meeting, several attendees expressed concerns over the potential for traffic and bicycle accidents.

“It’ll be interesting to see how they interface with people there in the public right of ways,” says Georgia Petropoulos, executive director of the Oakland Business Improvement District, which has no formal role in the project."

A face-scanning algorithm increasingly decides whether you deserve the job; The Washington Post, October 22, 2019

Drew Harwell, The Washington Post; A face-scanning algorithm increasingly decides whether you deserve the job 

HireVue claims it uses artificial intelligence to decide who’s best for a job. Outside experts call it ‘profoundly disturbing.’

"“It’s a profoundly disturbing development that we have proprietary technology that claims to differentiate between a productive worker and a worker who isn’t fit, based on their facial movements, their tone of voice, their mannerisms,” said Meredith Whittaker, a co-founder of the AI Now Institute, a research center in New York...

Loren Larsen, HireVue’s chief technology officer, said that such criticism is uninformed and that “most AI researchers have a limited understanding” of the psychology behind how workers think and behave...

“People are rejected all the time based on how they look, their shoes, how they tucked in their shirts and how ‘hot’ they are,” he told The Washington Post. “Algorithms eliminate most of that in a way that hasn’t been possible before.”...

HireVue’s growth, however, is running into some regulatory snags. In August, Illinois Gov. J.B. Pritzker (D) signed a first-in-the-nation law that will force employers to tell job applicants how their AI-hiring system works and get their consent before running them through the test. The measure, which HireVue said it supports, will take effect Jan. 1."

Trump housing plan would make bias by algorithm 'nearly impossible to fight'; The Guardian, October 23, 2019

Kari Paul, The Guardian; Trump housing plan would make bias by algorithm 'nearly impossible to fight'

"Under the Department of Housing and Urban Development’s (HUD) new rules, businesses would be shielded from liability when their algorithms are accused of bias through three different loopholes:
  • When the algorithm in question is vetted by a “neutral third party”.
  • When the algorithm itself was created by a third party.
  • If an algorithm used did not use race or a proxy for it in the computer model.

In the letter, groups in opposition to the change noted many pieces of data can be proxies for race – discriminating by a zip code, for example, can enable a racial bias. The rule would give “unprecedented deference” to mortgage lenders, landlords, banks, insurance companies, and others in the housing industry, the letter said."
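The proxy concern raised in the letter is easy to demonstrate. The following sketch is an editorial illustration with invented numbers, not a model of any actual lender or of the HUD rule: a decision rule that never sees race still yields a large gap in approval rates between groups, because it keys on zip code, which in the simulation correlates with race the way residential segregation makes it correlate in practice.

```python
import random

# Editorial illustration only: a "race-blind" rule that uses zip code can still
# produce a racial disparity when zip code correlates with race. All numbers
# are invented for the example.

random.seed(0)

applicants = []
for _ in range(10_000):
    group = random.choice(["A", "B"])  # protected attribute; the rule below never sees it
    # Simulated residential segregation: each group lands mostly in one zip code.
    in_favored_zip = (random.random() < 0.8) if group == "A" else (random.random() < 0.2)
    applicants.append({"group": group, "zip": "10001" if in_favored_zip else "10002"})

def approve(applicant):
    # Decision rule uses only zip code; race appears nowhere in the model.
    return applicant["zip"] == "10001"

def approval_rate(g):
    members = [a for a in applicants if a["group"] == g]
    return sum(approve(a) for a in members) / len(members)

print("approval rate, group A:", round(approval_rate("A"), 3))  # roughly 0.8
print("approval rate, group B:", round(approval_rate("B"), 3))  # roughly 0.2
```

The dispute then turns on what counts as a "proxy": a rule like this one never touches race directly, yet it reproduces the disparity through zip code, which is exactly the pattern the opposition groups describe.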