Showing posts with label unconscious bias.

Wednesday, July 16, 2025

The D.E.I. Industry, Scorned by the White House, Turns to ‘Safer’ Topics; The New York Times, July 15, 2025

The New York Times; The D.E.I. Industry, Scorned by the White House, Turns to ‘Safer’ Topics

"When President Trump signed an executive order in January targeting diversity, equity and inclusion programs in federal agencies, schools and the private sector, Arin Reeves, who has been a D.E.I. consultant for 26 years, said many in her field were in a panic.

“All the federal government stuff, I was watching it, and I genuinely didn’t even know where to go with it,” Ms. Reeves said. For those in the industry, she added, there was a feeling of: “What do we do?”

The answer for many D.E.I. professionals has been to adapt to what companies feel comfortable offering: employee trainings that maintain the principles of diversity and inclusion but without necessarily calling them that. That has meant fewer sessions that focus explicitly on race, gender, sexuality and unconscious bias, and more on subjects like neurodivergence, mental health and generational differences, training that teaches how age affects viewpoints in the workplace."

Friday, February 16, 2018

Congress is worried about AI bias and diversity; Quartz, February 15, 2018

Dave Gershgorn, Quartz; Congress is worried about AI bias and diversity

"Recent research from the MIT Media Lab maintains that facial recognition is still significantly worse for people of color, however.
“This is not a small thing,” Isbell said of his experience. “It can be quite subtle, and you can go years and years and decades without even understanding you are injecting these kinds of biases, just in the questions that you’re asking, the data you’re given, and the problems you’re trying to solve.”
In his opening statement, Isbell talked about biased data in artificial intelligence systems today, including predictive policing and biased algorithms used in predicting recidivism rates.
“It does not take much imagination to see how being from a heavily policed area raises the chances of being arrested again, being convicted again, and in aggregate leads to even more policing of the same areas, creating a feedback loop,” he said. “One can imagine similar issues with determining it for a job, or credit-worthiness, or even face recognition and automated driving.”"
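The feedback loop Isbell describes can be made concrete with a toy simulation. The sketch below is our own illustration, not part of the testimony, and every number in it is made up: two areas have the same true crime rate, one starts with slightly more recorded incidents, and a naive "predictive" dispatcher sends each patrol to whichever area the records make look worse. Because crime is only recorded where patrols go, the initial disparity compounds.

# A minimal sketch (our illustration, not from the testimony) of the
# feedback loop Isbell describes. Two areas have the SAME true crime
# rate, but area 0 starts with slightly more recorded incidents.
import random

random.seed(1)

TRUE_RATE = 0.3                # identical underlying crime rate in both areas
recorded = [12, 10]            # area 0 begins with a few more records

for day in range(1, 1001):
    # Dispatch to the apparent "hotspot": the area with more records.
    target = 0 if recorded[0] >= recorded[1] else 1
    # Only the patrolled area can have a crime observed and logged.
    if random.random() < TRUE_RATE:
        recorded[target] += 1
    if day % 250 == 0:
        print(f"day {day:4}: recorded incidents = {recorded}")

# Despite equal true rates, every new record accrues to area 0, so the
# data increasingly "confirm" that area 0 is the high-crime area.

The design choice that drives the runaway behavior is exactly the one Isbell points to: the system learns only from the arrests it generates, so under-patrolled areas produce no corrective data and the loop never self-corrects.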