Showing posts with label employers. Show all posts

Friday, May 20, 2022

Federal officials caution employers on using AI in hiring; FCW, May 12, 2022

Natalie Alms, FCW; Federal officials caution employers on using AI in hiring

"The growing use of artificial intelligence and other software tools for hiring, performance monitoring and pay determination in the workplace is compounding discrimination against people with disabilities, federal civil rights officials say.

Artificial intelligence can be deployed to target job ads to certain potential applicants, hold online job interviews, assess the skills of job applicants and even decide if an applicant meets job requirements. But the technology can discriminate against applicants and employees with disabilities.

On Thursday, the Equal Employment Opportunity Commission and the Department of Justice put employers on alert that they're responsible for not using AI tools in ways that discriminate and for informing employees of their rights, agency officials told reporters."

Friday, February 11, 2022

Congress approves bill to end forced arbitration in sexual assault cases; NPR, February 10, 2022

Deirdre Walsh, NPR; Congress approves bill to end forced arbitration in sexual assault cases

"The Senate approved legislation banning the practice of using clauses in employment contracts that force victims of sexual assault and harassment to pursue their cases in forced arbitration, which shields accused perpetrators.

Sen. Kirsten Gillibrand, D-N.Y., and Sen. Lindsey Graham, R-S.C., introduced the bill five years ago, and lawmakers negotiated with business leaders to get support for the bill. In a sign of the overwhelming support for the measure, it was approved by voice vote in the chamber.

The bill, called the Ending Forced Arbitration of Sexual Assault and Sexual Harassment Act, was passed with a large bipartisan vote by the House of Representatives on Monday and heads to President Biden's desk for his signature.

The bill gives individuals a choice between going to court or going to arbitration to resolve allegations in cases related to sexual harassment or assault. The measure is also retroactive — invalidating any existing forced arbitration clauses in ongoing cases that could make it difficult for any survivors to litigate cases against their employers."

Friday, February 4, 2022

Where Automated Job Interviews Fall Short; Harvard Business Review (HBR), January 27, 2022

Dimitra Petrakaki, Rachel Starr, et al., Harvard Business Review (HBR); Where Automated Job Interviews Fall Short

"The use of artificial intelligence in HR processes is a new, and likely unstoppable, trend. In recruitment, up to 86% of employers use job interviews mediated by technology, a growing portion of which are automated video interviews (AVIs).

AVIs involve job candidates being interviewed by an artificial intelligence, which requires them to record themselves on an interview platform, answering questions under time pressure. The video is then submitted through the AI developer platform, which processes the data of the candidate — this can be visual (e.g. smiles), verbal (e.g. key words used), and/or vocal (e.g. the tone of voice). In some cases, the platform then passes a report with an interpretation of the job candidate’s performance to the employer.

The technologies used for these videos present issues in reliably capturing a candidate’s characteristics. There is also strong evidence that these technologies can contain bias that can exclude some categories of job-seekers. The Berkeley Haas Center for Equity, Gender, and Leadership reports that 44% of AI systems are embedded with gender bias, with about 26% displaying both gender and race bias. For example, facial recognition algorithms have a 35% higher detection error for recognizing the gender of women of color, compared to men with lighter skin.

But as developers work to remove biases and increase reliability, we still know very little about how AVIs (or other types of interviews involving artificial intelligence) are experienced by different categories of job candidates themselves, and how these experiences affect them. This is where our research focused. Without this knowledge, employers and managers can’t fully understand the impact these technologies are having on their talent pool or on different groups of workers (e.g., age, ethnicity, and social background). As a result, organizations are ill-equipped to discern whether the platforms they turn to are truly helping them hire candidates that align with their goals. We seek to explore whether employers are alienating promising candidates — and potentially entire categories of job seekers by default — because of varying experiences of the technology."

Saturday, June 13, 2020

What Do I Do if My Employer Does Something I Can’t Abide?; The New York Times, June 12, 2020

The New York Times; What Do I Do if My Employer Does Something I Can’t Abide?

You have to calibrate the difference between dumb and unacceptable, what you can live with and what you cannot.

"You have to pick your battles. You have to calibrate the difference between stupid and unacceptable, what you can live with and what you cannot. Because you work for a newspaper that will always publish a range of content, some of which you agree with and some of which you do not, you also have to calibrate the difference between disagreement and disgust.

That’s the tidy answer that doesn’t really force you to make the difficult decision. But now, more than ever, with so much at stake, we have to be willing to make difficult decisions. We have to be willing to make ourselves uncomfortable in service of what’s right. When the Minneapolis police officer Derek Chauvin kept his knee on George Floyd’s neck for nearly nine minutes, three of his co-workers stood by and did nothing. When a police officer in Buffalo shoved a 75-year-old man to the ground, dozens of his co-workers walked past that fallen man, bleeding from his ear. They did nothing.

Most situations in which you object to your employer’s conduct won’t be so extreme. But something terrible happened in this country, something that has happened with horrifying frequency. Each time we think maybe this time, something will change. For a few days or even a few weeks, change seems possible — and then we all get comfortable again. We forget about whatever terrible thing once held our attention. A new terrible thing happens. We get outraged. It’s a vicious cycle, but it is one we can break. When your employer does something that violates your ethical code, when it does something that endangers employees or the greater community, you have to ask yourself if you are going to do nothing — or get angry, vent and hold your employer accountable in whatever ways you can. I am, perhaps, simplifying the choices you can make, but maybe doing the right thing is far simpler than we allow ourselves to believe."

Tuesday, January 14, 2020

‘The Algorithm Made Me Do It’: Artificial Intelligence Ethics Is Still On Shaky Ground; Forbes, December 22, 2019

Joe McKendrick, Forbes; ‘The Algorithm Made Me Do It’: Artificial Intelligence Ethics Is Still On Shaky Ground

"While artificial intelligence is the trend du jour across enterprises of all types, there’s still scant attention being paid to its ethical ramifications. Perhaps it’s time for people to step up and ask the hard questions. For enterprises, it’s time to bring together — or recruit — people who can ask the hard questions.

In one recent survey by Genesys, 54% of employers questioned say they are not troubled that AI could be used unethically by their companies as a whole or by individual employees. “Employees appear more relaxed than their bosses, with only 17% expressing concern about their companies,” the survey’s authors add...

Sandler and his co-authors focus on the importance of their final point, urging that organizations establish an AI ethics committee, comprised of stakeholders from across the enterprise — technical, legal, ethical, and organizational. This is still unexplored territory, they caution: “There are not yet data and AI ethics committees with established records of being effective and well-functioning, so there are no success models to serve as case-studies or best practices for how to design and implement them.”"

Monday, October 28, 2019

The biggest lie tech people tell themselves — and the rest of us; Vox, October 8, 2019

Vox; The biggest lie tech people tell themselves — and the rest of us

They see facial recognition, smart diapers, and surveillance devices as inevitable evolutions. They’re not.

"With great power comes great responsibility

Often consumers don’t have much power of selection at all. Those who run small businesses find it nearly impossible to walk away from Facebook, Instagram, Yelp, Etsy, even Amazon. Employers often mandate that their workers use certain apps or systems like Zoom, Slack, and Google Docs. “It is only the hyper-privileged who are now saying, ‘I’m not going to give my kids this,’ or, ‘I’m not on social media,’” says Rumman Chowdhury, a data scientist at Accenture. “You actually have to be so comfortable in your privilege that you can opt out of things.” 

And so we’re left with a tech world claiming to be driven by our desires when those decisions aren’t ones that most consumers feel good about. There’s a growing chasm between how everyday users feel about the technology around them and how companies decide what to make. And yet, these companies say they have our best interests in mind. We can’t go back, they say. We can’t stop the “natural evolution of technology.” But the “natural evolution of technology” was never a thing to begin with, and it’s time to question what “progress” actually means."

Thursday, April 25, 2019

The Legal and Ethical Implications of Using AI in Hiring; Harvard Business Review, April 25, 2019

Ben Dattner, Tomas Chamorro-Premuzic, Richard Buchband, and Lucinda Schettler, Harvard Business Review; The Legal and Ethical Implications of Using AI in Hiring

"Using AI, big data, social media, and machine learning, employers will have ever-greater access to candidates’ private lives, private attributes, and private challenges and states of mind. There are no easy answers to many of the new questions about privacy we have raised here, but we believe that they are all worthy of public discussion and debate."

Sunday, February 17, 2019

With fitness trackers in the workplace, bosses can monitor your every step — and possibly more; The Washington Post, February 16, 2019

Christopher Rowland, The Washington Post; With fitness trackers in the workplace, bosses can monitor your every step — and possibly more

[Kip Currier: This article--and case study about the upshots and downsides of employers' use of personal health data harvested from their employees' wearable devices--is a veritable "ripped from the headlines" gift from the Gods for an Information Ethics professor's discussion question for students this week!...
What are the ethics issues?
Who are the stakeholders?
What ethical theory/theories would you apply/not apply in your analysis and decision-making?
What are the risks and benefits presented by the issues and the technology?
What are the potential positive and negative consequences?
What are the relevant laws and gaps in law?
Would you decide to participate in a health data program, like the one examined in the article? Why or why not?

And for all of us...spread the word that HIPAA does NOT cover personal health information that employees VOLUNTARILY give to employers. It's ultimately up to you to decide what to do, but we all need to be aware of the pertinent facts, so we can make the most informed decisions.
See the full article and the excerpt below...]


"Many consumers are under the mistaken belief that all health data they share is required by law to be kept private under a federal law called HIPAA, the Health Insurance Portability and Accountability Act. The law prohibits doctors, hospitals and insurance companies from disclosing personal health information.

But if an employee voluntarily gives health data to an employer or a company such as Fitbit or Apple — entities that are not covered by HIPPA’s [sic] rules — those restrictions on disclosure don’t apply, said Joe Jerome, a policy lawyer at the Center for Democracy & Technology, a nonprofit in Washington. The center is urging federal policymakers to tighten up the rules.

“There’s gaps everywhere,” Jerome said.

Real-time information from wearable devices is crunched together with information about past doctors’ visits and hospitalizations to get a health snapshot of employees...

Some companies also add information from outside the health system — social predictors of health such as credit scores and whether someone lives alone — to come up with individual risk forecasts."

Thursday, January 10, 2019

Pennsylvania High Court Decision Regarding Data Breach Increases Litigation Risk for Companies Storing Personal Data; Lexology, January 8, 2019

Ropes & Gray LLP, Lexology; Pennsylvania High Court Decision Regarding Data Breach Increases Litigation Risk for Companies Storing Personal Data

"This decision could precipitate increased data breach class action litigation against companies that retain personal data. No state Supreme Court had previously recognized the existence of a negligence-based duty to safeguard personal information, other than in the narrow context of health care patient information."

Wednesday, November 28, 2018

Pennsylvania High Court Finds Duty to Safeguard Employee Information; Lexology, November 26, 2018

Patterson Belknap Webb & Tyler LLP, Lexology; Pennsylvania High Court Finds Duty to Safeguard Employee Information

"The Pennsylvania Supreme Court handed the state’s employees a major legal victory last week when it decided that employers have an affirmative legal responsibility to protect the confidential information of their employees...

In reversing two lower courts, the justices ruled that, by collecting and storing employees’ personal information as a pre-condition to employment, employers had the legal duty to take reasonable steps to protect that information from a cyber-attack...

The ruling revives a proposed class action lawsuit against the University of Pittsburgh Medical Center and one of its hospitals, UPMC McKeesport, after a 2014 data breach in which hackers allegedly stole the personal information of 62,000 former and current employees...

Whether the ruling is viewed narrowly as confined to its facts, or more broadly as establishing a general legal duty to safeguard confidential information, there is little question that the decision marks an important development in tort law governing data breach cases...

The case is Dittman et al. v. UPMC, Case No. 43 WAP 2017."

Sunday, March 12, 2017

Employees who decline genetic testing could face penalties under proposed bill; Washington Post, March 11, 2017

Lena H. Sun, Washington Post; Employees who decline genetic testing could face penalties under proposed bill

"Employers could impose hefty penalties on employees who decline to participate in genetic testing as part of workplace wellness programs if a bill approved by a U.S. House committee this week becomes law.

In general, employers don't have that power under existing federal laws, which protect genetic privacy and nondiscrimination. But a bill passed Wednesday by the House Committee on Education and the Workforce would allow employers to get around those obstacles if the information is collected as part of a workplace wellness program."

Tuesday, January 22, 2013

Daily Report: Even if It Outrages the Boss, Social Net Speech Is Protected; New York Times, January 22, 2013

New York Times; Daily Report: Even if It Outrages the Boss, Social Net Speech Is Protected

"Employers often seek to discourage comments that paint them in a negative light. Don’t discuss company matters publicly, a typical social media policy will say, and don’t disparage managers, co-workers or the company itself. Violations can be a firing offense. But in a series of recent rulings and advisories, labor regulators have declared many such blanket restrictions illegal. The National Labor Relations Board says workers have a right to discuss work conditions freely and without fear of retribution, whether the discussion takes place at the office or on Facebook."