
Tuesday, September 17, 2024

How Elon Musk Destroyed Twitter; Fresh Air, NPR, September 11, 2024

Fresh Air, NPR; How Elon Musk Destroyed Twitter

"After buying Twitter in 2022, Elon Musk instituted sweeping changes. He laid off or fired about 75% of the staff –including about half the data scientists. He also ended rules banning hate speech and misinformation. Authors Kate Conger and Ryan Mac recount the takeover in Character Limit."

Thursday, May 16, 2024

How to Implement AI — Responsibly; Harvard Business Review (HBR), May 10, 2024

Harvard Business Review (HBR); How to Implement AI — Responsibly

"Regrettably, our research suggests that such proactive measures are the exception rather than the rule. While AI ethics is high on the agenda for many organizations, translating AI principles into practices and behaviors is proving easier said than done. However, with stiff financial penalties at stake for noncompliance, there’s little time to waste. What should leaders do to double-down on their responsible AI initiatives?

To find answers, we engaged with organizations across a variety of industries, each at a different stage of implementing responsible AI. While data engineers and data scientists typically take on most responsibility from conception to production of AI development lifecycles, nontechnical leaders can play a key role in ensuring the integration of responsible AI. We identified four key moves — translate, integrate, calibrate and proliferate — that leaders can make to ensure that responsible AI practices are fully integrated into broader operational standards."

Thursday, June 22, 2023

Data Ethics: 9 Codes of Conduct Every Data Scientist Should Follow; Make Use Of, June 8, 2023

JOSHUA ADEGOKE, Make Use Of; Data Ethics: 9 Codes of Conduct Every Data Scientist Should Follow

"You Must be Ethical as a Data Scientist 

As a data scientist, you receive a power that comes with proportional responsibility. Your skills are rare, so you sit at the forefront of organizational decision-making.

Your decisions affect everything from company business plans to criminal justice systems. So, you shouldn’t make them lightly. Always be honest, ethical, and meticulous in your work to protect people from existing ethical dilemmas across your industry and other tech fields."

Sunday, June 11, 2023

Teaching the ethics of data science through immersive video; Cornell Chronicle, June 6, 2023

Cornell Chronicle; Teaching the ethics of data science through immersive video

"In “Nobody’s Fault,” students experience what it’s like to be a data scientist dealing with a moral conflict. The video stops from time to time, asking viewers how they would handle the tricky situations being depicted. As they make decisions, the plot shifts, and they see the consequences unfold – and how they affect an unemployed woman who can’t get the facial recognition application to work.

After a series of unhappy outcomes, the scene rewinds, better choices are offered, and students see how things could have been different for the woman seeking her benefits.

“The video gave us real-world experience with ethical dilemmas,” said Britt Snider, M.I.L.R. ’24. “It enhanced our learning of the subject by showing us in real time the consequences of our decisions – and how something as seemingly innocuous as a few percentage points could cause such a large consequence to society overall.”"

Thursday, October 11, 2018

Do We Need To Teach Ethics And Empathy To Data Scientists?; Forbes, October 8, 2018

Kalev Leetaru, Forbes; Do We Need To Teach Ethics And Empathy To Data Scientists?

[Kip Currier: A thought-provoking and timely piece, especially as I'm presently writing a chapter on research ethics for my ethics textbook and was just reviewing and thinking about the history of informed consent and Institutional Review Boards-cum-Human-Research-Protection-Offices. Medical ethics lapses like those involving Henrietta Lacks and the Tuskegee Syphilis Study are potent reminders of the concomitant imperative for ethics oversight and informed consent vis-a-vis digital age research.]

"The growing shift away from ethics and empathy in the creation of our digital future is both profoundly frightening for the Orwellian world it is ushering in, but also a sad commentary on the academic world that trains the data scientists and programmers that are shifting the online world away from privacy. How might the web change if we taught ethics and empathy as primary components of computer science curriculums?

One of the most frightening aspects of the modern web is the speed at which it has struck down decades of legislation and professional norms regarding personal privacy and the ethics of turning ordinary citizens into laboratory rats to be experimented on against their wills. In the space of just two decades the online world has weaponized personalization and data brokering, stripped away the last vestiges of privacy, centralized control over the world’s information and communications channels, changed the public’s understanding of the right over their digital selves and profoundly reshaped how the scholarly world views research ethics, informed consent and the right to opt out of being turned into a digital guinea pig.

It is the latter which in many ways has driven each of the former changes. Academia’s changing views towards IRB and ethical review has produced a new generation of programmers and data scientists who view research ethics as merely an outdated obsolete historical relic that was an obnoxious barrier preventing them from doing as they pleased to an unsuspecting public."

Thursday, October 4, 2018

Data Science Institute prepares students for ethical decision-making; The Cavalier Daily (University of Virginia), October 4, 2018

Zoe Ziff, The Cavalier Daily (University of Virginia); Data Science Institute prepares students for ethical decision-making

"The University's Data Science Institute recently incorporated the new Center for Data Ethics and Justice — founded by the University’s Bioethics Chair Jarrett Zigon — in an effort to ramp up its focus on ethics in analysis and interpretation of data. This partnership has created a new course for graduate data science students that specifically addresses ethical issues related to the handling of data and advancement in technology. 

The DSI — located in Dell 1 and Dell 2 — is a research and academic institute that offers masters programs in data science as well as dual degrees in partnership with the Darden School of Business, the Medical School and the Nursing School. 

Phillip Bourne — director of the DSI and professor of biomedical engineering — regards ethics as a pillar of their graduate program. He said few data scientists have formal training in ethics, and the partnership with the Center will equip students with the tools to make ethical decisions throughout their careers. 

The Center brings a redefined course to the Master’s of Science in Data Science that is specifically designed for tackling ethical problems in the data science field."

Sunday, June 10, 2018

How data scientists are using AI for suicide prevention; Vox, June 9, 2018

Brian Resnick, Vox; How data scientists are using AI for suicide prevention

"At the Crisis Text Line, a text messaging-based crisis counseling hotline, these deluges have the potential to overwhelm the human staff.

So data scientists at Crisis Text Line are using machine learning, a type of artificial intelligence, to pull out the words and emojis that can signal a person at higher risk of suicide ideation or self-harm. The computer tells them who on hold needs to jump to the front of the line to be helped.

They can do this because Crisis Text Line does something radical for a crisis counseling service: It collects a massive amount of data on the 30 million texts it has exchanged with users. While Netflix and Amazon are collecting data on tastes and shopping habits, the Crisis Text Line is collecting data on despair.

The data, some of which is available here, has turned up all kinds of interesting insights on mental health."
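
The article doesn't detail Crisis Text Line's actual model, but the general technique it describes (train a text classifier on past conversations, score incoming messages, and move the highest-risk texters to the front of the queue) can be sketched briefly. Everything below, from the training messages to the feature choices, is invented for illustration:

```python
# Hypothetical sketch of classifier-based triage; this is NOT Crisis Text
# Line's actual model. Training data, labels, and features are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Tiny invented training set: (message, 1 = higher risk).
train_texts = [
    "i want to end it all",
    "feeling hopeless and alone 😢",
    "had a rough day at work",
    "can you help me find a food bank",
]
train_labels = [1, 1, 0, 0]

# Character n-grams pick up misspellings and emoji as well as whole words.
vectorizer = TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))
model = LogisticRegression().fit(vectorizer.fit_transform(train_texts), train_labels)

def triage(waiting_messages):
    """Return the waiting messages sorted by predicted risk, highest first."""
    scores = model.predict_proba(vectorizer.transform(waiting_messages))[:, 1]
    return sorted(zip(scores, waiting_messages), reverse=True)

for score, msg in triage(["nobody would miss me", "question about volunteering"]):
    print(f"{score:.2f}  {msg}")
```

A production system would be trained on millions of labeled conversations and validated clinically; the point here is only the shape of the pipeline, with the risk score deciding queue order rather than arrival time.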

Tuesday, January 10, 2017

Your private medical data is for sale – and it's driving a business worth billions; Guardian, 1/10/17

Sam Thielman, Guardian; Your private medical data is for sale – and it's driving a business worth billions:

"Your medical data is for sale – all of it. Adam Tanner, a fellow at Harvard’s institute for quantitative social science and author of a new book on the topic, Our Bodies, Our Data, said that patients generally don’t know that their most personal information – what diseases they test positive for, what surgeries they have had – is the stuff of multibillion-dollar business.

But although the data is nominally stripped of personally identifying information, data miners and brokers are working tirelessly to aggregate detailed dossiers on individual patients; the patients are merely called “24601” instead of “Jean Valjean”."
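
The aggregation described here exploits a well-documented weakness of naive de-identification: removing names leaves behind quasi-identifiers such as ZIP code, birth date, and sex, which can be joined against outside datasets. A toy sketch of that linkage technique, using entirely invented records:

```python
# Toy illustration of a linkage attack: "de-identified" records are often
# re-identifiable by joining on quasi-identifiers. All records are invented.
import pandas as pd

# Medical data with names removed, but quasi-identifiers intact.
medical = pd.DataFrame({
    "patient_id": ["24601", "24602"],
    "zip": ["15213", "15217"],
    "birth_date": ["1960-07-31", "1985-02-14"],
    "sex": ["F", "M"],
    "diagnosis": ["diabetes", "asthma"],
})

# Public data (e.g. a voter roll) containing names and the same fields.
public = pd.DataFrame({
    "name": ["Jane Doe", "John Roe"],
    "zip": ["15213", "15217"],
    "birth_date": ["1960-07-31", "1985-02-14"],
    "sex": ["F", "M"],
})

# Joining on the shared quasi-identifiers re-attaches names to diagnoses.
linked = medical.merge(public, on=["zip", "birth_date", "sex"])
print(linked[["patient_id", "name", "diagnosis"]])
```

Latanya Sweeney famously estimated that roughly 87% of Americans are uniquely identified by just those three fields, which is why stripping names alone offers so little protection.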

Friday, February 5, 2016

Big Data Ethics: racially biased training data versus machine learning; BoingBoing.net, 2/5/16

Cory Doctorow, BoingBoing.net; Big Data Ethics: racially biased training data versus machine learning:

"Writing in Slate, Cathy "Weapons of Math Destruction" O'Neil, a skeptical data scientist, describes the ways that Big Data intersects with ethical considerations.

O'Neil recounts an exercise to improve service to homeless families in New York City, in which data analysis was used to identify risk factors for long-term homelessness. The problem, O'Neil describes, was that many of the factors in the existing data on homelessness were entangled with things like race (and its proxies, like ZIP codes, which map extensively to race in heavily segregated cities like New York). Using data that reflects racism in the system to train a machine-learning algorithm whose conclusions can't be readily understood runs the risk of embedding that racism in a new set of policies, these ones scrubbed clean of the appearance of bias with the application of objective-seeming mathematics.

We talk a lot about algorithms in the context of Big Data but the algorithms themselves are well-understood and pretty universal -- they're the same ones that are used in mass surveillance and serving ads. But the training data is subject to the same problems experienced by all sciences when they try to get a good, random sampling to use in their analysis. Just like bad sampling can blow up a medical trial or a psych experiment, it can also confound big data. Rather than calling for algorithmic transparency, we need to call for data transparency, methodological transparency, and sampling transparency."
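
O'Neil's point about proxies is easy to reproduce on synthetic data: a model that is never shown the protected attribute can still inherit a historical disparity through a correlated feature like ZIP code. The sketch below generates such data (none of it real) and shows the gap resurfacing in the model's predictions:

```python
# Synthetic illustration of the proxy problem: a model trained without a
# race feature can still encode race through a correlated proxy such as
# ZIP code. All data here is generated, not real.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# In a heavily segregated city, ZIP code strongly tracks a protected group.
group = rng.integers(0, 2, size=n)                           # protected attribute
zip_code = np.where(rng.random(n) < 0.9, group, 1 - group)   # 90% aligned proxy

# Historical outcomes are biased: group 1 sees favorable outcomes far less often.
outcome = (rng.random(n) < np.where(group == 1, 0.3, 0.7)).astype(int)

# The model never sees `group`, only the ZIP proxy.
model = LogisticRegression().fit(zip_code.reshape(-1, 1), outcome)
rates = model.predict_proba(np.array([[0], [1]]))[:, 1]
print(f"predicted favorable-outcome rate by ZIP: {rates.round(2)}")
# The gap between the two ZIPs mirrors the historical gap between groups,
# even though race was never an input to the model.
```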

Wednesday, September 17, 2014

Data Scientists Want Big Data Ethics Standards; Information Week, 9/17/14

Jeff Bertolucci, Information Week; Data Scientists Want Big Data Ethics Standards:

"The vast majority of statisticians and data scientists believe that consumers should worry about privacy issues related to data being collected on them, and most have qualms about the questionable ethics behind Facebook's undisclosed psychological experiment on its users in 2012.

Those are just two of the findings from a Revolution Analytics survey of 144 data scientists at JSM (Joint Statistical Meetings) 2014, an annual gathering of statisticians, to gauge their thoughts on big data ethics. The Boston conference ran Aug. 2-7."