Showing posts with label privacy. Show all posts

Friday, May 1, 2020

San Francisco recruits army of social workers, librarians and investigators to track Covid-19; The Guardian, May 1, 2020

The Guardian; San Francisco recruits army of social workers, librarians and investigators to track Covid-19

"San Francisco has assembled an army of librarians, social workers, attorneys, investigators and medical students to find and warn anyone and everyone who may have been exposed to Covid-19...

Immigrant communities are justifiably worried that each time they share information about their status and location, “it will come back to haunt them,” Hayes-Bautista said. “It makes sense that people are scared.”...

San Francisco has similarly publicized that the contact tracing is “voluntary, confidential, and culturally and linguistically appropriate. Immigration status will have no bearing on these conversations.”"

Thursday, April 30, 2020

Ethical practice in isolation, quarantine & contact tracing; American Medical Association (AMA), April 28, 2020

American Medical Association (AMA); Ethical practice in isolation, quarantine & contact tracing

"Contact tracing and isolation or quarantine of sick or exposed individuals are among the most effective tools to reduce transmission of infectious disease. Yet like many public health activities it raises concerns about appropriately balancing individual rights, notably privacy and confidentiality, with protecting the health of the community. The AMA Code of Medical Ethics provides guidance to help physicians strike this balance when they act in a public health capacity. 

Opinion 8.11, “Health promotion and preventive care,” provides that “physicians who work solely or primarily in a public health capacity should uphold accepted standards of medical professionalism by implementing policies that appropriately balance individual liberties with the social goals of public health policies.” That includes notifying public health authorities when physicians “notice patterns in patient health that may indicate a health risk for others.”

In keeping with Opinion 8.4, “Ethical use of quarantine & isolation,” physicians should also educate patients and the public about public health threats, potential harm to others and the benefits of quarantine and isolation, and should encourage voluntary adherence. Physicians should support mandatory measures when patients fail to adhere voluntarily."

Do I sound sick to you? Researchers are building AI that would diagnose COVID-19 by listening to people talk.; Business Insider, April 30, 2020

Business Insider; Do I sound sick to you? Researchers are building AI that would diagnose COVID-19 by listening to people talk.

"Analyzing people's speech, coughing, and breathing patterns as a diagnostic tool isn't new — tussiphonography, or the study of cough sounds, has been around for decades. Now, AI researchers are emboldened by early reports from doctors that COVID-19 appears to have unique effects on patients' coughing and speech...

To give people an incentive to donate voice audio, Singh's lab initially published a rough AI tool online that would predict whether people have a higher chance of being COVID-19 positive using voice samples, along with a disclaimer that the tool wasn't giving real medical advice. But within 48 hours, Carnegie Mellon forced the lab to take down the online test, which could have run afoul of FDA guidelines and be misinterpreted by people regardless of the disclaimer.

"It's a perfectly valid concern, and my whole team had not thought of that ethical side of things," Singh said. "The other side is that hopefully the COVID pandemic will pass, and once it passes, hopefully it will never come back. So if we don't get the data now, we're never going to have data for research.""

Tuesday, April 28, 2020

Silicon Valley needs a new approach to studying ethics now more than ever; TechCrunch, April 24, 2020

Lisa Wehden, TechCrunch; Silicon Valley needs a new approach to studying ethics now more than ever

"These are fresh concerns in familiar debates about tech’s ethics. How should technologists think about the trade-off between the immediate need for public health surveillance and individual privacy? And misinformation and free speech? Facebook and other platforms are playing a much more active role than ever in assessing the quality of information: promoting official information sources prominently and removing some posts from users defying social distancing.

As the pandemic spreads and, along with it, the race to develop new technologies accelerates, it’s more critical than ever that technology finds a way to fully examine these questions. Technologists today are ill-equipped for this challenge: striking healthy balances between competing concerns — like privacy and safety — while explaining their approach to the public...

If the only students are future technologists, though, solutions will lag. If we want a more ethically knowledgeable tech industry today, we need ethical study for tech practitioners, not just university students...

Over half of the class came from a STEM background and had missed much explicit education in ethical frameworks. Our class discussed principles from other fields, like medical ethics, including the physician’s guiding maxim (“first, do no harm”) in the context of designing new algorithms. Texts from the world of science fiction, like “The Ones Who Walk Away from Omelas” by Ursula K. Le Guin, also offered ways to grapple with issues, leading students to evaluate how to collect and use data responsibly."

Friday, April 24, 2020

Connecticut town tests 'pandemic drone' to find fevers. Experts question if it would work.; NBC News, April 22, 2020

Minyvonne Burke, NBC News; Connecticut town tests 'pandemic drone' to find fevers. Experts question if it would work.

"A Connecticut police department said it plans to begin testing a "pandemic drone" that could detect whether a person 190 feet away has a fever or is coughing.

But an expert on viruses and a privacy advocate question whether such technology can work and, if it does, whether it would help in controlling the spread of the coronavirus."

COVID-19 and the Ethical Questions It Poses; University of Nevada, Las Vegas, April 22, 2020

University of Nevada, Las Vegas; COVID-19 and the Ethical Questions It Poses

UNLV business ethics expert Wonyong Oh on the coronavirus pandemic and the ethical dilemmas facing health care workers, corporations, and government

"What are some ethical questions that businesses are wrestling with in light of COVID-19?


Let’s think about one controversial example. Real-time personal location information to track and manage the path of infection has been tried all over the world, especially actively in Asian countries like China, Korea, and Hong Kong. IT companies can track location information using smartphones to prevent virus spread. This raises ethical and legal issues surrounding access to personal information.
If you follow utilitarian ethics, tracking this kind of personal information can be allowed with the “maximum benefits for the greatest number” principle. It’s for keeping society safe from infection by sacrificing personal privacy. It seems that, recently, the views on tracking personal information in the U.S. and Europe began to change. In a few European countries, telecommunication companies began to use mobile phone data to fight COVID-19. In the U.S., Apple and Google are working together to track COVID-19 with Bluetooth. IT companies can help governments reduce the spread of the virus with their technologies. At the same time, high-tech companies need to balance that with protecting individual privacy, which is a new challenge.
Everything about the coronavirus pandemic, however, is unprecedented. The reality is that the virus threatens even ordinary freedoms, like the freedom of movement, with stay-at-home orders."

Thursday, April 23, 2020

The Ethics of NOW from Home: “Privacy versus Public Health in a Pandemic: What are the ethical tradeoffs?”; The Kenan Institute For Ethics at Duke University, April 23, 2020 at 7 PM

The Kenan Institute For Ethics at Duke University; The Ethics of NOW from Home: “Privacy versus Public Health in a Pandemic: What are the ethical tradeoffs?”

"The Ethics of Now with Adriane Lentz-Smith continues from home with a series of brief, thoughtful and timely conversations about the ethical dilemmas of this historic moment.

This week, join Professor Lentz-Smith and Washington and Lee law and cyber ethics expert, Margaret Hu for a conversation about the ethical challenges of privacy during a pandemic: “Privacy versus Public Health in a Pandemic: What are the ethical tradeoffs?” 7:00pm Thursday, April 23, 2020."

Fair and unfair algorithms: What to take into account when developing AI systems to fight COVID-19; JD Supra, April 17, 2020

Fabia Cairoli and Giangiacomo Olivi, JD Supra; Fair and unfair algorithms: What to take into account when developing AI systems to fight COVID-19

"The regulatory framework includes a number of sources from which to draw inspiration when developing AI technology. One of the most recent ones, the White Paper on Artificial Intelligence of the European Commission, is aimed at defining the risks associated with the implementation of AI systems, as well as determining the key features that should be implemented to ensure that data subjects’ rights are complied with (please see our articles The EU White Paper on Artificial Intelligence: the five requirements and Shaping EU regulations on Artificial Intelligence: the five improvements for a more detailed analysis).

It is worth noting that, particularly in relation to the development of AI technologies to fight the pandemic, the legislator is required to pay great attention to the principles and security systems. Risks associated to AI relate both to rights and technical functionalities. EU member states intending to use AI against COVID-19 will also need to ensure that any AI technology is ethical and is construed and operates in a safe way.

With regards to ethics, it is worth noting that the European Commission issued Ethics Guidelines for Trustworthy AI in April 2019. Those guidelines stressed the need for AI systems to be lawful, ethical and robust (more particularly, AI should comply with all applicable laws and regulations, as well as ensure adherence to ethical principles / values and be designed in a way that does not cause unintentional harm).

With the aim of ensuring that fundamental rights are complied with, the legislator should consider whether an AI system will maintain respect for human dignity, equality, non-discrimination and solidarity. Some of these rights may be restricted for extraordinary and overriding reasons – such as fighting against a pandemic – but this should take place under specific legal provisions and only so far as is necessary to achieve the main purpose. Indeed, the use of tracking apps and systems that profile citizens in order to determine which ones may suffer from COVID-19 entails the risk that an individual’s freedom and democratic rights could be seriously restricted."

Saturday, April 4, 2020

Using AI responsibly to fight the coronavirus pandemic; TechCrunch, April 2, 2020

Mark Minevich and Irakli Beridze, TechCrunch; Using AI responsibly to fight the coronavirus pandemic

"Isolated cases or the new norm?
With the number of cases, deaths and countries on lockdown increasing at an alarming rate, we can assume that these will not be isolated examples of technological innovation in response to this global crisis. In the coming days, weeks and months of this outbreak, we will most likely see more and more AI use cases come to the fore.
While the application of AI can play an important role in seizing the reins in this crisis, and even safeguard officers and officials from infection, we must not forget that its use can raise very real and serious human rights concerns that can be damaging and undermine the trust placed in government by communities. Human rights, civil liberties and the fundamental principles of law may be exposed or damaged if we do not tread this path with great caution. There may be no turning back if Pandora’s box is opened."

Friday, April 3, 2020

Thousands of Zoom video calls left exposed on open Web; The Washington Post, April 3, 2020

The Washington Post; Thousands of Zoom video calls left exposed on open Web

Many of the videos include personally identifiable information and deeply intimate conversations, recorded in people’s homes.

"Thousands of personal Zoom videos have been left viewable on the open Web, highlighting the privacy risks to millions of Americans as they shift many of their personal interactions to video calls in an age of social distancing...

The discovery that the videos are available on the open Web adds to a string of Zoom privacy concerns that have come to public attention as the service became the preferred alternative for American work, school and social life.

The company reached more than 200 million daily users last month, up from 10 million in December, as people turned on their cameras for Zoom weddings, funerals and happy hours at a time when face-to-face gatherings are discouraged or banned."

Friday, February 21, 2020

Your DNA is a valuable asset, so why give it to ancestry websites for free?; The Guardian, February 16, 2020

The Guardian; Your DNA is a valuable asset, so why give it to ancestry websites for free?

"The announcement by 23andMe, a company that sells home DNA testing kits, that it has sold the rights to a promising new anti-inflammatory drug to a Spanish pharmaceutical company is cause for celebration. The collected health data of 23andMe’s millions of customers have potentially produced a medical advance – the first of its kind. But a few weeks later the same company announced that it was laying off workers amid a shrinking market that its CEO put down to the public’s concerns about privacy.

These two developments are linked, because the most intimate data we can provide about ourselves – our genetic make-up – is already being harvested for ends we aren’t aware of and can’t always control. Some of them, such as better medicines, are desirable, but some of them should worry us...

These are the privacy concerns that may be behind layoffs, not only at 23andMe, but also at other DTC companies, and that we need to resolve urgently to avoid the pitfalls of genetic testing while realising its undoubted promise. In the meantime, we should all start reading the small print."

Thursday, January 30, 2020

Happy “Data Privacy Day” – Now Read The New York Times Privacy Project About Total Surveillance; Forbes, January 28, 2020

Steve Andriole, Forbes; Happy “Data Privacy Day” – Now Read The New York Times Privacy Project About Total Surveillance

"It’s worth saying again:  every time we blog, tweet, post, rideshare, order from Amazon, rent an Airbnb – or anything that leaves a digital trail – we feed what Shoshana Zuboff calls “surveillance capitalism,” which is the monetization of data captured through monitoring people's movements and behaviors online and in the physical world, and which is summarized in the New York Times.  Countless digital systems now track where we are, where we go, what we eat, what we think, who we like, who we love, where we bank, what we know and who we hate – among lots of other things they know all too well because we remind them over and over again.  

Just in time for Data Privacy Day, the New York Times described just how pervasive surveillance really is. On Sunday, January 26, 2020, in a special section titled “One Nation, Tracked,” the Times presented some frightening stories...

Part of the ongoing “Privacy Project,” the Times analyzes every aspect of surveillance."


Facebook pays $550m settlement for breaking Illinois data protection law; The Guardian, January 30, 2020

Alex Hern, The Guardian; Facebook pays $550m settlement for breaking Illinois data protection law

"Facebook has settled a lawsuit over facial recognition technology, agreeing to pay $550m (£419m) over accusations it had broken an Illinois state law regulating the use of biometric details...

It is one of the largest payouts for a privacy breach in US history, a marker of the strength of Illinois’s nation-leading privacy laws. The New York Times, which first reported the settlement, noted that the sum “dwarfed” the $380m penalty the credit bureau Equifax agreed to pay over a much larger customer data breach in 2017."

Tuesday, January 28, 2020

Our privacy doomsday could come sooner than we think; The Washington Post, January 23, 2020

Editorial Board, The Washington Post; Our privacy doomsday could come sooner than we think

"The case underscores with greater vigor than ever the need for restrictions on facial recognition technology. But putting limits on what the police or private businesses can do with a tool such as Clearview’s won’t stop bad actors from breaking them. There also need to be limits on whether a tool such as Clearview’s can exist in this country in the first place.

Top platforms’ policies generally prohibit the sort of data-scraping Clearview has engaged in, but it’s difficult for a company to protect information that’s on the open Web. Courts have also ruled against platforms when they have tried to go after scrapers under existing copyright or computer fraud law — and understandably, as too-onerous restrictions could hurt journalists and public-interest groups.

Privacy legislation is a more promising area for action, to prevent third parties including Clearview from assembling databases such as these in the first place, whether they’re filled with faces or location records or credit scores. That will take exactly the robust federal framework Congress has so far failed to provide, and a government that’s ready to enforce it."

Thursday, January 23, 2020

Five Ways Companies Can Adopt Ethical AI; Forbes, January 23, 2020

Kay Firth-Butterfield, Head of Artificial Intelligence and Machine Learning, World Economic Forum, Forbes; Five Ways Companies Can Adopt Ethical AI

"In 2014, Stephen Hawking said that AI would be humankind’s best or last invention. Six years later, as we welcome 2020, companies are looking at how to use Artificial Intelligence (AI) in their business to stay competitive. The question they are facing is how to evaluate whether the AI products they use will do more harm than good...

Here are five lessons for the ethical use of AI."

Tuesday, January 7, 2020

UK Government Plans To Open Public Transport Data To Third Parties; Forbes, December 31, 2019

Simon Chandler, Forbes; UK Government Plans To Open Public Transport Data To Third Parties

"The launch is a significant victory for big data. Occasionally derided as a faddish megatrend or empty buzzword, the announcement of the Bus Open Data Service shows that national governments are willing to harness masses of data and use them to create new services and economic opportunities. Similarly, it's also a victory for the internet of things, insofar as real-time data from buses will be involved in providing users with up-to-date travel info.

That said, the involvement of big data inevitably invites fears surrounding privacy and surveillance."

Wednesday, December 18, 2019

A Surveillance Net Blankets China’s Cities, Giving Police Vast Powers; The New York Times, December 17, 2019

Paul Mozur, The New York Times; A Surveillance Net Blankets China’s Cities, Giving Police Vast Powers

"China is ramping up its ability to spy on its nearly 1.4 billion people to new and disturbing levels, giving the world a blueprint for how to build a digital totalitarian state.

Chinese authorities are knitting together old and state-of-the-art technologies — phone scanners, facial-recognition cameras, face and fingerprint databases and many others — into sweeping tools for authoritarian control, according to police and private databases examined by The New York Times."

Thursday, November 21, 2019

Americans and Privacy: Concerned, Confused and Feeling Lack of Control Over Their Personal Information; Pew Research Center, November 15, 2019

Brooke Auxier, Lee Rainie, Monica Anderson, Andrew Perrin, Madhu Kumar, and Erica Turner, Pew Research Center; Americans and Privacy: Concerned, Confused and Feeling Lack of Control Over Their Personal Information

Majorities think their personal data is less secure now, that data collection poses more risks than benefits, and believe it is not possible to go through daily life without being tracked


"Data-driven products and services are often marketed with the potential to save users time and money or even lead to better health and well-being. Still, large shares of U.S. adults are not convinced they benefit from this system of widespread data gathering. Some 81% of the public say that the potential risks they face because of data collection by companies outweigh the benefits, and 66% say the same about government data collection. At the same time, a majority of Americans report being concerned about the way their data is being used by companies (79%) or the government (64%). Most also feel they have little or no control over how these entities use their personal information, according to a new survey of U.S. adults by Pew Research Center that explores how Americans feel about the state of privacy in the nation.

Americans’ concerns about digital privacy extend to those who collect, store and use their personal information. Additionally, majorities of the public are not confident that corporations are good stewards of the data they collect. For example, 79% of Americans say they are not too or not at all confident that companies will admit mistakes and take responsibility if they misuse or compromise personal information, and 69% report having this same lack of confidence that firms will use their personal information in ways they will be comfortable with."

Consumer DNA Testing May Be the Biggest Health Scam of the Decade; Gizmodo, November 20, 2019

Ed Cara, Gizmodo; Consumer DNA Testing May Be the Biggest Health Scam of the Decade

"This test, as well as many of those offered by the hundreds of big and small DNA testing companies on the market, illustrates the uncertainty of personalized consumer genetics.

The bet that companies like 23andMe are making is that they can untangle this mess and translate their results back to people in a way that won’t cross the line into deceptive marketing while still convincing their customers they truly matter. Other companies have teamed up with outside labs and doctors to look over customers’ genes and have hired genetic counselors to go over their results, which might place them on safer legal and medical ground. But it still raises the question of whether people will benefit from the information they get. And because our knowledge of the relationship between genes and health is constantly changing, it’s very much possible the DNA test you take in 2020 will tell you a totally different story by 2030."