Showing posts with label consent. Show all posts

Saturday, July 6, 2024

THE GREAT SCRAPE: THE CLASH BETWEEN SCRAPING AND PRIVACY; SSRN, July 3, 2024

Daniel J. Solove, George Washington University Law School; Woodrow Hartzog, Boston University School of Law and Stanford Law School Center for Internet and Society; The Great Scrape: The Clash Between Scraping and Privacy

"ABSTRACT

Artificial intelligence (AI) systems depend on massive quantities of data, often gathered by “scraping” – the automated extraction of large amounts of data from the internet. A great deal of scraped data is about people. This personal data provides the grist for AI tools such as facial recognition, deep fakes, and generative AI. Although scraping enables web searching, archival, and meaningful scientific research, scraping for AI can also be objectionable or even harmful to individuals and society.


Organizations are scraping at an escalating pace and scale, even though many privacy laws are seemingly incongruous with the practice. In this Article, we contend that scraping must undergo a serious reckoning with privacy law. Scraping violates nearly all of the key principles in privacy laws, including fairness; individual rights and control; transparency; consent; purpose specification and secondary use restrictions; data minimization; onward transfer; and data security. With scraping, data protection laws built around these requirements are ignored.


Scraping has evaded a reckoning with privacy law largely because scrapers act as if all publicly available data were free for the taking. But the public availability of scraped data shouldn’t give scrapers a free pass. Privacy law regularly protects publicly available data, and privacy principles are implicated even when personal data is accessible to others.


This Article explores the fundamental tension between scraping and privacy law. With the zealous pursuit and astronomical growth of AI, we are in the midst of what we call the “great scrape.” There must now be a great reconciliation."

Friday, June 7, 2024

Angry Instagram posts won’t stop Meta AI from using your content; Popular Science, June 5, 2024

 Mack DeGeurin, Popular Science; Angry Instagram posts won’t stop Meta AI from using your content

"Meta, the Mark Zuckerberg-owned tech giant behind Instagram, surprised many of the app’s estimated 1.2 billion global users with a shock revelation last month. Images, including original artwork and other creative assets uploaded to the company’s platforms, are now being used to train the company’s AI image generator. That admission, initially made public by Meta executive Chris Cox during an interview with Bloomberg last month, has elicited a fierce backlash from some creators. As of writing, more than 130,000 Instagram users have reshared a message on Instagram telling the company they do not consent to it using their data to train Meta AI. Those pleas, however, are founded on a fundamental misunderstanding of creators’ relationship with extractive social media platforms. These creators already gave away their work, whether they realize it or not."

Monday, July 31, 2023

A museum’s historic human remains are now the center of an ethics clash; The Washington Post, July 27, 2023

The Washington Post; A museum’s historic human remains are now the center of an ethics clash

"The Mütter is a place for people who don’t fit in. And now, Eisenstein fears he can’t fit in here, either.

“People who have always felt othered” — for their physical abilities, their sexuality, their neurodivergence, their interest in death — “find their home in the museum,” says Polasky, the petitioner, who is now a curator for the British Online Archives.

Whether they can continue to do so depends on the answer to one question: What happens when the ethics of the 19th century meet those of the 21st?"

Sunday, December 30, 2018

Defending ‘Needles in the Sewer’ and Photographing the Disadvantaged; PetaPixel, December 29, 2018

Simon King, PetaPixel; Defending ‘Needles in the Sewer’ and Photographing the Disadvantaged

[Kip Currier: Thought-provoking article identifying and discussing some of the sticky ethical issues of whether to photograph or not to photograph, particularly regarding vulnerable populations and difficult topics. Kudos to the photographer Simon King for shedding light on his metacognition (i.e., thinking about thinking) with regard to how and when he takes pictures and what he does and does not do with them.

Beyond photography, the issues raised in the piece have broader implications as well for digital age technologies' impacts on disadvantaged communities related to the increasing collection and use of data generated by AI algorithms, mass surveillance, facial recognition, biometric information, etc. The last two paragraphs of a November 2018 New York Times article, Colleges Grapple With Teaching the Technology and Ethics of A.I., provide an example of some of the ways higher education is preparing students to better recognize and address these issues:

David Danks, a professor of philosophy and psychology at Carnegie Mellon, just started teaching a class, “A.I., Society and Humanity.” The class is an outgrowth of faculty coming together over the past three years to create shared research projects, he said, because students need to learn from both those who are trained in the technology and those who are trained in asking ethical questions.

“The key is to make sure they have the opportunities to really explore the ways technology can have an impact — to think how this will affect people in poorer communities or how it can be abused,” he said.]



"The main issues people brought up about this image were consent and exploitation...

My responsibility (and maybe yours?) as a photographer is to avoid self-censorship. I can always choose to publish an image or not, but only if that image exists in the first place. If I take an image then I should have the presence of mind to understand what I saw in that scene, and what purpose I want to apply to that image. If I had not taken an image at this time, would that be a form of erasing and ignoring this issue? I would rather face discussion and debate about my work than talk as if these issues are distant and abstract.

Thanks for taking the time to read this. I’d like to direct some of the attention from this topic and image to the website Addaction. It’s a UK-based organization providing aid and outreach to at-risk addicts. Please consider having a look at their website and possibly making a donation, or maybe going out of your way to produce an image that may also draw attention to this topic."

Saturday, November 10, 2018

Our lack of interest in data ethics will come back to haunt us; TNW, November 3, 2018

Jayson Demers, TNW; Our lack of interest in data ethics will come back to haunt us

"Outreach and attention

We can’t solve these ethical dilemmas by issuing judgments or making a few laws. After all, ethical discussions rarely result in a simple understanding of what’s “right” and what’s “wrong.” Instead, we should be concentrating our efforts on raising awareness of these ethical dilemmas, and facilitating more open, progressive conversations.

We need to democratize the conversation by encouraging consumers to demand greater ownership, control, and/or transparency over their own data. We need to hold companies accountable for their practices before they get out of hand. And we need the data scientists, entrepreneurs, and marketers of the world to think seriously about the consequences of their data-related efforts — and avoid sacrificing ethical considerations in the name of profits."

Wednesday, June 6, 2018

When Scientists Develop Products From Personal Medical Data, Who Gets To Profit?; NPR, May 31, 2018

Richard Harris, NPR; When Scientists Develop Products From Personal Medical Data, Who Gets To Profit?

"If you go to the hospital for medical treatment and scientists there decide to use your medical information to create a commercial product, are you owed anything as part of the bargain?

That's one of the questions that is emerging as researchers and product developers eagerly delve into digital data such as CT scans and electronic medical records, making artificial-intelligence products that are helping doctors to manage information and even to help them diagnose disease.

This issue cropped up in 2016, when Google DeepMind decided to test an app that measures kidney health by gathering 1.6 million records from patients at the Royal Free Hospital in London. The British authorities found this broke patient privacy laws in the United Kingdom. (Update on June 1 at 9:30 a.m. ET: DeepMind says it was able to deploy its app despite the violation.)

But the rules are different in the United States."

Wednesday, April 4, 2018

The Guardian view on Grindr and data protection: don’t trade our privacy; Guardian, April 3, 2018

Editorial, Guardian; The Guardian view on Grindr and data protection: don’t trade our privacy

"Whether the users were at fault for excessive trust, or lack of imagination, or even whether they were at fault at all for submitting information that would let their potential partners make a better informed choice, as liberal ethics would demand, the next thing to scrutinise is the role of the company itself. Grindr has now said that it will no longer hand over the information, which is an admission that it was wrong to do so in the first place. It also says that the information was always anonymised, and that its policy was perfectly standard practice among digital businesses. This last is perfectly true, and perhaps the most worrying part of the whole story.

We now live in a world where the valuations of giant companies are determined by the amount of personal data they hold on third parties, who frequently have no idea how much there is, nor how revealing it is. As well as the HIV status, and last test date, Grindr collected and passed on to third parties its users’ locations, their phone identification numbers, and emails. These went to two companies that promise to make it easier to deliver personalised advertisements to phones based on the users’ locations and to increase the amount of time they spend looking at apps on their phones. The data was in theory anonymised, although repeated experiments have shown that the anonymity of personal information on the internet is pretty easily cracked in most cases."

Tuesday, July 25, 2017

Report: Roomba Could Sell Maps of Your Home to Tech Giants; Daily Beast, July 24, 2017

Daily Beast; Report: Roomba Could Sell Maps of Your Home to Tech Giants

"Roomba, a popular brand of robotic vacuum, can make maps of homes it cleans, Reuters reports. And Roomba’s parent company, iRobot, is reportedly considering a sale to tech giants like Amazon, Apple, or Alphabet, which could buy maps of Roomba-owning homes. The data would be used in smart home technology but could also raise privacy concerns for Roomba owners who do not want their data sold. iRobot’s CEO told Reuters that it would not sell customers’ data without their consent."

Monday, June 5, 2017

IoT Devices Becoming More Important in Criminal Investigations; Inside Counsel, June 1, 2017

Amanda Ciccatelli, Inside Counsel; IoT Devices Becoming More Important in Criminal Investigations

"In addition, an area of the law which will evolve because of IoT being utilized in court is privacy law. Fitbit's privacy policies clearly state that they will cooperate with a legal subpoena or warrant. Moreover, they outline that users’ information will be stored unless the account is completely closed, and even then, the information will only be destroyed per the company's regular maintenance schedule. Accordingly, users have consented to this application.

She explained, “Users of IoT need to be cognizant of the fact that these very personal devices, worn by us every minute of the day or listening in our homes, come at a very real privacy cost…If IoT is in use, users must balance the risk that their data will be used in court.”"

Friday, May 19, 2017

Americans Want More Say in the Privacy of Personal Data; Consumer Reports, May 18, 2017

Bree Fowler, Consumer Reports; Americans Want More Say in the Privacy of Personal Data

[Kip Currier: Take a look at Consumer Reports' latest survey data on U.S. consumers' concerns about privacy and their personal data: significant majorities want more control over what data is collected and more transparency (not less!) regarding what Internet service providers can and can't do with that personal data.

Then consider this May 18, 2017 disconnect: "The Federal Communications Commission (FCC), led by chairman Ajit Pai, voted two to one to start the formal process of dismantling “net neutrality” rules put in place in 2015."]

"The latest CR Consumer Voices survey reveals that people have been increasingly worried about the issue in 2017. Seventy percent of Americans lack confidence that their personal data is private and safe from distribution without their knowledge, according to the nationally representative survey of 1,007 adults conducted in April.

That number climbed from 65 percent since we first asked about the topic in January.

Respondents to the April survey also said they want more control over what data is collected. Ninety-two percent said that internet service providers, such as Comcast and Verizon, should be required to secure permission from users before selling or sharing their data.

The same proportion thinks consumers should have the right to request a complete list of the data an internet service provider or website has collected about them.

Finally, respondents spoke out about how such data may be used to charge online shoppers different prices for the same goods and services—without consumers knowing about it. This kind of dynamic pricing can be based on factors from age to browsing history to home address. Sixty-five percent of respondents oppose the practice.

Though consumers say they want stronger privacy protections, federal actions are moving the rules in the opposite direction."

Friday, January 20, 2017

Scientists Needn't Get A Patient's Consent To Study Blood Or DNA; NPR, 1/18/17

Rob Stein, NPR; Scientists Needn't Get A Patient's Consent To Study Blood Or DNA

"The Obama administration has dropped a controversial proposal that would have required all federally funded scientists to get permission from patients before using their cells, blood, tissue or DNA for research.

The proposal was eliminated from the final revision of the Common Rule, which was published in the Federal Register Wednesday. The rule is a complex set of regulations designed to make sure federally funded research on human subjects is conducted ethically. The revision to the regulations, set to go into effect in 2018, marks the first time the rule has been updated in 26 years.

The initial proposal that researchers be required to get permission before using a patient's tissue sample for research came out of the desire to avoid repeating what happened to Henrietta Lacks, an American who died of cervical cancer in 1951. Some of the cells from Lacks' cancer were kept alive for decades, used in research and for commercial purposes without her consent or her family's knowledge.

But scientists have argued that the mandate for consent in the initial Obama proposal was unnecessary and would hinder crucial research...

The final decision was welcomed by scientists and universities."

Monday, September 5, 2016

Second Thoughts of an Animal Researcher; New York Times, 9/2/16

John P. Gluck, New York Times; Second Thoughts of an Animal Researcher:
"In 1974, a federal commission was formed to develop ethical principles for human research. For nearly four years, the National Commission for the Protection of Human Subjects in Biomedical and Behavioral Research met monthly to develop ethical principles that we rely on for human research. The principles set down in the resulting Belmont Report reflect the moral dimensions of human research that now govern this work. The report revolutionized the understanding of voluntary and informed consent, fair subject recruitment, and the importance of conducting risk-benefit analyses. No such document exists for animal research.
Acknowledging that our serious work as scientists can be a source of pain and distress to sentient, helpless and non-consenting beings can be difficult. The federal government should establish a national commission to develop the principles to guide decisions about the ethics of animal research. We already accept that ethical limits on experiments involving humans are important enough that we are willing to forgo possible breakthroughs. There is no ethical argument that justifies not doing the same for animals."

Wednesday, August 31, 2016

Companies are making money from our personal data – but at what cost?; Guardian, 8/31/16

Jathan Sadowski, Guardian; Companies are making money from our personal data – but at what cost? :
"Data appropriation is a form of exploitation because companies use data to create value without providing people with comparable compensation...
In short, rampant practices of data appropriation allow corporations and governments to build their wealth and power, without the headache of obtaining consent and providing compensation for the resource they desire.
Data appropriation is surely an ethical issue. But by framing it as theft, we can lay the groundwork for policies that also make it a legal issue. We need new models of data ownership and protection that reflect the role information has in society.
In the Gilded Age 2.0, a laissez-faire attitude toward data has encouraged a new class of robber barons to arise. Rather than allow them to unscrupulously take, trade and hoard our data, we must reclaim their ill-gotten gains and rein in the data imperative."

Sunday, March 20, 2016

Privacy versus speech in the Hulk Hogan sex tape trial; Los Angeles Times, 3/14/16

Erwin Chemerinsky, Los Angeles Times; Privacy versus speech in the Hulk Hogan sex tape trial:
"Indeed, this case reflects how the changing notions of privacy in society make it much harder to decide what would be offensive to the reasonable person and what isn't of public concern.
But juries, it's said, make decisions based on emotion, on the gut. Accordingly, St. Petersburg jurors may ultimately find it hard to accept that Gawker's speech rights reach into Bollea's bedroom, notwithstanding the plaintiff's lewd persona. There is a difference, after all, between talking about sex and watching it.
If the jury sides with Bollea, 1st Amendment absolutists will worry about the “chilling effect” the verdict may have on speech, and will claim it's impossible to draw a line between permissible and impermissible expression. Speech is speech.
But I can imagine a clear rule: No videos of people having sex should be made public unless all of the participants consent. I think the media will survive the restriction."