Friday, June 15, 2018

Protests greet Brussels copyright reform plan; BBC News, June 15, 2018

BBC News; Protests greet Brussels copyright reform plan

"The vote on the Copyright Directive comes before the European Parliament on 20 June.

It aims to rebalance copyright controls for the net age but critics say it will stifle freedom of expression.

Net veterans have signed an open letter against the directive and others have made tools to aid lobbying efforts."

Thursday, June 14, 2018

Expert in Native American intellectual property joins ASU Law Indian Legal Program; Arizona State University, June 11, 2018

Arizona State University; Expert in Native American intellectual property joins ASU Law Indian Legal Program

"In 2007, [Trevor Reed] moved to New York and enrolled at Columbia, beginning a decade-plus of music-inspired study that would result in three master’s degrees, a PhD and a Juris Doctor. He initially went to Columbia hoping to break into the music industry, figuring his best shot at a career in the arts would require being in either New York or Los Angeles.

“When I got there, it opened up so many new issues for me,” Reed said. “It just so happens that Columbia owns this massive archive of Native American musical recordings that I don’t know if anybody had really ever heard about. When I learned about that, it sparked an interest in wanting to return music and other types of archival collections, artifacts and other types of intellectual property back to Native American tribes.”

That led to the Hopi Music Repatriation Project, a joint project of the Hopi Tribe and Columbia University, which Reed began leading as a master’s degree student. Think Indiana Jones, the fictitious archaeologist and university professor, but the complete opposite. Instead of “Raiders of the Lost Ark,” plundering wondrous works from indigenous cultures, it was “Returners of the Lost Art.” The project focused not only on returning recordings and rights, but also working with tribal leaders, educators and activists to develop contemporary uses for the materials.

“I stayed on at Columbia well after my business degree had finished, and I joined the PhD program in ethnomusicology, which is essentially the anthropology of music,” Reed said. “And we just set to work on this project, and it carried through law school, and I was able to refine my work in copyright and cultural property. It’s been an interesting ride.”"

Sunday, June 10, 2018

How data scientists are using AI for suicide prevention; Vox, June 9, 2018

Brian Resnick, Vox; How data scientists are using AI for suicide prevention

"At the Crisis Text Line, a text messaging-based crisis counseling hotline, these deluges have the potential to overwhelm the human staff.

So data scientists at Crisis Text Line are using machine learning, a type of artificial intelligence, to pull out the words and emojis that can signal a person at higher risk of suicide ideation or self-harm. The computer tells them who on hold needs to jump to the front of the line to be helped.

They can do this because Crisis Text Line does something radical for a crisis counseling service: It collects a massive amount of data on the 30 million texts it has exchanged with users. While Netflix and Amazon are collecting data on tastes and shopping habits, the Crisis Text Line is collecting data on despair.

The data, some of which is available here, has turned up all kinds of interesting insights on mental health."
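The triage approach described above — scoring incoming texts for risk signals and moving the highest-risk texters to the front of the queue — can be sketched in a few lines of Python. Note that the risk terms, weights, and sample messages below are invented for illustration; Crisis Text Line's actual system uses trained machine-learning classifiers, not a hand-weighted word list.

```python
import heapq
import re

# Illustrative only: hand-picked terms with weights standing in for
# features a trained text classifier would learn from real conversations.
RISK_WEIGHTS = {"pills": 2.5, "goodbye": 2.0, "alone": 1.0}

def risk_score(message):
    """Sum the weights of known risk terms found in the message."""
    words = re.findall(r"[a-z]+", message.lower())
    return sum(RISK_WEIGHTS.get(w, 0.0) for w in words)

def triage(messages):
    """Return messages ordered from highest to lowest risk score."""
    # heapq is a min-heap, so push negated scores to pop highest risk first.
    heap = [(-risk_score(m), i, m) for i, m in enumerate(messages)]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[2] for _ in range(len(heap))]

queue = ["feeling alone tonight", "just saying goodbye, took pills"]
print(triage(queue)[0])  # the higher-scoring text jumps to the front
```

The same pattern scales to a live queue: re-score on each incoming message and always serve the top of the heap first.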

Wednesday, June 6, 2018

DNA testing service MyHeritage says 92 million customer email addresses were exposed; The Washington Post, June 5, 2018

The Washington Post; DNA testing service MyHeritage says 92 million customer email addresses were exposed

"One of the world's leading DNA-testing companies recently disclosed that a researcher had found on a private server the email addresses and hashed passwords of every customer that had signed up for its service.

MyHeritage said Monday in a blog post that the breach involved roughly 92 million user accounts that were created through October of last year.

The company said the breach occurred on October 26, 2017. But the service did not learn about the incident until Monday, more than seven months later."
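The excerpt notes that the exposed passwords were hashed rather than stored in plaintext. MyHeritage's specific hashing scheme isn't described in the article; the PBKDF2 sketch below simply illustrates what hashed storage means — the service keeps only a salted, one-way digest, so a leak of hashes is less severe than a leak of the passwords themselves.

```python
import hashlib
import os

def hash_password(password, salt=None):
    """Derive a salted one-way hash; the password is not recoverable from it."""
    if salt is None:
        salt = os.urandom(16)  # unique per-user salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password, salt, digest):
    """Re-derive the hash from a login attempt and compare to the stored digest."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000) == digest

salt, digest = hash_password("correct horse battery staple")
print(verify("correct horse battery staple", salt, digest))  # True
print(verify("wrong guess", salt, digest))                   # False
```

An attacker holding the digests must still brute-force each one, which is why slow, salted key-derivation functions are preferred over a bare hash.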

When Scientists Develop Products From Personal Medical Data, Who Gets To Profit?; NPR, May 31, 2018

Richard Harris, NPR; When Scientists Develop Products From Personal Medical Data, Who Gets To Profit?

"If you go to the hospital for medical treatment and scientists there decide to use your medical information to create a commercial product, are you owed anything as part of the bargain?

That's one of the questions that is emerging as researchers and product developers eagerly delve into digital data such as CT scans and electronic medical records, making artificial-intelligence products that are helping doctors to manage information and even to help them diagnose disease.

This issue cropped up in 2016, when Google DeepMind decided to test an app that measures kidney health by gathering 1.6 million records from patients at the Royal Free Hospital in London. The British authorities found this broke patient privacy laws in the United Kingdom. (Update on June 1 at 9:30 a.m. ET: DeepMind says it was able to deploy its app despite the violation.)

But the rules are different in the United States."

Monday, June 4, 2018

4 Big Takeaways from Satya Nadella's Talk at Microsoft Build; Fortune, May 7, 2018

Jonathan Vanian, Fortune; 4 Big Takeaways from Satya Nadella's Talk at Microsoft Build

"Microsoft Believes in AI and Ethics

"Nadella briefly mentioned the company’s internal AI ethics team, whose job is to ensure that the company’s forays into cutting-edge techniques like deep learning don’t unintentionally perpetuate societal biases in its products, among other tasks.

He said that coders need to concentrate on building products that use “good A.I.,” in which “the choices we make can be good choices for the future.”

Expect more technology companies to talk about AI and ethics as a way to alleviate concerns from the public about the tech industry’s insatiable appetite for data."

Stanford makes a startling new discovery. Ethics; ZDNet, June 4, 2018

ZDNet; Stanford makes a startling new discovery. Ethics

"Now, in a glorious moment of chest-beating and head-bobbing, Stanford University president Marc Tessier-Lavigne has admitted that his university -- which spawned so many young, great tech titans, such as the founders of Google, Instagram and LinkedIn -- failed to make titanic efforts in the area of ethics. 

In an interview with the Financial Times, he revealed that the university now intends to explore the teaching of "ethics, society and technology."

As we survey the political and social carnage that seems to have been enabled by technology over the last few years, it's remarkable that this wasn't thought of before."

Stanford to step up teaching of ethics in technology; Financial Times, June 3, 2018

Financial Times; Stanford to step up teaching of ethics in technology

"The university at the heart of Silicon Valley is to inject ethics into its technology teaching and research amid growing criticism of the excesses of the industry it helped spawn.

The board of Stanford University, one of the world’s richest higher education institutions with an endowment of $27bn, will meet this month to agree funding and a plan to implement the findings of an internal review that recommends a new initiative focused on “ethics, society and technology” and improved access to those on lower incomes."

China Issues Rules to Get Tough on Academic Integrity; Reuters, May 30, 2018

Reuters via New York Times; China Issues Rules to Get Tough on Academic Integrity

"China has issued new guidelines to enforce academic integrity in science that include plans to "record and assess" the conduct of scientists and institutions and punish anyone guilty of misconduct, state news agency Xinhua reported.

The guidelines, released on Wednesday by the ruling Communist Party and the State Council, or cabinet, prohibit plagiarism, fabrication of data and research conclusions, ghost-writing and peer review manipulation, according to Xinhua."

Thursday, May 31, 2018

Issue Brief: The General Data Protection Regulation: What Does It Mean for Libraries Worldwide?; University of North Carolina at Chapel Hill via Association of Research Libraries, May 2018

Anne T. Gilliland, Scholarly Communications Officer, University Libraries, University of North Carolina at Chapel Hill, via Association of Research Libraries; Issue Brief: The General Data Protection Regulation: What Does It Mean for Libraries Worldwide?

"Although GDPR is an EU regulation, it has implications for businesses and institutions that collect data even outside the EU. Anne T. Gilliland, scholarly communications officer at the University of North Carolina at Chapel Hill Libraries, explains some of the key provisions of GDPR and why its impact reaches worldwide. Gilliland notes that the research library community has ties to Europe and EU citizens. Libraries must therefore consider the implications GDPR will have on their own privacy policies and how to ensure compliance with these new rules. As staunch defenders of privacy rights, libraries have an opportunity to ensure robust protection of users’ rights. Because GDPR has not yet gone into effect, there is no case law or other binding guidance regarding GDPR compliance.

The Association of Research Libraries will continue to monitor developments on GDPR and will publish a follow-up piece focusing on implementation. In the meantime, the following resources may be useful:

• EU’s GDPR Information Portal 

• Library of Congress, “Online Privacy Law: European Union” 

• LIBER, Webinar Video: “GDPR & What It Means for Researchers”"

New Institute Aims for Global Leadership in Computer Modeling and Simulation; PittWire, May 30, 2018

PittWire; New Institute Aims for Global Leadership in Computer Modeling and Simulation

"At Pitt, the plan is to pair AI and machine learning researchers with individuals from academia, industry, nonprofits and the government to develop algorithms designed to address their specific problems and to use modeling experiments to provide concrete solutions.

“One day, presidents and cabinet officers, C-suites and lab directors will say, ‘Don’t tell me what your gut says, tell me what the evidence says; show me your models, show me the possible futures and the best interventions,’” said [Paul] Cohen."

Why Are Academics Upset With Facebook's New Privacy Rules?; Forbes, May 4, 2018

Kalev Leetaru, Forbes; Why Are Academics Upset With Facebook's New Privacy Rules?

"Putting this all together, there is something inherently wrong with a world in which academics condemn Facebook for conducting consent-free research on its users, only to turn around and condemn the company again when it tries to institute greater privacy protections that would prevent academics from doing the same, all while those very same academics partner with Facebook to create a new research initiative that entirely removes consent from the equation and where ethical considerations are unilaterally TBD, to be figured out after researchers decide what they want to do with two billion people’s private information. Cambridge University’s ethics panel gives us hope that there are still some institutions that believe in the ethical protections that took decades to build, only to fall like dominoes in the digital “big data” era. In the end, it is not just the social media giants and private companies rushing to commercialize our digital selves and stave off any discussion of privacy protections – the academic community is running right alongside helping to clear the way."

An American Alternative to Europe’s Privacy Law; The New York Times, May 30, 2018

Tim Wu, The New York Times; An American Alternative to Europe’s Privacy Law

"To be sure, a European-style regulatory system operates faster and has clearer rules than an American-style common-law approach. But the European approach runs the risk of being insensitive to context and may not match our ethical intuitions in individual cases. If the past decade of technology has taught us anything, it is that we face a complex and varied array of privacy problems. Case-by-case consideration might be the best way to find good solutions to many of them and, when the time comes (if the time comes), to guide the writing of general federal privacy legislation.

A defining fact of our existence today is that we share more of ourselves with Silicon Valley than with our accountants, lawyers and doctors. It is about time the law caught up with that."

[Podcast] "Roseanne" and ethics in business; Marketplace, May 29, 2018

[Podcast] Kai Ryssdal and Molly Wood, Marketplace; "Roseanne" and ethics in business

"When we called up business ethicist Greg Fairchild from the University of Virginia this morning, we expected to have a wide-ranging conversation to get at Kai's question of a few weeks back: Are there market-based solutions to ensure better ethics? We didn't expect we'd have such a timely case study in Disney-owned ABC and "Roseanne." The network canceled its show hours after a racist tweet from star Roseanne Barr. We got the Darden School of Business professor's reaction as the news played out in the way it always seems to these days: fast, furious and on Twitter."

How a Pentagon Contract Became an Identity Crisis for Google; The New York Times, May 30, 2018

Scott Shane, Cade Metz and Daisuke Wakabayashi, The New York Times; How a Pentagon Contract Became an Identity Crisis for Google

"The polarized debate about Google and the military may leave out some nuances. Better analysis of drone imagery could reduce civilian casualties by improving operators’ ability to find and recognize terrorists. The Defense Department will hardly abandon its advance into artificial intelligence if Google bows out. And military experts say China and other developed countries are already investing heavily in A.I. for defense.

But skilled technologists who chose Google for its embrace of benign and altruistic goals are appalled that their employer could eventually be associated with more efficient ways to kill."

Tuesday, May 29, 2018

ABC just took a moral stand on Roseanne. Spoiler alert: Donald Trump won't.; CNN, May 29, 2018

Chris Cillizza, CNN; ABC just took a moral stand on Roseanne. Spoiler alert: Donald Trump won't.

"ABC's decision to cancel Roseanne Barr's eponymous show following a racist comment she made about former Obama administration official Valerie Jarrett on Twitter was shocking for two reasons.

First, because it amounted to a TV network drawing a moral line in the sand -- insisting that no amount of money or ratings gave Roseanne the right to express views that ABC described in a statement as "abhorrent, repugnant and inconsistent with our values."

Second, because that decision to take a moral stand represents a stark contrast from the moral relativism preached by the president of the United States.

Donald Trump is different from anyone who has held the office before him in all sorts of ways. But, to my mind, the biggest -- and most critical -- difference between Trump and his predecessors is his total abdication of the concept of the president as a moral leader for the country and the world."

Why thousands of AI researchers are boycotting the new Nature journal; Guardian, May 29, 2018

Neil Lawrence, Guardian; Why thousands of AI researchers are boycotting the new Nature journal

"Many in our research community see the Nature brand as a poor proxy for academic quality. We resist the intrusion of for-profit publishing into our field. As a result, at the time of writing, more than 3,000 researchers, including many leading names in the field from both industry and academia, have signed a statement refusing to submit, review or edit for this new journal. We see no role for closed access or author-fee publication in the future of machine-learning research. We believe the adoption of this new journal as an outlet of record for the machine-learning community would be a retrograde step."

"Biometric Privacy Laws: Best Practices for Compliance and Litigation Update"; American Bar Association Continuing Legal Education Webinar, May 30, 2018 1 PM - 2 PM ET

American Bar Association Continuing Legal Education Webinar

"Biometric Privacy Laws: Best Practices for Compliance and Litigation Update
Format: Webinar
Date: May 30, 2018
Time: 1:00 PM - 2:00 PM ET
Credits: 1.00 General CLE Credit Hours
Learn about the current state of biometrics litigation under Illinois and other state laws and the future of biometric privacy law.

The past 18 months have seen a major spike in class action lawsuits alleging that companies have improperly collected and handled biometric information, the vast majority of which asserted claims under the Illinois Biometric Information Privacy Act (BIPA). In this program, a panel that includes attorneys representing employers, as well as both defendants and plaintiffs in litigation, will discuss the current state of biometrics litigation under BIPA and other state laws, what companies should do to comply, and what recent legal trends portend for the future of biometric privacy law."

Controversy Hides Within US Copyright Bill; Intellectual Property Watch, May 29, 2018

Steven Seidenberg, Intellectual Property Watch; Controversy Hides Within US Copyright Bill

"In a time when partisanship runs wild in the USA and the country’s political parties can’t seem to agree on anything, the Music Modernization Act is exceptional. The MMA passed the House of Representatives on 25 April with unanimous support. And for good reason. Almost all the major stakeholders back this legislation, which will bring some badly needed changes to copyright law’s treatment of music streaming. But wrapped in the MMA is a previously separate bill – the CLASSICS Act – that has been attacked by many copyright law experts, is opposed by many librarians and archivists, and runs counter to policy previously endorsed by the US Copyright Office."

The Demise Of Copyright Toleration; Techdirt, May 24, 2018

Robert S. Schwartz, Techdirt; The Demise Of Copyright Toleration

"Although denying fair use, these content owners were acknowledging a larger truth about copyright, the Internet, and even the law in general: It works largely due to toleration. Not every case is clear; not every outcome can be enforced; and not every potential legal outcome can be endured. Instead, “grey area” conduct must be impliedly licensed, or at least tolerated.

Counsel then or now could not have cited a single court holding on whether the private, noncommercial recording of a song is a lawful fair use. Long before the Supreme Court in Sony Corp. of America v. Universal City Studios, Inc. held that video home recording from broadcasts was a fair use, the music industry could have pursued consumers for home audio recording from vinyl records. But the risk of losing and establishing a bad precedent was too great.

Toleration endured because fair use, and the practicalities of enforcement, had to be endured by content owners. They recognized that their own creative members also relied on fair use in adapting and building on the works of contemporaries as well as earlier generations. They also realized that offending consumers by suing them might not be a good idea – a reason (in addition to the possibility of losing) why the Sony plaintiffs dropped the individual consumer defendants they had originally named."

Friday, May 25, 2018

Schools See Steep Drop in Librarians, New Analysis Finds; Education Week, May 16, 2018

Education Week; Schools See Steep Drop in Librarians, New Analysis Finds

"“When we’ve talked to districts that have chosen to put resources elsewhere, we really do see more than one who have then come back and wanted to reinstate [the librarian],” said Steven Yates, the president of the American Association of School Librarians. “Not only do you lose the person curating the resources for informational and pleasure reading, but you lose the person who can work with the students on the ethical side—how do you cite? How do you determine a credible source of information?”"

GDPR: US news sites blocked to EU users over data protection rules; BBC, May 25, 2018

BBC; GDPR: US news sites blocked to EU users over data protection rules

"The Chicago Tribune and LA Times were among those posting messages saying they were currently unavailable in most European countries.

The General Data Protection Regulation (GDPR) gives EU citizens more rights over how their information is used.

The measure is an effort by EU lawmakers to limit tech firms' powers."

Why Every Media Company Fears Richard Liebowitz; Slate, May 24, 2018

Justin Peters, Slate; Why Every Media Company Fears Richard Liebowitz

"Key to Liebowitz’s strategy is the pursuit of statutory damages. Under the Copyright Act of 1976, federal plaintiffs can be awarded statutory damages if they can prove “willful” infringement, a term that is not explicitly defined in the text of the bill. (“What is willful infringement? It’s what the courts say it is,” explained Adwar. Welcome to the wonderfully vague world of copyright law!) If a plaintiff had registered the work in question with the Copyright Office before the infringement occurred or up to three months after the work was initially published, then he or she can sue for statutory damages, which can be as high as $150,000 per work infringed. That’s a pretty hefty potential fine for the unauthorized use of a photograph that, if it had been licensed prior to use, might not have earned the photographer enough for a crosstown taxi.

“Photographers are basically small businesses. They’re little men. But you have this powerful tool, which is copyright law,” said Kim, the freelance photographer. The question that copyright attorneys, media executives, and federal judges have been asking themselves for 2½ years is this: Is Richard Liebowitz wielding that tool responsibly? “He offers [his clients] nirvana, basically. He essentially offers them: I will sue for you, I don’t care how innocuous the infringement, I don’t care how innocuous the photograph, I will bring that lawsuit for you and get you money,” said attorney Kenneth Norwick. And the law allows him to do it. So is Liebowitz gaming the system by filing hundreds of “strike suits” to compel quick settlements? Or is he an avenging angel for photographers who have seen their livelihoods fade in the internet age? “They can call Richard Liebowitz a troll,” said Kim. “Better to be a troll than a thief.”...

Over the past 2½ years, Liebowitz has attained boogeyman status in the C-suites of major media organizations around the country. Like the villain in a very boring horror movie featuring content management systems and starring bloggers, his unrelenting litigiousness has inspired great frustration amongst editors and media lawyers fearful that they will be the next to fall victim to the aggravating time-suck known as a Richard Liebowitz lawsuit. And he is probably all of the things his detractors say he is: a troll, an opportunist, a guy on the make taking advantage of the system. He is also a creature of the media industry’s own making, and the best way to stop him and his disciples is for media companies to stop using photographers’ pictures without paying for them—and to minimize the sorts of editorial mistakes borne out of ignorance of or indifference to federal copyright law. “People should realize—and hopefully will continue to realize,” said Liebowitz, “that photographers need to be respected and get paid for their work.”"
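The eligibility rule Slate describes — statutory damages are available only if the work was registered before the infringement, or within three months of first publication — can be expressed as a small check. This is an illustrative simplification (using 90 days to approximate "three months"), not legal advice; real § 412 analysis turns on facts a date comparison can't capture.

```python
from datetime import date, timedelta

# Statutory damages under the Copyright Act can reach $150,000 per work infringed.
MAX_STATUTORY_DAMAGES = 150_000

def statutory_damages_eligible(registered, infringed, published):
    """Eligible if registration preceded the infringement, or fell within
    roughly three months (here, 90 days) of first publication."""
    grace_window_end = published + timedelta(days=90)
    return registered < infringed or registered <= grace_window_end

# Registered long before the infringement: eligible.
print(statutory_damages_eligible(date(2017, 1, 1), date(2018, 1, 1), date(2016, 6, 1)))  # True

# Registered only after suit-worthy infringement, past the grace window: not eligible.
print(statutory_damages_eligible(date(2018, 6, 1), date(2018, 1, 1), date(2017, 1, 1)))  # False
```

This timing rule is why photographers who routinely register their work soon after publication have far more leverage in settlement negotiations.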

Thursday, May 24, 2018

New privacy rules could spell the end of legalese — or create a lot more fine print; The Washington Post, May 24, 2018

Elizabeth Dwoskin, The Washington Post; New privacy rules could spell the end of legalese — or create a lot more fine print

"“The companies are realizing that it is not enough to get people to just click through,” said Lorrie Cranor, director of the CyLab Usable Privacy and Security Laboratory at Carnegie Mellon University and the U.S. Federal Trade Commission’s former chief technologist. “That they need to communicate so that people are not surprised when they find out what they consented to.”

That has become more apparent in the past two months since revelations that a Trump-connected consultancy, Cambridge Analytica, made off with the Facebook profiles of up to 87 million Americans. Cranor said that consumer outrage over Cambridge was directly related to concerns that companies were engaging in opaque practices behind the scenes, and that consumers had unknowingly allowed it to happen by signing away their rights.

Irrespective of simpler explanations, the impact and success of the GDPR will hinge upon whether companies will try to force users to consent to their tracking or targeting as a condition for access to their services, said Alessandro Acquisti, a Carnegie Mellon computer science professor and privacy researcher. "This will tell us a lot regarding whether the recent flurry of privacy policy modifications demonstrates a sincere change in the privacy stance of those companies or is more about paying lip service to the new regulation. The early signs are not auspicious.""

Why you’re getting so many emails about privacy policies; Vox, May 24, 2018

Emily Stewart, Vox; Why you’re getting so many emails about privacy policies

"The United States hasn’t given up its seat at the table, but it could certainly take a bigger role than it has in order to ensure that other countries, when they do implement regulations on tech and information, aren’t going too far.

“People are concerned about privacy, hate speech, disinformation, and we aren’t leading on solutions to these concerns that would at the same time preserve the free flow of information,” Kornbluh said. “You don’t want some governments saying, ‘We’re combating fake news,’ and compromising human rights.”"

Wednesday, May 23, 2018

No one’s ready for GDPR; The Verge, May 22, 2018

Sarah Jeong, The Verge; No one’s ready for GDPR

"The General Data Protection Regulation will go into effect on May 25th, and no one is ready — not the companies and not even the regulators...

GDPR is only supposed to apply to the EU and EU residents, but because so many companies do business in Europe, the American technology industry is scrambling to become GDPR compliant. Still, even though GDPR’s big debut is bound to be messy, the regulation marks a sea change in how data is handled across the world. Americans outside of Europe can’t make data subject access requests, and they can’t demand that their data be deleted. But GDPR compliance is going to have spillover effects for them anyway. The breach notification requirement, especially, is more stringent than anything in the US. The hope is that as companies and regulatory bodies settle into the flow of things, the heightened privacy protections of GDPR will become business as usual. In the meantime, it’s just a mad scramble to keep up."

Ethics and tech – a double-edged sword; Computer Weekly, May 2018

James Kitching, Computer Weekly; Ethics and tech – a double-edged sword

"Big corporations can no longer afford to ignore ethics in their decision-making. Customers expect a higher level of social capital from the companies they deal with and this can have a big effect on whether those companies succeed or fail.

This is not a new conundrum specific to tech – remember the UK hearings relating to tax avoidance, which included the likes of Starbucks as well as Google. What accountants were advising their clients wasn’t illegal. The creative schemes they came up with were allowed under UK law – but that didn’t matter. What mattered was that the way they were dealing with tax was seen by the public and the media as immoral and unethical.

Organisations must think beyond the black-and-white letter of the law. In the current climate, this means saying: “Yes, this is legal, but I don’t necessarily think it is going to be viewed as socially acceptable.”

 Gone are the days when the excuse “but it is legal” will wash with the media, the government and the public at large."

Monday, May 21, 2018

How the Enlightenment Ends; The Atlantic, June 2018 Issue

Henry A. Kissinger, The Atlantic; How the Enlightenment Ends

 

"Heretofore confined to specific fields of activity, AI research now seeks to bring about a “generally intelligent” AI capable of executing tasks in multiple fields. A growing percentage of human activity will, within a measurable time period, be driven by AI algorithms. But these algorithms, being mathematical interpretations of observed data, do not explain the underlying reality that produces them. Paradoxically, as the world becomes more transparent, it will also become increasingly mysterious. What will distinguish that new world from the one we have known? How will we live in it? How will we manage AI, improve it, or at the very least prevent it from doing harm, culminating in the most ominous concern: that AI, by mastering certain competencies more rapidly and definitively than humans, could over time diminish human competence and the human condition itself as it turns it into data...

The Enlightenment started with essentially philosophical insights spread by a new technology. Our period is moving in the opposite direction. It has generated a potentially dominating technology in search of a guiding philosophy. Other countries have made AI a major national project. The United States has not yet, as a nation, systematically explored its full scope, studied its implications, or begun the process of ultimate learning. This should be given a high national priority, above all, from the point of view of relating AI to humanistic traditions.

AI developers, as inexperienced in politics and philosophy as I am in technology, should ask themselves some of the questions I have raised here in order to build answers into their engineering efforts. The U.S. government should consider a presidential commission of eminent thinkers to help develop a national vision. This much is certain: If we do not start this effort soon, before long we shall discover that we started too late."