Tuesday, May 29, 2018

Controversy Hides Within US Copyright Bill; Intellectual Property Watch, May 29, 2018

Steven Seidenberg, Intellectual Property Watch; Controversy Hides Within US Copyright Bill

"In a time when partisanship runs wild in the USA and the country’s political parties can’t seem to agree on anything, the Music Modernization Act is exceptional. The MMA passed the House of Representatives on 25 April with unanimous support. And for good reason. Almost all the major stakeholders back this legislation, which will bring some badly needed changes to copyright law’s treatment of music streaming. But wrapped in the MMA is a previously separate bill – the CLASSICS Act – that has been attacked by many copyright law experts, is opposed by many librarians and archivists, and runs counter to policy previously endorsed by the US Copyright Office."

The Demise Of Copyright Toleration; Techdirt, May 24, 2018

Robert S. Schwartz, Techdirt; The Demise Of Copyright Toleration

"Although denying fair use, these content owners were acknowledging a larger truth about copyright, the Internet, and even the law in general: It works largely due to toleration. Not every case is clear; not every outcome can be enforced; and not every potential legal outcome can be endured. Instead, “grey area” conduct must be impliedly licensed, or at least tolerated.

Counsel then or now could not have cited a single court holding on whether the private, noncommercial recording of a song is a lawful fair use. Long before the Supreme Court in Sony Corp. of America v. Universal City Studios, Inc. said that video home recording from broadcasts was a fair use, the music industry could have pursued consumers for home audio recording from vinyl records. But the risk of losing and establishing a bad precedent was too great.

Toleration endured because fair use, and the practicalities of enforcement, had to be endured by content owners. They recognized that their own creative members also relied on fair use in adapting and building on the works of contemporaries as well as earlier generations. They also realized that offending consumers by suing them might not be a good idea – a reason (in addition to the possibility of losing) why the Sony plaintiffs dropped the individual consumer defendants they had originally named."

Friday, May 25, 2018

Schools See Steep Drop in Librarians, New Analysis Finds; Education Week, May 16, 2018

Education Week; Schools See Steep Drop in Librarians, New Analysis Finds

"“When we’ve talked to districts that have chosen to put resources elsewhere, we really do see more than one who have then come back and wanted to reinstate [the librarian],” said Steven Yates, the president of the American Association of School Librarians. “Not only do you lose the person curating the resources for informational and pleasure reading, but you lose the person who can work with the students on the ethical side—how do you cite? How do you determine a credible source of information?”"

GDPR: US news sites blocked to EU users over data protection rules; BBC, May 25, 2018

BBC; GDPR: US news sites blocked to EU users over data protection rules

"The Chicago Tribune and LA Times were among those posting messages saying they were currently unavailable in most European countries.

The General Data Protection Regulation (GDPR) gives EU citizens more rights over how their information is used.

The measure is an effort by EU lawmakers to limit tech firms' powers."

Why Every Media Company Fears Richard Liebowitz; Slate, May 24, 2018

Justin Peters, Slate; Why Every Media Company Fears Richard Liebowitz

"Key to Liebowitz’s strategy is the pursuit of statutory damages. Under the Copyright Act of 1976, federal plaintiffs can be awarded statutory damages if they can prove “willful” infringement, a term that is not explicitly defined in the text of the bill. (“What is willful infringement? It’s what the courts say it is,” explained Adwar. Welcome to the wonderfully vague world of copyright law!) If a plaintiff had registered the work in question with the Copyright Office before the infringement occurred or up to three months after the work was initially published, then he or she can sue for statutory damages, which can be as high as $150,000 per work infringed. That’s a pretty hefty potential fine for the unauthorized use of a photograph that, if it had been licensed prior to use, might not have earned the photographer enough for a crosstown taxi.

“Photographers are basically small businesses. They’re little men. But you have this powerful tool, which is copyright law,” said Kim, the freelance photographer. The question that copyright attorneys, media executives, and federal judges have been asking themselves for 2½ years is this: Is Richard Liebowitz wielding that tool responsibly? “He offers [his clients] nirvana, basically. He essentially offers them: I will sue for you, I don’t care how innocuous the infringement, I don’t care how innocuous the photograph, I will bring that lawsuit for you and get you money,” said attorney Kenneth Norwick. And the law allows him to do it. So is Liebowitz gaming the system by filing hundreds of “strike suits” to compel quick settlements? Or is he an avenging angel for photographers who have seen their livelihoods fade in the internet age? “They can call Richard Liebowitz a troll,” said Kim. “Better to be a troll than a thief.”...

Over the past 2½ years, Liebowitz has attained boogeyman status in the C-suites of major media organizations around the country. Like the villain in a very boring horror movie featuring content management systems and starring bloggers, his unrelenting litigiousness has inspired great frustration amongst editors and media lawyers fearful that they will be the next to fall victim to the aggravating time-suck known as a Richard Liebowitz lawsuit. And he is probably all of the things his detractors say he is: a troll, an opportunist, a guy on the make taking advantage of the system. He is also a creature of the media industry’s own making, and the best way to stop him and his disciples is for media companies to stop using photographers’ pictures without paying for them—and to minimize the sorts of editorial mistakes borne out of ignorance of or indifference to federal copyright law. “People should realize—and hopefully will continue to realize,” said Liebowitz, “that photographers need to be respected and get paid for their work.”"

Thursday, May 24, 2018

New privacy rules could spell the end of legalese — or create a lot more fine print; The Washington Post, May 24, 2018

Elizabeth Dwoskin, The Washington Post; New privacy rules could spell the end of legalese — or create a lot more fine print

"“The companies are realizing that it is not enough to get people to just click through,” said Lorrie Cranor, director of the CyLab Usable Privacy and Security Laboratory at Carnegie Mellon University and the U.S. Federal Trade Commission’s former chief technologist. “That they need to communicate so that people are not surprised when they find out what they consented to.”

That has become more apparent in the past two months since revelations that a Trump-connected consultancy, Cambridge Analytica, made off with the Facebook profiles of up to 87 million Americans. Cranor said that consumer outrage over Cambridge was directly related to concerns that companies were engaging in opaque practices behind the scenes, and that consumers had unknowingly allowed it to happen by signing away their rights.

Irrespective of simpler explanations, the impact and success of the GDPR will hinge upon whether companies will try to force users to consent to their tracking or targeting as condition for access to their services, said Alessandro Acquisti, a Carnegie Mellon computer science professor and privacy researcher. "This will tell us a lot regarding whether the recent flurry of privacy policy modifications demonstrates a sincere change in the privacy stance of those companies or is more about paying lip service to the new regulation. The early signs are not auspicious.""

Why you’re getting so many emails about privacy policies; Vox, May 24, 2018

Emily Stewart, Vox; Why you’re getting so many emails about privacy policies

"The United States hasn’t given up its seat on the table, but it could certainly take a bigger role than it has in order to ensure that other countries, when they do implement regulations on tech and information, aren’t going too far.

“People are concerned about privacy, hate speech, disinformation, and we aren’t leading on solutions to these concerns that would at the same time preserve the free flow of information,” Kornbluh said. “You don’t want some governments saying, ‘We’re combating fake news,’ and compromising human rights.”"

Wednesday, May 23, 2018

No one’s ready for GDPR; The Verge, May 22, 2018

Sarah Jeong, The Verge; No one’s ready for GDPR

"The General Data Protection Regulation will go into effect on May 25th, and no one is ready — not the companies and not even the regulators...

GDPR is only supposed to apply to the EU and EU residents, but because so many companies do business in Europe, the American technology industry is scrambling to become GDPR compliant. Still, even though GDPR’s big debut is bound to be messy, the regulation marks a sea change in how data is handled across the world. Americans outside of Europe can’t make data subject access requests, and they can’t demand that their data be deleted. But GDPR compliance is going to have spillover effects for them anyway. The breach notification requirement, especially, is more stringent than anything in the US. The hope is that as companies and regulatory bodies settle into the flow of things, the heightened privacy protections of GDPR will become business as usual. In the meantime, it’s just a mad scramble to keep up."

Ethics and tech – a double-edged sword; Computer Weekly, May 2018

James Kitching, Computer Weekly; Ethics and tech – a double-edged sword

"Big corporations can no longer afford to ignore ethics in their decision-making. Customers expect a higher level of social capital from the companies they deal with and this can have a big effect on whether those companies succeed or fail.

This is not a new conundrum specific to tech – remember the UK hearings relating to tax avoidance, which included the likes of Starbucks as well as Google. What accountants were advising their clients wasn’t illegal. The creative schemes they came up with were allowed under UK law – but that didn’t matter. What mattered was that the way they were dealing with tax was seen by the public and the media as immoral and unethical.

Organisations must think beyond the black-and-white letter of the law. In the current climate, this means saying: “Yes, this is legal, but I don’t necessarily think it is going to be viewed as socially acceptable.”

 Gone are the days when the excuse “but it is legal” will wash with the media, the government and the public at large."

Monday, May 21, 2018

How the Enlightenment Ends; The Atlantic, June 2018 Issue

Henry A. Kissinger, The Atlantic; How the Enlightenment Ends

 

"Heretofore confined to specific fields of activity, AI research now seeks to bring about a “generally intelligent” AI capable of executing tasks in multiple fields. A growing percentage of human activity will, within a measurable time period, be driven by AI algorithms. But these algorithms, being mathematical interpretations of observed data, do not explain the underlying reality that produces them. Paradoxically, as the world becomes more transparent, it will also become increasingly mysterious. What will distinguish that new world from the one we have known? How will we live in it? How will we manage AI, improve it, or at the very least prevent it from doing harm, culminating in the most ominous concern: that AI, by mastering certain competencies more rapidly and definitively than humans, could over time diminish human competence and the human condition itself as it turns it into data...

The Enlightenment started with essentially philosophical insights spread by a new technology. Our period is moving in the opposite direction. It has generated a potentially dominating technology in search of a guiding philosophy. Other countries have made AI a major national project. The United States has not yet, as a nation, systematically explored its full scope, studied its implications, or begun the process of ultimate learning. This should be given a high national priority, above all, from the point of view of relating AI to humanistic traditions.

AI developers, as inexperienced in politics and philosophy as I am in technology, should ask themselves some of the questions I have raised here in order to build answers into their engineering efforts. The U.S. government should consider a presidential commission of eminent thinkers to help develop a national vision. This much is certain: If we do not start this effort soon, before long we shall discover that we started too late."

Friday, May 18, 2018

States Offer Information Resources: 50+ Open Data Portals; Forbes, April 30, 2018

Meta S. Brown, Forbes; States Offer Information Resources: 50+ Open Data Portals

"The United States federal open data portal, data.gov, launched in May, 2009, with just 47 datasets. It was not an instant hit.

Today, with more than 200,000 datasets, it’s a lot more popular. Still, real-life demands for information about our governments, people and economy exceed the supply of available data.

The creation of a centralized portal for federal government data has fostered open data initiatives across the country. Dozens of cities have established their own open data portals (here are 90 examples).

 In the 50 years since the federal Freedom of Information Act was passed, US states have been gradually introducing similar laws (see freedom of information laws by state). Likewise, many are now developing state-level open data portals.


These state data resources vary in style and depth. Some look much like data.gov, and include a wide variety of datasets. But not every state has a comprehensive data portal yet, let alone deep selections of data.

Here’s a listing of general and geographic open data portals for US states, plus the District of Columbia and Puerto Rico..."

Thursday, May 17, 2018

New Guidelines For Tech Companies To Be Transparent, Accountable On Censoring User Content; Intellectual Property Watch, May 7, 2018

Intellectual Property Watch; New Guidelines For Tech Companies To Be Transparent, Accountable On Censoring User Content

"The Electronic Frontier Foundation (EFF) called on Facebook, Google, and other social media companies today to publicly report how many user posts they take down, provide users with detailed explanations about takedowns, and implement appeals policies to boost accountability.

EFF, ACLU of Northern California, Center for Democracy & Technology, New America’s Open Technology Institute, and a group of academic experts and free expression advocates today released the Santa Clara Principles, a set of minimum standards for tech companies to augment and strengthen their content moderation policies. The plain language, detailed guidelines call for disclosing not just how and why platforms are removing content, but how much speech is being censored. The principles are being released in conjunction with the second edition of the Content Moderation and Removal at Scale conference. Work on the principles began during the first conference, held in Santa Clara, California, in February.

“Our goal is to ensure that enforcement of content guidelines is fair, transparent, proportional, and respectful of users’ rights,” said EFF Senior Staff Attorney Nate Cardozo."

We Need Chief Ethics Officers More Than Ever; Forbes, May 16, 2018

Dan Pontefract, Forbes; We Need Chief Ethics Officers More Than Ever

"It is from the medical community that the high-tech community may learn its greatest lesson.
Create a Chief Ethics Officer role, and an in-house ethics team made up not only of lawyers but educators, philosophers, doctors, psychologists, sociologists, and artists.
Furthermore, as universities such as Carnegie Mellon University begin introducing undergraduate degrees in artificial intelligence, ensure the program has a strong ethics component throughout the entire curriculum.
Only then—when ethics is outside of the compliance department and it is interwoven into academic pedagogy—will society be in a better place to stem the tide of potentially unwanted, technological advances."

MIT Now Has a Humanist Chaplain to Help Students With the Ethics of Tech; The Atlantic, May 16, 2018

Isabel Fattal, The Atlantic; MIT Now Has a Humanist Chaplain to Help Students With the Ethics of Tech

"Even some of the most powerful tech companies start out tiny, with a young innovator daydreaming about creating the next big thing. As today’s tech firms receive increased moral scrutiny, it raises a question about tomorrow’s: Is that young person thinking about the tremendous ethical responsibility they’d be taking on if their dream comes true?

Greg Epstein, the recently appointed humanist chaplain at MIT, sees his new role as key to helping such entrepreneurial students think through the ethical ramifications of their work. As many college students continue to move away from organized religion, some universities have appointed secular chaplains like Epstein to help non-religious students lead ethical, meaningful lives. At MIT, Epstein plans to spark conversations about the ethics of technology—conversations that will sometimes involve religious groups on campus, and that may sometimes carry over to Harvard, where he has held (and will continue to hold) the same position since 2005.

I recently spoke with Epstein about how young people can think ethically about going into the tech industry and what his role will look like..."

The tragedy of ‘deaccessioning’ books from university libraries; ABA Journal, May 2018

Bryan A. Garner, ABA Journal; The tragedy of ‘deaccessioning’ books from university libraries

"Book research is well-nigh irreplaceable to the skillful researcher. It can’t, and shouldn’t, be fully superseded by online research, which of course has its own splendors but also its own limitations.

So it’s disheartening to hear what’s happening to our libraries. A lead Associated Press article on Feb. 7 reports that “as students abandon the stacks in favor of online reference material, university libraries are unloading millions of unread volumes in a nationwide purge.” Some books are being hauled off to permanent storage sites; others are being sold en bloc to used-book dealers; and still others are being thrown into dumpsters.


Given that half the library collection at the Indiana University of Pennsylvania has been “uncirculated for 20 years or more,” university administrators decided to purge 170,000 volumes. “Bookshelves are making way for group-study rooms and tutoring centers, ‘makerspaces’ and coffee shops,” the article reports. Oregon State University librarian Cheryl Middleton, president of the Association of College and Research Libraries, is quoted as saying, “We’re kind of like the living room of the campus. We’re not just a warehouse.”

Traditional scholars are outraged. One calls this jettisoning of books “a knife through the heart.” He’s right, of course."

Why ‘Fahrenheit 451’ Is the Book for Our Social Media Age; The New York Times, May 10, 2018

Ramin Bahrani, The New York Times; Why ‘Fahrenheit 451’ Is the Book for Our Social Media Age


[Kip Currier: Looking forward to seeing this May 19th-debuting HBO adaptation of Ray Bradbury's ever-timely Fahrenheit 451 cautionary intellectual freedom tale, starring Michael B. Jordan as a book-burning-fireman-turned-book-preserver.]



"Bradbury believed that we wanted the world to become this way. That we asked for the firemen to burn books. That we wanted entertainment to replace reading and thinking. That we voted for political and economic systems to keep us happy rather than thoughtfully informed. He would say that we chose to give up our privacy and freedom to tech companies. That we decided to entrust our cultural heritage and knowledge to digital archives. The greatest army of firemen will be irrelevant in the digital world. They will be as powerless as spitting babies next to whoever controls a consolidated internet. How could they stop one person, hiding in his parents’ basement with a laptop, from hacking into thousands of years of humanity’s collective history, literature and culture, and then rewriting all of it … or just hitting delete?

And who would notice?"

Monday, May 14, 2018

How copyright law hides work like Zora Neale Hurston’s new book from the public; The Washington Post, May 7, 2018

Ted Genoways, The Washington Post; How copyright law hides work like Zora Neale Hurston’s new book from the public

"Now, according to the Vulture introduction, the Zora Neale Hurston Trust has new representation, interested in getting unpublished works into print and monetizing those archives. That’s great, from a reader’s perspective, but it also reveals a larger problem where scholarship of literature between World War I and II is concerned. It’s mostly due to the Walt Disney Co.’s efforts to protect ownership of a certain cartoon mouse. Over the years, the company has successfully worked to extend copyright restrictions far beyond the limits ever intended by the original authors of America’s intellectual property laws. Under the original Copyright Act of 1790, a work could be protected for 14 years, renewable for another 14-year term if the work’s author was still alive. In time, the maximum copyright grew from 28 years to 56 years and then to 75 years. In 1998, Sonny Bono championed an extension that would protect works created after 1978 for 70 years after the death of the author and the copyright of works created after 1922 to as long as 120 years.


This worked out great for Disney — which, not coincidentally, was founded in 1923 — but less so for the reputations of authors who produced important work between the 1920s and 1950s. Because copyright law became such a tangle, many of these works have truly languished. Here, Hurston is the rule rather than the exception. I have a file that I’ve kept over the years of significant unpublished works by well-known writers from the era: William Faulkner, Langston Hughes, William Carlos Williams, Hart Crane, Sherwood Anderson and Weldon Kees, among others. The works aren’t really “lost,” of course, but they are tied up in a legal limbo. Because of the literary reputations of those writers, their unpublished works will eventually see the light of day — whenever their heirs decide that the royalties are spreading a little too thin and there’s money to be made from new works. But other important writers who are little-known or unknown will remain so because they don’t have easily identifiable heirs — or, worse, because self-interested, or even uninterested executors, control their estates."

Tuesday, May 1, 2018

Anthropology grad students bring Ethics Bowl home; Cornell Chronicle, May 1, 2018

Yvette Lisa Ndlovu, Cornell Chronicle; Anthropology grad students bring Ethics Bowl home

"Cornell’s team won the Society for American Archaeology Ethics Bowl April 12 in Washington, D.C. Cornell was making its first appearance in the competition, which has been held for 14 years.

The Ethics Bowl pits teams of undergraduate and graduate students from different universities in debates about ethical dilemmas archaeologists encounter during their work. Teams are given hypothetical cases and must use their academic knowledge of various ethical guidelines and laws, as well as their research and fieldwork experiences, to formulate and defend their solutions.

Teams are graded on their responses and their handling of “curveball” questions. The cases for this year’s bowl were on occupational safety and heritage management, colonial monuments and indigenous rights, looting and the antiquities trade, plagiarism, and funding for research and ethics training."

Westworld Spoilers Club season 2, episode 2: Reunion. The second episode of the season drops subtle clues with big ramifications; The Verge, April 30, 2018

Bryan Bishop, The Verge; Westworld Spoilers Club season 2, episode 2: Reunion

The second episode of the season drops subtle clues with big ramifications


[SPOILERS BELOW]





"...[O]n the matter of the true agenda of the parks themselves, the episode’s revelations raise questions that the show will almost certainly have to engage. For 30 years, Delos parks have been secretly gathering data on their guests. How is that data being used? Have guests been blackmailed, extorted, or otherwise had the records of their trips used against them as futuristic, Wild West kompromat? And what would the corporate consequences be if the existence of such a project was made public? Given that Bernard was not giving proper access to the drone host lab, it seems evident that only people at the highest levels are aware of the data collection initiative, with non-networked hosts used in the facility to help cut down on the chance of leaks.

Given all that, Peter Abernathy — and the data he’s carrying in his head — becomes much more than just a moving plot device. He is quite literally the future of Delos, Inc. itself. Should he fall into the wrong hands, with the data collection initiative made public, it could take down the entire company. It’s a timely storyline, coming right at the time that online services like Facebook are facing more public scrutiny than ever. And no doubt that’s exactly what Joy and Nolan are aiming for."

Time for journalists to fight back, not play party hosts; The Washington Post, April 30, 2018

Dana Milbank, The Washington Post; Time for journalists to fight back, not play party hosts

"Olivier Knox, the incoming president, has said he wants to make the dinner “boring.”

How about better than boring? Move the dinner back a week, to honor World Press Freedom Day, and cancel the comedians. Instead, read the names of journalists killed doing their jobs over the year; people such as Daphne Caruana Galizia, who reported on government corruption in Malta, killed on Oct. 16, when the car she was driving exploded; and Miroslava Breach Velducea, who reported on politics and crime in Mexico, shot eight times and killed on March 23, 2017, when leaving her home with one of her children. Also, read the names of some jailed journalists and their time behind bars: Turkey’s Zehra Dogan, 323 days; Egypt’s Alaa Abdelfattah, 1,282 days; China’s Ding Lingjie, 221 days; Kyrgyzstan’s Azimjon Askarov, 2,877 days; Congo’s Ghys Fortuné Dombé Bemba, 475 days.

Media companies and personalities, instead of hosting glitzy parties, would make contributions to and solicit funds for groups that protect the free press. And they would pledge to devote more air time and column inches to exposing abuses of press freedoms at home and abroad. The Post did this, successfully, during my colleague Jason Rezaian’s imprisonment in Iran. We should all pledge to be unabashed advocates: to shine light on the journalists languishing in prisons, the unsolved murders of journalists and the erosion of press freedom at home."

Monday, April 30, 2018

Google's Mysterious AI Ethics Board Should Be Transparent Like Axon's; Forbes, April 27, 2018

Sam Shead, Forbes; Google's Mysterious AI Ethics Board Should Be Transparent Like Axon's

"This week, Axon, a US company that develops body cameras for police officers and weapons for the law enforcement market, demonstrated the kind of transparency that Google should aspire towards when it announced an AI ethics board to "help guide the development of Axon's AI-powered devices and services".

Axon said the board's mission is to advise and guide Axon's leaders on the impact of AI technology on communities. The board will meet twice a year and it held its first meeting on Thursday in Scottsdale, Arizona.

"We believe the advancement of AI technology will empower police officers to connect with their communities versus being stuck in front of a computer screen doing data entry," said Axon CEO and founder, Rick Smith, in a statement. "We also believe AI research and technology for use in law enforcement must be done ethically and with the public in mind. This is why we've created the AI ethics board — to ensure any AI technology in public safety is developed responsibly.""

The 7 stages of GDPR grief; VentureBeat, April 29, 2018

Chris Purcell, VentureBeat; The 7 stages of GDPR grief

"All of the systems we’ve built around handling personal data will need to be re-engineered to handle the new General Data Protection Regulation (GDPR) rules that go into effect that day. That’s a lot to accomplish, with very little time left.

While the eve of the GDPR deadline may not start parties like we had back on New Year’s Eve 1999 — when people counted down to “the end of the world” — stakeholders in organizations across the globe will be experiencing a range of emotions as they make their way through the seven stages of GDPR grief at varying speeds.

Like Y2K, May 25 could come and go without repercussion if people work behind the scenes to make their organizations compliant. Unfortunately, most companies are in the earliest stage of grief – denial – believing that GDPR does not apply to them (if they even know what it is). Denial rarely serves companies well. And in the case of GDPR non-compliance, it could cost them fines of up to 20 million euros ($24 million) or four percent of global annual turnover, whichever value is greater.

Luckily, there are sure-tell signs for each grief stage and advice to help individuals and their employers move through each (and fast):..."

Saturday, April 28, 2018

Data on a genealogy site led police to the ‘Golden State Killer’ suspect. Now others worry about a ‘treasure trove of data’; The Washington Post, April 27, 2018

Justin Jouvenal, Mark Berman, Drew Harwell and Tom Jackman, The Washington Post; Data on a genealogy site led police to the ‘Golden State Killer’ suspect. Now others worry about a ‘treasure trove of data’

"Prosecutors say they see the private genealogical databases as an investigative gold mine, and they worry that privacy concerns could block them from the breakthroughs needed to track down future predators.

“Why in God’s name would we come up with a reason that we not be able to use it, on the argument that it intrudes onto someone’s privacy?” said Josh Marquis of the National District Attorneys Association. “Everything’s a trade-off. Obviously we want to preserve privacy. But on the other hand, if we’re able to use this technology without exposing someone’s deepest, darkest secrets, while solving these really horrible crimes, I think it’s a valid trade-off.”

Some legal experts compared the use of public genetic databases to the way authorities can scan other personal data provided to third-party sources, including telephone companies and banks. Others suggested further scrutiny as the amount of publicly available DNA multiplies.

“The law often lags behind where technology has evolved,” said Barbara McQuade, a University of Michigan law professor and former U.S. attorney. With DNA, “most of us have the sense that that feels very private, very personal, and even if you have given it up to one of these third-party services, maybe there should be a higher level of security.”"

Thursday, April 26, 2018

Facebook finally explains why it bans some content, in 27 pages; The Washington Post, April 24, 2018

Elizabeth Dwoskin and Tracy Jan, The Washington Post; Facebook finally explains why it bans some content, in 27 pages

"“We want people to know our standards, and we want to give people clarity,” Monika Bickert, Facebook’s head of global policy management, said in an interview. She added that she hoped publishing the guidelines would spark dialogue. “We are trying to strike the line between safety and giving people the ability to really express themselves.”"

The Guardian view on privacy online: a human right; The Guardian, April 26, 2018

Editorial Board, The Guardian; The Guardian view on privacy online: a human right

"Encryption on the internet will be abused, but better that than a society where no one is allowed secrets from the government"

Wednesday, April 25, 2018

In global AI race, Europe pins hopes on ethics; Politico, April 25, 2018

Janosch Delcker, Politico; In global AI race, Europe pins hopes on ethics


"One of the central goals in the EU strategy is to provide customers with insight into the systems.

That could be easier said than done.

“Algorithmic transparency doesn’t mean [platforms] have to publish their algorithms,” Ansip said, “but ‘explainability’ is something we want to get.”

AI experts say that to achieve such explainability, companies will, indeed, have to disclose the codes they’re using – and more.

Virginia Dignum, an AI researcher at the Delft University of Technology, said “transparency of AI is more than just making the algorithm transparent,” adding that companies should also have to disclose details such as which data was used to train their algorithms, which data are used to make decisions, how this data was collected, or at which point in the process humans were involved in the decision-making process."