Showing posts with label Big Tech.

Wednesday, December 27, 2023

Classical Musicians Victimized by Erroneous Copyright Claims; Violinist.com, December 19, 2023

 Laurie Niles, Violinist.com; Classical Musicians Victimized by Erroneous Copyright Claims

""One or more actions were applied to your video because of a copyright match."

This was just one of two copyright claims that Amy Beth Horman received from Facebook Thursday, disputing ownership of videos of her daughter's violin performances. First, she received a copyright claim for a video of Ava's live performance of the Mendelssohn Violin Concerto this week. Then, she got another for a video she had posted in 2020 of then-10-year-old Ava performing "Meditation from Thais." These are both classical works that are in the public domain - not subject to copyright.

Nonetheless, classical musicians receive these kinds of dreaded messages on a regular basis if they post videos of their performances on social media outlets such as Facebook, Instagram or YouTube.

Has the musician violated anyone's copyright? Almost never. These are automated copyright claims created by bots on behalf of big companies like Sony Music Entertainment, Warner Music Group or Universal Music. If the bot finds that your performance has approximately the same notes and timing as one in their catalogue, they then claim that they own rights to your recording. But musicians have every right to perform and post a public domain work. Even so, musicians often find their recordings muted, earnings from ads on their performances given instead to the company filing the erroneous claim, and threats of having their accounts suspended or banned."
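
For illustration only (not from the article): a minimal sketch of how such a bot-driven matcher might flag a performance, assuming each recording has already been reduced to a list of (MIDI pitch, onset time) pairs; the function names, catalogue, and 0.8 threshold are hypothetical.

from difflib import SequenceMatcher

def quantize(notes, step=0.25):
    # Round onset times into coarse buckets so small timing differences
    # between performances still line up.
    return [(pitch, round(onset / step)) for pitch, onset in notes]

def similarity(performance, reference):
    # Fraction of (pitch, quantized-onset) pairs the two sequences share.
    return SequenceMatcher(None, quantize(performance), quantize(reference)).ratio()

def flag_matches(upload, catalogue, threshold=0.8):
    # Claim anything whose notes and timing roughly match a catalogue entry,
    # regardless of whether the underlying work is in the public domain.
    return [title for title, reference in catalogue.items()
            if similarity(upload, reference) >= threshold]

# A public-domain piece "matches" a label's recording of the same work,
# which is how erroneous claims can be generated.
catalogue = {"Meditation from Thais (label recording)": [(69, 0.0), (74, 1.0), (76, 2.0)]}
upload = [(69, 0.05), (74, 1.02), (76, 2.10)]
print(flag_matches(upload, catalogue))  # ['Meditation from Thais (label recording)']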

Thursday, December 14, 2023

Big Tech funds the very people who are supposed to hold it accountable; The Washington Post, December 7, 2023

The Washington Post; Big Tech funds the very people who are supposed to hold it accountable

"“Big Tech has played this game really successfully in the past decade,” said Lawrence Lessig, a Harvard Law School professor who previously founded Stanford’s Center for Internet and Society without raising money outside the university. “The number of academics who have been paid by Facebook alone is extraordinary.”

Most tech-focused academics say their work is not influenced by the companies, and the journals that publish their studies have ethics rules designed to ward off egregious interference. But in interviews, two dozen professors said that by controlling funding and access to data, tech companies wield “soft power,” slowing down research, sparking tension between academics and their institutions, and shifting the fields’ targets in small — but potentially transformative — ways...

Harvard’s Lessig, who spent years heading a center on ethics issues in society at the university, is developing a system for academics to verify that their research is truly independent. He hopes to present the initiative, the Academic Integrity Project, to the American Academy of Arts and Sciences.

He is still looking for funding."

Monday, October 30, 2023

Biden plans to step up government oversight of AI with new 'pressure tests'; NPR, October 30, 2023

NPR; Biden plans to step up government oversight of AI with new 'pressure tests'

"President Biden on Monday will take sweeping executive action to try to establish oversight of the rapidly evolving artificial intelligence sector, setting new standards for safety tests for AI products – as well as a system for federal "pressure tests" of major systems, White House chief of staff Jeff Zients told NPR.

Months in the making, the executive order reflects White House concerns that the technology, left unchecked, could pose significant risks to national security, the economy, public health and privacy. The announcement comes just days ahead of a major global summit on AI taking place in London, which Vice President Harris will attend."

Thursday, October 26, 2023

How Americans View Data Privacy; Pew Research Center, October 18, 2023

COLLEEN MCCLAIN, MICHELLE FAVERIO, MONICA ANDERSON AND EUGENIE PARK, Pew Research Center; How Americans View Data Privacy

"In an era where every click, tap or keystroke leaves a digital trail, Americans remain uneasy and uncertain about their personal data and feel they have little control over how it’s used.

This wariness is even ticking up in some areas like government data collection, according to a new Pew Research Center survey of U.S. adults conducted May 15-21, 2023.

Today, as in the past, most Americans are concerned about how companies and the government use their information. But there have been some changes in recent years:"

Saturday, October 14, 2023

Inside the Israel-Hamas information war, from insider attacks to fleeing leaders; The New York Times, October 14, 2023

The New York Times; Inside the Israel-Hamas information war, from insider attacks to fleeing leaders

"While social media has been a critical tool for disseminating wartime information in recent days, a barrage of images, memes and testimonials is making it difficult to assess what is real...

“Right now,” said Hultquist, “it’s very difficult for a lay person to get to ground truth.”"

Thursday, September 14, 2023

In Show of Force, Silicon Valley Titans Pledge ‘Getting This Right’ With A.I.; The New York Times, September 13, 2023

Cecilia Kang, The New York Times; In Show of Force, Silicon Valley Titans Pledge ‘Getting This Right’ With A.I.

"Elon Musk warned of civilizational risks posed by artificial intelligence. Sundar Pichai of Google highlighted the technology’s potential to solve health and energy problems. And Mark Zuckerberg of Meta stressed the importance of open and transparent A.I. systems.

The tech titans held forth on Wednesday in a three-hour meeting with lawmakers in Washington about A.I. and future regulations. The gathering, known as the A.I. Insight Forum, was part of a crash course for Congress on the technology and organized by the Senate leader, Chuck Schumer, Democrat of New York.

The meeting — also attended by Bill Gates, a founder of Microsoft; Sam Altman of OpenAI; Satya Nadella of Microsoft; and Jensen Huang of Nvidia — was a rare congregation of more than a dozen top tech executives in the same room. It amounted to one of the industry’s most proactive shows of force in the nation’s capital as companies race to be at the forefront of A.I. and to be seen to influence its direction."

Friday, July 28, 2023

Lindsey Graham and Elizabeth Warren: When It Comes to Big Tech, Enough Is Enough; The New York Times, July 27, 2023

Lindsey Graham and Elizabeth Warren, The New York Times; Lindsey Graham and Elizabeth Warren: When It Comes to Big Tech, Enough Is Enough

"For more than a century, Congress has established regulatory agencies to preserve innovation while minimizing harm presented by emerging industries. In 1887 the Interstate Commerce Commission took on railroads. In 1914 the Federal Trade Commission took on unfair methods of competition and later unfair and deceptive acts and practices. In 1934 the Federal Communications Commission took on radio (and then television). In 1975 the Nuclear Regulatory Commission took on nuclear power, and in 1977 the Federal Energy Regulatory Commission took on electricity generation and transmission. We need a nimble, adaptable, new agency with expertise, resources and authority to do the same for Big Tech.

Our Digital Consumer Protection Commission Act would create an independent, bipartisan regulator charged with licensing and policing the nation’s biggest tech companies — like Meta, Google and Amazon — to prevent online harm, promote free speech and competition, guard Americans’ privacy and protect national security. The new watchdog would focus on the unique threats posed by tech giants while strengthening the tools available to the federal agencies and state attorneys general who have authority to regulate Big Tech.

Our legislation would guarantee common-sense safeguards for everyone who uses tech platforms."

Saturday, July 15, 2023

‘Not for Machines to Harvest’: Data Revolts Break Out Against A.I.; The New York Times, July 15, 2023

Sheera Frenkel, The New York Times; ‘Not for Machines to Harvest’: Data Revolts Break Out Against A.I.

"At the heart of the rebellions is a newfound understanding that online information — stories, artwork, news articles, message board posts and photos — may have significant untapped value.

The new wave of A.I. — known as “generative A.I.” for the text, images and other content it generates — is built atop complex systems such as large language models, which are capable of producing humanlike prose. These models are trained on hoards of all kinds of data so they can answer people’s questions, mimic writing styles or churn out comedy and poetry...

“What’s happening here is a fundamental realignment of the value of data,” said Brandon Duderstadt, the founder and chief executive of Nomic, an A.I. company...

“The data rebellion that we’re seeing across the country is society’s way of pushing back against this idea that Big Tech is simply entitled to take any and all information from any source whatsoever, and make it their own,” said Ryan Clarkson, the founder of Clarkson...

Eric Goldman, a professor at Santa Clara University School of Law, said the lawsuit’s arguments were expansive and unlikely to be accepted by the court. But the wave of litigation is just beginning, he said, with a “second and third wave” coming that would define A.I.’s future."

Friday, April 22, 2022

Barack Obama Takes On a New Role: Fighting Disinformation; The New York Times, April 21, 2022

Steven Lee Myers, The New York Times; Barack Obama Takes On a New Role: Fighting Disinformation

The former president has embarked on a campaign to warn that the scourge of online falsehoods has eroded the foundations of democracy.

"Mr. Obama’s approach to the issue has been characteristically deliberative. He has consulted the chief executives of Apple, Alphabet and others. Through the Obama Foundation in Chicago, he has also met often with the scholars the foundation has trained; they recounted their own experiences with disinformation in a variety of fields around the world.

From those deliberations, potential solutions have begun taking shape, a theme he plans to outline broadly on Thursday. While Mr. Obama maintains that he remains “close to a First Amendment absolutist,” he has focused on the need for greater transparency and regulatory oversight of online discourse — and the ways companies have profited from manipulating audiences through their proprietary algorithms."

Friday, April 15, 2022

Tim Cook delivers speech railing against “data industrial complex,” sideloading; Ars Technica, April 12, 2022

Ars Technica; Tim Cook delivers speech railing against “data industrial complex,” sideloading
"Apple CEO Tim Cook took to the stage at the annual International Association of Privacy Professionals (IAPP) conference on Tuesday to talk about privacy, security, ad tracking, and sideloading.

Calling privacy "one of the most essential battles of our time," Cook lambasted companies that monetize large user-data collection operations, comparing them to real-world stalkers."

Thursday, March 24, 2022

E.U. Takes Aim at Big Tech’s Power With Landmark Digital Act; The New York Times, March 24, 2022

Adam Satariano, The New York Times; E.U. Takes Aim at Big Tech’s Power With Landmark Digital Act

The European Union was expected to finalize the Digital Markets Act, the most sweeping legislation to regulate tech since a European privacy law was passed in 2018.

"Officials on Thursday were putting the finishing touches on the law, called the Digital Markets Act, which would be the most sweeping piece of digital policy since the bloc put the world’s toughest rules to protect people’s online data into effect in 2018. The legislation is aimed at stopping the largest tech platforms from using their interlocking services and considerable resources to box in users and squash emerging rivals, creating room for new entrants and fostering more competition.

What that means practically is that companies like Google could no longer collect data from different services to offer targeted ads without users’ consent and that Apple might have to allow alternatives to its App Store on iPhones and iPads. Violators of the law, which would most likely take effect early next year, could face significant fines.

The Digital Markets Act is part of a one-two punch by European regulators. As early as next month, the European Union is expected to reach an agreement on a law that would force social media companies such as Meta, the owner of Facebook and Instagram, to police their platforms more aggressively."

Wednesday, March 23, 2022

The ex-Google researcher staring down Big Tech; Politico, March 18, 2022

BRAKKTON BOOKER, Politico; The ex-Google researcher staring down Big Tech

"THE RECAST:  President Biden ran on a platform promising to root out inequities in federal agencies and programs. Has his administration done enough to tackle the issue of discriminatory AI?

GEBRU: I'm glad to see that some initiatives have started. I like that the Office of Science and Technology Policy (OSTP), for instance, is filled with people I respect, like Alondra Nelson, who is now its head.

My biggest comment on this is that a lot of tech companies — all tech companies, actually — don't have to do any sort of test to prove that they're not putting out harmful products...

The burden of proof always seems to be on us...The burden of proof should be on these tech companies."

Monday, March 14, 2022

Sandy Hook review: anatomy of an American tragedy – and the obscenity of social media; The Guardian, March 13, 2022

The Guardian; Sandy Hook review: anatomy of an American tragedy – and the obscenity of social media

"Those recommendations are the result of the infernal algorithms which are at the heart of the business models of Facebook and YouTube and are probably more responsible for the breakdown in civil society in the US and the world than anything else invented.

“We thought the internet would give us this accelerated society of science and information,” says Lenny Pozner, whose son Noah was one of the Sandy Hook victims. But “really, we’ve gone back to flat earth”."

Monday, February 28, 2022

How to avoid falling for and spreading misinformation about Ukraine; The Washington Post, February 24, 2022

Heather Kelly, The Washington Post; How to avoid falling for and spreading misinformation about Ukraine

"Anyone with a phone and an Internet connection is able to watch the war in Ukraine unfold live online, or at least some version of it. Across social media, posts are flying up faster than most fact-checkers and moderators can handle, and they’re an unpredictable mix of true, fake, out of context and outright propaganda messages.

How do you know what to trust, what not to share and what to report? Tech companies have said they’re trying to do more to help users spot misinformation about Ukraine, with labels and fact checking. On Saturday, Facebook parent company Meta announced it was adding more fact-checkers in the region dedicated to posts about the war. It’s also warning users who attempt to share war-related photos when they’re more than a year old — a common type of misinformation.

Here are some basic tools everyone should use when consuming breaking news online."

Saturday, February 19, 2022

AirTags are being used to track people and cars. Here's what is being done about it; NPR, February 18, 2022

MICHAEL LEVITT, NPR; AirTags are being used to track people and cars. Here's what is being done about it

""As technology becomes more sophisticated and advanced, as wonderful as that is for society, unfortunately, it also becomes much easier to misuse and abuse," she told NPR. "I wouldn't say that we've necessarily seen an uptick with the use of AirTags any more or less than any cutting edge technology."

Williams said that what was rare was a technology company taking the issue seriously and moving to address it.

"[Apple is] not only listening to the field, but actively reaching out at times to do safety checks. That in and of itself might sound like a very small step, but it's rare," she said.

Still, Galperin thinks that Apple should have done more to protect people ahead of time. 

"The mitigations that Apple had in place at the time that the AirTag came out were woefully insufficient," Galperin said. 

"I think that Apple has been very careful and responsive after putting the product out and introducing new mitigations. But the fact that they chose to bring the product to market in the state that it was in last year, is shameful.""

Friday, February 18, 2022

U.S. Copyright Office Consultation Triggers Massive “Upload Filter” Opposition; TorrentFreak, February 16, 2022

Ernesto Van der Sar, TorrentFreak; U.S. Copyright Office Consultation Triggers Massive “Upload Filter” Opposition

"Late 2020, Senator Thom Tillis released a discussion draft of the “Digital Copyright Act” (DCA), which aims to be a successor to the current DMCA.

The DCA hints at far-reaching changes to the way online intermediaries approach the piracy problem. Among other things, these services would have to ensure that pirated content stays offline after it’s taken down once.

This “takedown and staydown” approach relies on technical protection tools, which include upload filters. This is a sensitive subject that previously generated quite a bit of pushback when the EU drafted its Copyright Directive.

To gauge the various options and viewpoints, the Copyright Office launched a series of consultations on the various technical tools that can help to detect and remove pirated content from online platforms.

This effort includes a public consultation where various stakeholders and members of the public were invited to share their thoughts, which they did en masse."
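
For illustration only (not described in this form by TorrentFreak or the Copyright Office): a minimal sketch of the “takedown and staydown” idea, assuming a platform keeps fingerprints of files removed after a notice; the exact SHA-256 fingerprint used here is a stand-in for the perceptual matching a real upload filter would need.

import hashlib

# Hypothetical "takedown and staydown" filter: once content is removed
# after a copyright notice, later uploads of the same content are blocked.
takedown_fingerprints = set()

def fingerprint(data: bytes) -> str:
    # Toy fingerprint: an exact hash of the bytes. Real upload filters
    # would need perceptual matching that survives re-encoding.
    return hashlib.sha256(data).hexdigest()

def register_takedown(data: bytes) -> None:
    # Record a file that was removed in response to a takedown notice.
    takedown_fingerprints.add(fingerprint(data))

def allow_upload(data: bytes) -> bool:
    # Reject any upload whose fingerprint matches removed content.
    return fingerprint(data) not in takedown_fingerprints

clip = b"example encoded video bytes"
register_takedown(clip)
print(allow_upload(clip))            # False: blocked on re-upload ("staydown")
print(allow_upload(b"another clip")) # True: unrelated content passes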

Tuesday, February 15, 2022

Opinion: A lawsuit against Google points out a much bigger privacy problem; The Washington Post, February 14, 2022

Editorial Board, The Washington Post; Opinion: A lawsuit against Google points out a much bigger privacy problem

"The phenomenon the recent suits describe, after all, is not particular to Google but rather endemic to almost the entirety of the Web: Companies get to set all the rules, as long as they run those rules by consumers in convoluted terms of service that even those capable of decoding the legalistic language rarely bother to read. Other mechanisms for notice and consent, such as opt-outs and opt-ins, create similar problems. Control for the consumer is mostly an illusion. The federal privacy law the country has sorely needed for decades would replace this old regime with meaningful limitations on what data companies can collect and in what contexts, so that the burden would be on them not to violate the reasonable expectations of their users, rather than placing the burden on the users to spell out what information they will and will not allow the tech firms to have.

The question shouldn’t be whether companies gather unnecessary amounts of sensitive information about their users sneakily — it should be whether companies amass these troves at all. Until Congress ensures that’s true for the whole country, Americans will be clicking through policies and prompts that do little to protect them."

Thursday, February 10, 2022

The Sandy Hook Father Who Refused to Let Alex Jones Win; The New York Times, February 10, 2022

Kara Swisher, The New York Times; The Sandy Hook Father Who Refused to Let Alex Jones Win

Conspiracy theories have loomed over the school shooting in which his son, Noah, died. Leonard Pozner reflects on how the truth can triumph online.


"I’m Kara Swisher, and you’re listening to “Sway.” Today I want to talk about how the information age has become the misinformation age. From Covid deniers to QAnon enthusiasts and big lie believers, it sometimes feels like we live in a post-truth world. In fact, it feels like we already live there all the time now.

My guest today is no stranger to that. Leonard Pozner is the father of Noah, who, in 2012, at only age six, was murdered at the school shooting at Sandy Hook Elementary. Noah was one of 20 children and six educators who lost their lives in that massacre. And while much of America mourned the tragedy, some did not. Rumors abounded online, calling Sandy Hook a hoax. Amongst the chief conspiracists, conservative talk show host and founder of Infowars, Alex Jones."

Thursday, January 27, 2022

Stephen G. Breyer may shape tech’s copyright battles for years to come; The Washington Post, January 27, 2022

Cristiano Lima with research by Aaron Schaffer, The Washington Post; Stephen G. Breyer may shape tech’s copyright battles for years to come

"Stephen G. Breyer may shape tech’s copyright battles for years to come

With the looming retirement of Supreme Court Justice Stephen G. Breyer, tech policy wonks say the high court is losing one of the nation’s preeminent thought leaders on intellectual property and copyright.

But while Breyer may be on his way out of federal court, his influence over those standards, and how they map onto emerging technologies, is poised to live on long after.

For decades, Breyer has carved out a unique role on the bench as a copyright specialist, said Meredith Rose, senior policy counsel at consumer group Public Knowledge. And his advocacy for a more limited view of intellectual property rights than some of his colleagues, such as the late Justice Ruth Bader Ginsburg, made him a “rarity” in the space, Rose said. 

“He’s definitely got the biggest depth of experience in copyright issues on the bench currently,” she said. “It was really him and Justice Ginsburg were the two titans of copyright.”

Corynne McSherry, legal director at the Electronic Frontier Foundation, called Breyer “a very strong voice for a balanced intellectual property system” that ensured that copyright and patents are “encouraging innovation, encouraging new creativity … as opposed to thwarting it.”

These traits, they said, were exemplified in one of Breyer’s most recent high-profile copyright cases: the contentious, decade-long Google v. Oracle bout."

Sunday, January 23, 2022

The Humanities Can't Save Big Tech From Itself; Wired, January 12, 2022

Wired; The Humanities Can't Save Big Tech From Itself

 "I’ve been studying nontechnical workers in the tech and media industries for the past several years. Arguments to “bring in” sociocultural experts elide the truth that these roles and workers already exist in the tech industry and, in varied ways, always have. For example, many current UX researchers have advanced degrees in sociology, anthropology, and library and information sciences. And teachers and EDI (Equity, Diversity, and Inclusion) experts often occupy roles in tech HR departments.

Recently, however, the tech industry is exploring where nontechnical expertise might counter some of the social problems associated with their products. Increasingly, tech companies look to law and philosophy professors to help them through the legal and moral intricacies of platform governance, to activists and critical scholars to help protect marginalized users, and to other specialists to assist with platform challenges like algorithmic oppression, disinformation, community management, user wellness, and digital activism and revolutions. These data-driven industries are trying hard to augment their technical know-how and troves of data with social, cultural, and ethical expertise, or what I often refer to as “soft” data.

But you can add all of the soft data workers you want and little will change unless the industry values that kind of data and expertise. In fact, many academics, policy wonks, and other sociocultural experts in the AI and tech ethics space are noticing a disturbing trend of tech companies seeking their expertise and then disregarding it in favor of more technical work and workers...

Finally, though the librarian profession is often cited as one that might save Big Tech from its disinformation dilemmas, some in LIS (Library and Information Science) argue they collectively have a long way to go before they’re up to the task. Safiya Noble noted the profession’s (just over 83% white) “colorblind” ideology and sometimes troubling commitment to neutrality. This commitment, the book Knowledge Justice explains, leads to many librarians believing, “Since we serve everyone, we must allow materials, ideas, and values from everyone.” In other words, librarians often defend allowing racist, transphobic, and other harmful information to stand alongside other materials by saying they must entertain “all sides” and allow people to find their way to the “best” information. This is the exact same error platforms often make in allowing disinformation and abhorrent content to flourish online."