The Ebook version of my Bloomsbury book "Ethics, Information, and Technology" will be published on December 11, 2025, and the Hardback and Paperback versions will be available on January 8, 2026. Preorders are available via Amazon and this Bloomsbury webpage: https://www.bloomsbury.com/us/ethics-information-and-technology-9781440856662/
Charlie Warzel, The Atlantic; YouTube Bends the Knee
"This is just the latest example of major tech companies bowing to Trump."
Priya Bharadia, The Guardian; Cat soap operas and babies trapped in space: the ‘AI slop’ taking over YouTube
"One expert said AI video generators herald the next wave of internet “enshittification”, a term first used by the British-Canadian author Cory Doctorow. Coined in 2022, Doctorow used it to describe the decline in quality of users’ online experiences, as platforms prioritise profit over offering high-quality content.
“AI slop is flooding the internet with content that essentially is garbage,” said Dr Akhil Bhardwaj, an associate professor at the University of Bath’s school of management. “This enshittification is ruining online communities on Pinterest, competing for revenue with artists on Spotify and flooding YouTube with poor quality content.”
“One way for social media companies to regulate AI slop is to ensure that it cannot be monetised, thus stripping away the incentive for generating it.”
Ryan Broderick, the author of the popular Garbage Day newsletter on internet culture, is scathing about the impact of AI video, writing last week that YouTube has become a “dumping ground for disturbing, soulless AI shorts”."
Nico Grant and Tripp Mickle, The New York Times; YouTube Pirates Are Cashing In on Hollywood’s Summer Blockbusters
"But the company also had cause to be concerned. In the days after the Disney film’s opening, a pirated version of “Lilo & Stitch” proved to be a hit on YouTube, where more than 200,000 people viewed it, potentially costing Disney millions of dollars in additional sales, according to new research from Adalytics, a firm that analyzes advertising campaigns for brands.
The findings of the research shed new light on the copyright issues that once threatened to upend YouTube’s business. They also show how advertisers have unwittingly supported illicit content on YouTube, and they provide rare data about piracy on the platform."
TechCrunch; In AI copyright case, Zuckerberg turns to YouTube for his defense
Nico Grant, The New York Times; Election Falsehoods Take Off on YouTube as It Looks the Other Way
"From May through August, researchers at Media Matters tracked 30 of the most popular YouTube channels they identified as persistently spreading election misinformation, to analyze the narratives they shared in the run-up to November’s election.
The 30 conservative channels posted 286 videos containing election misinformation, which racked up more than 47 million views. YouTube generated revenue from more than a third of those videos by placing ads before or during them, researchers found. Some commentators also made money from those videos and other monetized features available to members of the YouTube Partner Program...
Mr. Giuliani, the former New York mayor, posted more false electoral claims to YouTube than any other major commentator in the research group, the analysis concluded...
YouTube, which is owned by Google, has prided itself on connecting viewers with “authoritative information” about elections. But in this presidential contest, it acted as a megaphone for conspiracy theories."
Dan Milmo, The Guardian; YouTube to offer option to flag AI-generated songs that mimic artists’ voices
"Record companies can request the removal of songs that use artificial intelligence-generated versions of artists’ voices under new guidelines issued by YouTube.
The video platform is introducing a tool that will allow music labels and distributors to flag content that mimics an artist’s “unique singing or rapping voice”.
Fake AI-generated music has been one of the side-effects of leaps forward this year in generative AI – the term for technology that can produce highly convincing text, images and voice from human prompts.
One of the most high-profile examples is Heart on My Sleeve, a song featuring AI-made vocals purporting to be Drake and the Weeknd. It was pulled from streaming services after Universal Music Group, the record company for both artists, criticised the song for “infringing content created with generative AI”. However, the song can still be accessed by listeners on YouTube."
Maya Yang, The Guardian; California police department investigates officers blaring Disney music
"A California police department has launched an investigation into its own officers who were filmed blaring copyrighted Disney music in attempts to prevent residents from recording them...
The incident reflects an apparently growing trend in which police officers play copyrighted music in order to prevent videos of them from being posted on to social media platforms such as YouTube and Instagram, which can remove content that includes unauthorized content."
Charles Kaiser, The Guardian; Sandy Hook review: anatomy of an American tragedy – and the obscenity of social media
"Those recommendations are the result of the infernal algorithms which are at the heart of the business models of Facebook and YouTube and are probably more responsible for the breakdown in civil society in the US and the world than anything else invented.
“We thought the internet would give us this accelerated society of science and information,” says Lenny Pozner, whose son Noah was one of the Sandy Hook victims. But “really, we’ve gone back to flat earth”."
"While Ms. Howerton and her supporters report Twitter accounts for abuse, she is also asking YouTube to take down the video commentary that makes use of her video and other family images. She has filed a privacy complaint, which YouTube rejected, and is waiting for it to respond to her new complaint, alleging copyright violation. Neil Richards, a law professor at Washington University and author of “Intellectual Privacy: Rethinking Civil Liberties in the Digital Age,” said he thinks Ms. Howerton’s belief that she can regain control of the footage may be overly optimistic. “The use of home video and family images for political debate is something that has real consequences,” he said. “She has made her life choices, her experiences, her children’ experiences, a matter for public debate. When people do this they do expose themselves to criticism and attacks and some of them are quite unpleasant.” Eric Goldman, a professor of law and director of the High Tech Law Institute at Santa Clara University School of Law, agreed that because Ms. Howerton herself used family video as part of a political discussion, she may have little legal recourse when that video is used as part of a larger video engaged in social commentary on the same topic. In many situations, videos or pictures posted online can become “fair game” for critics to use in online attacks against the poster’s position or for other undesirable political or social statements, Mr. Goldman said in an email."
"The challenge of chasing down copyright infringers has led content owners, in general, to claim the safe harbor rules are too lax, and that platforms like YouTube should do more to take down unauthorized videos. Studios have filed a spate of lawsuits to argue that more websites should be liable under a “red flag” provision in the copyright law, which can strip a site’s legal immunity in the event they obviously should have known about the infringement, or if they are directly making money from it. But so far those lawsuits, including a long-running one against YouTube, have not really changed websites’ responsibilities when it comes to copyright, according to Lothar Determann, a copyright lawyer with Baker & McKenzie in San Francisco. He added more broadly that the law’s larger goal of protecting tech platforms still applies, and courts will not order websites to conduct copyright investigations. The freebooter issue for Facebook, then, appears to be less of a legal problem than a moral one. Video owners may come to blame Facebook – safe harbors notwithstanding – for using their content to get rich while flouting their copyright concerns. Such claims, whether fair or not, have dogged Google and YouTube for years, and led to legal and political headaches."
"In a complicated legal battle that touches on questions of free speech, copyright law and personal safety, a federal appeals court has overturned an order that had forced the Google-owned YouTube to remove an anti-Muslim video from its website last year. Both of the recent decisions about the controversial "Innocence Of Muslims" video originated with the 9th U.S. Circuit Court of Appeals. Last year, a three-judge panel agreed with actress Cindy Lee Garcia's request to have the film taken down from YouTube on the basis of a copyright claim. But Monday, the full en banc court rejected Garcia's claim. "The appeal teaches a simple lesson — a weak copyright claim cannot justify censorship in the guise of authorship," Circuit Judge M. Margaret McKeown wrote in the court's opinion."