Showing posts with label Big Tech. Show all posts

Sunday, March 29, 2026

Meta’s court losses spell potential trouble for AI research, consumer safety; CNBC, March 29, 2026

Jonathan Vanian, CNBC; Meta’s court losses spell potential trouble for AI research, consumer safety

"Over a decade ago, Meta – then known as Facebook – hired social science researchers to analyze how the social network’s services were affecting users. It was a way for the company and its peers to show they were serious about understanding the benefits and potential risks of their innovations. 

But as Meta’s court losses this week illustrate, the researchers’ work can become a liability. Brian Boland, a former Facebook executive who testified in both trials — one in New Mexico and the other in Los Angeles — says the damning findings from Meta’s internal research and documents seemed to contradict the way the company portrayed itself publicly. Juries in the two trials determined that Meta inadequately policed its site, putting kids in harm’s way. 

Mark Zuckerberg’s company began clamping down on its research teams a few years ago after a Facebook researcher, Frances Haugen, became a prominent whistleblower. The newer crop of tech companies, like OpenAI and Anthropic, subsequently invested heavily in researchers and charged them with studying the impact of modern AI on users and publishing their findings. 

With AI now getting outsized attention for the harmful effects it’s having on some users, those companies must ask if it’s in their best interest to continue funding research or to suppress it."

Thursday, March 26, 2026

The Terrible Cost of the Infinite Scroll; The New York Times, March 26, 2026

The New York Times; The Terrible Cost of the Infinite Scroll

"It finally happened: Social media companies have been held accountable for the toxicity of their algorithmic grip.

In a first ruling of its kind, a California Superior Court jury found Wednesday that Meta and YouTube harmed a user through their addictive design choices.

The consequences for the industry could be significant. This case is only one of thousands set to be litigated across the country, and courts are seeking to consolidate them. This could wind up with a single significant settlement similar to the agreement that the four largest cigarette makers made in 1998 to resolve lawsuits for an estimated $206 billion as part of a master agreement with 46 states.

Compensating people for the harm caused by their products is just the silver lining. The real win would be if the social media giants were finally forced to design less harmful products."

Is Big Tech Facing a Big Tobacco Moment?; The New York Times, March 26, 2026

Andrew Ross Sorkin, Bernhard Warner, Sarah Kessler, Michael J. de la Merced, Niko Gallogly and Brian O’Keefe, The New York Times; Is Big Tech Facing a Big Tobacco Moment?

Back-to-back courtroom losses have put technology giants, including Meta and Google, in uncertain territory as they face lawsuits and bans on teen users.

"Andrew here. Back in 2018, I moderated a panel at the World Economic Forum that included Marc Benioff of Salesforce. It was then that he essentially declared that Facebook was the modern-day equivalent of cigarettes, and that it and other social media companies should be regulated as such.

Well, Meta’s loss in court on Wednesday, in a case about whether its platforms were designed to be addictive to adolescents, may be a watershed. Investors don’t seem to be fazed — the company’s shares hardly moved after the verdict came out — but the decision could change the conversation around the company yet again. More below...

Some legal experts wonder if Big Tech is staring at a Big Tobacco moment, a reference to how cigarette makers had to overhaul their businesses — at a huge expense — after courts ruled that some of their products were addictive and harmful.

“We’re in a new era, a digital era, where we have to rethink definitions for products based on which entities might have superior information to prevent these injuries and accidents,” Catherine Sharkey, a professor of law at N.Y.U., told The Times. She added that the “implications” of those verdicts were “very, very big.”

“This has potentially large impacts on other areas in tech, A.I. and beyond that,” Jessica Nall, a San Francisco lawyer who represents tech companies and executives, told The Wall Street Journal. “The floodgates are already open.”

Meta and Google plan to appeal. The companies have signaled that they will fight efforts to make them drastically redesign their products and algorithms."

Juries Take the Lead in the Push for Child Online Safety; The New York Times, March 26, 2026

The New York Times; Juries Take the Lead in the Push for Child Online Safety

A pair of verdicts held social media companies accountable for harming young users, highlighting a growing backlash as Congress struggles to pass legislation.

"But this week, two juries held social media companies accountable for harming young users.

In Los Angeles on Wednesday, a jury decided in favor of a plaintiff who had claimed that Meta and YouTube hooked her with addictive features — a verdict validating a novel legal strategy holding the companies accountable for personal injury. And a day earlier in New Mexico, a jury found Meta liable for violating state law by failing to safeguard users of its apps from child predators.

The landmark decisions highlight a growing backlash against social media and its effects on young people, including criticism from parents and policymakers around the globe that it is contributing to a youth mental health crisis. And they show that the push for change may finally be gaining steam.

U.S. lawmakers said on Wednesday that the verdicts underscored the need for child safety legislation. Senators Marsha Blackburn, Republican of Tennessee, and Richard Blumenthal, Democrat of Connecticut, called for legislators to pass their bill, the Kids Online Safety Act.

Federal momentum would build on laws in more than 30 states banning phones in schools. Globally, Australia in December banned social media for those under 16. Spain, Denmark, France, Malaysia and Indonesia are considering similar restrictions."

Wednesday, March 25, 2026

Meta and YouTube Found Negligent in Landmark Social Media Addiction Case; The New York Times, March 25, 2026

Cecilia Kang and Ryan Mac, The New York Times; Meta and YouTube Found Negligent in Landmark Social Media Addiction Case

A jury found the companies negligent in their app designs, harming a young user with design features that were addictive and led to her mental health distress.

"The social media company Meta and the video streaming service YouTube harmed a young user with design features that were addictive and led to her mental health distress, a jury found on Wednesday, a landmark decision that could open social media companies to more lawsuits over users’ well-being.

Meta and YouTube must pay $3 million in compensatory damages for pain and suffering and other financial burdens. Meta is responsible for 70 percent of that cost and YouTube for the remainder.

The bellwether case, which was brought by a now 20-year-old woman identified as K.G.M., had accused social media companies of creating products as addictive as cigarettes or digital casinos. K.G.M. sued Meta, which owns Instagram and Facebook, and Google’s YouTube over features like infinite scroll and algorithmic recommendations that she claimed led to anxiety and depression.

The jury of seven women and five men will deliberate further to decide what punitive damages the companies should pay for malice or fraud."

Sunday, March 15, 2026

Social Media Isn’t Just Speech. It’s Also a Defective, Hazardous Product.; The New York Times, March 14, 2026

The New York Times; Social Media Isn’t Just Speech. It’s Also a Defective, Hazardous Product.

"For two decades now, social media companies have been virtually untouchable, profitably floating above accusations that they normalize propaganda, addict children and degrade our character. Legally and politically, platforms like Facebook, Instagram and YouTube have been protected by an idea that they and others have promoted: that they are not just innovative technologies but also speech platforms, so that imposing any limits on them would amount to both censorship and a drag on technological progress.

That protection is finally starting to weaken, thanks to a growing realization that social media is also a matter of public health. Seen this way, social media appears as something less newfangled and more familiar: a defective, hazardous product. The current trial of Meta’s Instagram and Google’s YouTube in Los Angeles Superior Court, in which a 20-year-old woman has accused the platforms of designing their products in ways that harmed her mental and physical health, is the clearest sign of this shift.

The case, in which closing arguments were made on Thursday, is the first of many lawsuits brought by thousands of young people, school districts and state attorneys general against companies like Meta, Google, Snap and TikTok. The plaintiffs in these cases do not accuse the companies merely of serving up bad content to young people; they argue that the very design of social media is intentionally engineered to create compulsions and habits of overuse, regardless of the content provided."

Saturday, March 14, 2026

The Guardian view on changes to copyright laws: authors should be protected over big tech; The Guardian, March 13, 2026

The Guardian; The Guardian view on changes to copyright laws: authors should be protected over big tech

"In a scene that might have come from a dystopian novel, books were being stamped with “Human Authored” logos at this week’s London Book Fair. The Society of Authors described its labelling scheme as “an important sticking plaster to protect and promote human creativity in lieu of AI labelled content in the marketplace”.

Visitors to the fair were also being given copies of Don’t Steal This Book, an anthology of about 10,000 writers including Nobel laureate Kazuo Ishiguro, Malorie Blackman, Jeanette Winterson and Richard Osman, in which the pages are completely blank. The back cover states: “The UK government must not legalise book theft to benefit AI companies.” The message is clear: writers have had enough.

The fair comes the week before the government is due to deliver its progress report on AI and copyright, after proposals for a relaxation of existing laws caused outrage last year. Philippa Gregory, the novelist, described the plans for an “opt-out” policy, which puts the onus on writers to refuse permission for their work to be trawled, as akin to putting a sign on your front door asking burglars to pass by...

A House of Lords report published last week lays out two possible futures: one in which the UK “becomes a world-leading home for responsible, legalised artificial intelligence (AI) development” and another in which it continues “to drift towards tacit acceptance of large-scale, unlicensed use of creative content”. One scenario protects UK artists, the other benefits global tech companies. To avoid a world of empty content, the choice is clear."

Anthropic-Pentagon battle shows how big tech has reversed course on AI and war; The Guardian, March 13, 2026

The Guardian; Anthropic-Pentagon battle shows how big tech has reversed course on AI and war

"The standoff between Anthropic and the Pentagon has forced the tech industry to once again grapple with the question of how its products are used for war – and what lines it will not cross. Amid Silicon Valley’s rightward shift under Donald Trump and the signing of lucrative defense contracts, big tech’s answer is looking very different than it did even less than a decade ago."

Wednesday, March 11, 2026

Democrats ask what happened to millions earmarked for Trump’s library; The Washington Post, March 11, 2026

The Washington Post; Democrats ask what happened to millions earmarked for Trump’s library

ABC, Meta, Paramount and X reportedly agreed to pay at least $63 million in settlements with the president. The original fund was dissolved last year.

"Congressional Democrats are opening a probe into millions of dollars private companies pledged to President Donald Trump’s planned presidential library, asking what happened to the money after the original fund was dissolved last year.

Sens. Elizabeth Warren (Massachusetts) and Richard Blumenthal (Connecticut) and Rep. Melanie Stansbury (New Mexico) wrote Monday to the leaders of ABC, Meta, Paramount and X, requesting information about the terms of their agreements and the status of the funds they pledged to hand over to the president’s representatives. The letters were shared with The Washington Post."

Quit ChatGPT: right now! Your subscription is bankrolling authoritarianism; The Guardian, March 4, 2026

The Guardian; Quit ChatGPT: right now! Your subscription is bankrolling authoritarianism

"OpenAI, the company behind ChatGPT, is on track to lose $14bn this year. Its market share is collapsing, and its own CEO, Sam Altman, has admitted it “screwed up” an element of the product. All it takes to accelerate that decline is 10 seconds of your time.

A grassroots boycott called QuitGPT has been spreading across the US and beyond, asking people to cancel their ChatGPT subscriptions. More than a million people have answered the call. Mark Ruffalo and Katy Perry have thrown their weight behind it. It is one of the most significant consumer boycotts in recent memory, and I believe it’s time for Europeans to join...

In contrast, cancelling ChatGPT is a piece of cake. You can do it in 10 seconds, and the alternatives are just as good or even better. History shows why #QuitGPT has so much potential: effective campaigns such as the 1977 Nestlé boycott and the 2023 Bud Light boycott were successful because they were narrow and easy. They had a clear target and people had lots of good alternatives.

The great boycotts of history did not succeed because millions of people suddenly became heroic activists. They succeeded because buying a different brand of coffee, or choosing a different beer, was something anyone could do on a Tuesday afternoon. The small act, repeated at scale, becomes a political earthquake.

Go to quitgpt.org. Cancel your subscription. Using the free version? Delete the app, because your conversations still feed the machine. Then try an alternative, and tell at least one person why.

OpenAI’s president bet $25m that you would not notice where your money was going, and that, even if you did, you would not care enough to spend 10 seconds switching to something else. Time to prove him wrong."

Tuesday, February 24, 2026

‘It’s the most urgent public health issue’: Dr Rangan Chatterjee on screen time, mental health – and banning social media until 18; The Guardian, February 16, 2026

Emine Saner, The Guardian; ‘It’s the most urgent public health issue’: Dr Rangan Chatterjee on screen time, mental health – and banning social media until 18 

"Chatterjee believes that “the widespread adoption of screens into our children’s lives is the most urgent public health issue of our time”. He was never very political, he says. He is the affable host of a successful health podcast, Feel Better, Live More, and his books strike an optimistic, inspiring tone – but on this issue he is passionate, his frustration obvious. “I think successive governments have been very weak here, and they are failing a whole generation of children. I think they’ve already failed a generation of children.”"

Wednesday, February 18, 2026

Mark Zuckerberg Takes the Stand in Landmark Social Media Addiction Trial; The New York Times, February 18, 2026

The New York Times; Mark Zuckerberg Takes the Stand in Landmark Social Media Addiction Trial

"Mr. Zuckerberg’s appearance in court — his first time testifying about child safety in front of a jury — was highly anticipated. Meta, which owns Instagram and Facebook and has more than 3.5 billion users, has come under fire as one of the biggest providers of platforms for teenagers. Parents, as well as tech policy and child safety groups, have accused the company of hooking young people on its apps and causing mental health issues that have led to anxiety, depression, eating disorders and self-harm...

In internal documents that surfaced in some of the lawsuits, Mr. Zuckerberg and other Meta leaders repeatedly played down their platforms’ risks to young people, while rejecting employee pleas to bolster youth guardrails and hire additional staff...

K.G.M.’s lawyer, Mark Lanier, said during his opening statement this month that Instagram and YouTube’s apps were built like “digital casinos” that profited off addictive behavior. He pointed to internal documents from Meta and Google, which owns YouTube, comparing their technology to gambling, tobacco and drug use. In a 2015 memo, Mr. Zuckerberg encouraged executives to prioritize increasing the time that teenagers spend on Meta’s apps.

Meta said in its opening statement that K.G.M.’s mental health issues were caused by familial abuse and turmoil. The company presented medical records to show that social media addiction was not a focus of her therapy sessions."

Tuesday, February 10, 2026

Meta and YouTube Created ‘Digital Casinos,’ Lawyers Argue in Landmark Trial; The New York Times, February 9, 2026

Eli Tan, The New York Times; Meta and YouTube Created ‘Digital Casinos,’ Lawyers Argue in Landmark Trial

"The trial in the California Superior Court of Los Angeles is the first in a series of landmark cases against Meta, Snap, TikTok and YouTube that test a novel legal theory arguing that tech can be as harmful as casinos and cigarettes.

Teenagers, school districts and states have filed thousands of lawsuits accusing the social media titans of designing platforms that encourage excessive use. Drawing inspiration from a legal playbook used against Big Tobacco last century, lawyers argue that features like infinite scroll, auto video play and algorithmic recommendations have led to compulsive social media use.

The cases pose some of the most significant legal threats to Meta, Snap, TikTok and YouTube, potentially opening them up to new liabilities for users’ well-being. A win for the plaintiffs could prompt more lawsuits and lead to monetary damages, as well as change how social media is designed."

Thursday, February 5, 2026

‘In the end, you feel blank’: India’s female workers watching hours of abusive content to train AI; The Guardian, February 5, 2026

Anuj Behal, The Guardian; ‘In the end, you feel blank’: India’s female workers watching hours of abusive content to train AI


[Kip Currier: The largely unaddressed plight of content moderators became more real for me after reading this haunting 9/9/24 piece in the Washington Post, "I quit my job as a content moderator. I can never go back to who I was before."

As mentioned in the graphic article's byline, content moderator Alberto Cuadra spoke with journalist Beatrix Lockwood. Maya Scarpa's illustrations poignantly give life to Alberto Cuadra's first-hand experiences and ongoing impacts from the content moderation he performed for an unnamed tech company. I talk about Cuadra's experiences and the ethical issues of content moderation, social media, and AI in my Ethics, Information, and Technology book.]


[Excerpt]

"Murmu, 26, is a content moderator for a global technology company, logging on from her village in India’s Jharkhand state. Her job is to classify images, videos and text that have been flagged by automated systems as possible violations of the platform’s rules.

On an average day, she views up to 800 videos and images, making judgments that train algorithms to recognise violence, abuse and harm.

This work sits at the core of machine learning’s recent breakthroughs, which rest on the fact that AI is only as good as the data it is trained on. In India, this labour is increasingly performed by women, who are part of a workforce often described as “ghost workers”.

“The first few months, I couldn’t sleep,” she says. “I would close my eyes and still see the screen loading.” Images followed her into her dreams: of fatal accidents, of losing family members, of sexual violence she could not stop or escape. On those nights, she says, her mother would wake and sit with her...

“In terms of risk,” she says, “content moderation belongs in the category of dangerous work, comparable to any lethal industry.”

Studies indicate content moderation triggers lasting cognitive and emotional strain, often resulting in behavioural changes such as heightened vigilance. Workers report intrusive thoughts, anxiety and sleep disturbances.

A study of content moderators published last December, which included workers in India, identified traumatic stress as the most pronounced psychological risk. The study found that even where workplace interventions and support mechanisms existed, significant levels of secondary trauma persisted."

Tuesday, January 27, 2026

Social Media Giants Face Landmark Legal Tests on Child Safety; The New York Times, January 27, 2026

The New York Times; Social Media Giants Face Landmark Legal Tests on Child Safety

"Are social media apps addictive like cigarettes? Are these sites defective products?

Those are the claims that Meta, Snap, TikTok and YouTube will face this year in a series of landmark trials. Teenagers, school districts and states have filed thousands of lawsuits accusing the social media titans of designing platforms that encouraged excessive use by millions of young Americans, leading to personal injury and other harms.

On Tuesday, the first of these bellwether cases is scheduled to start with jury selection in California Superior Court of Los Angeles County. A now-20-year-old Californian identified by the initials K.G.M. filed the lawsuit in 2023, claiming she became addicted to the social media sites as a child and experienced anxiety, depression and body-image issues as a result.

The cases pose one of the most significant legal threats to Meta, Snap, TikTok and YouTube, potentially opening them up to new liabilities for users’ well-being. Drawing inspiration from a legal playbook used against Big Tobacco last century, lawyers plan to use the argument that the companies created addictive products.

A win could open the door to more lawsuits from millions of social media users. It could also lead to huge monetary damages and changes to social media sites’ designs."

Saturday, January 17, 2026

Public Shame Is the Most Effective Tool for Battling Big Tech; The New York Times, January 14, 2026

The New York Times; Public Shame Is the Most Effective Tool for Battling Big Tech

"It might be harder to shame the tech companies themselves into making their products safer, but we can shame third-party companies like toymakers, app stores and advertisers into ending partnerships. And with enough public disapproval, legislators might be inspired to act.

In some of the very worst corners of the internet might lie some hope...

Without more public shaming, what seems to be the implacable forward march of A.I. is unstoppable...

As Jay Caspian Kang noted in The New Yorker recently, changing social norms around kids and tech use can be powerful, and reforms like smartphone bans in schools have happened fairly quickly, and mostly on the state and local level."

Thursday, October 30, 2025

From CBS to TikTok, US media are falling to Trump’s allies. This is how democracy crumbles; The Guardian, October 29, 2025

The Guardian; From CBS to TikTok, US media are falling to Trump’s allies. This is how democracy crumbles

"Democracy may be dying in the US. Whether the patient receives emergency treatment in time will determine whether the condition becomes terminal. Before Donald Trump’s return to the presidency, I warned of “Orbánisation” – in reference to Hungary’s authoritarian leader Viktor Orbán. There, democracy was not extinguished by firing squads or the mass imprisonment of dissidents, but by slow attrition. The electoral system was warped, civil society was targeted and pro-Orbán moguls quietly absorbed the media.

Nine months on, and Orbánisation is in full bloom across the Atlantic. Billionaire Larry Ellison, the Oracle co-founder, and his filmmaker son, David, have become blunt instruments in this process. Trump boasts they are “friends of mine – they’re big supporters of mine”. Larry Ellison, second only to Elon Musk as the world’s richest man, has poured tens of millions into Republican coffers...

US democracy has always been heavily flawed. It is so rigged in favour of wealthy elites that a detailed academic study back in 2014 found that the political system is rigged in favour of what the economic elites want. Yet because, unlike Hungary, the US has no history of dictatorship, with a system of supposed checks and balances, some felt it could never succumb to tyranny. Such complacency has collided with brutal reality. In just nine months, the US has been dragged towards an authoritarian abyss. A warning: Trump has 39 months left in office."