Showing posts with label Facebook.

Monday, May 22, 2023

Meta Fined $1.3 Billion for Violating E.U. Data Privacy Rules; The New York Times, May 22, 2023

Adam Satariano, The New York Times; Meta Fined $1.3 Billion for Violating E.U. Data Privacy Rules

"Meta on Monday was fined a record 1.2 billion euros ($1.3 billion) and ordered to stop transferring data collected from Facebook users in Europe to the United States, in a major ruling against the social media company for violating European Union data protection rules...

The ruling, which is a record fine under the General Data Protection Regulation, or G.D.P.R., could affect data related to photos, friend connections and direct messages stored by Meta. It has the potential to bruise Facebook’s business in Europe, particularly if it hurts the company’s ability to target ads. Last month, Susan Li, Meta’s chief financial officer, told investors that about 10 percent of its worldwide ad revenue came from ads delivered to Facebook users in E.U. countries. In 2022, Meta had revenue of nearly $117 billion."

Monday, March 14, 2022

Sandy Hook review: anatomy of an American tragedy – and the obscenity of social media; The Guardian, March 13, 2022

The Guardian; Sandy Hook review: anatomy of an American tragedy – and the obscenity of social media

"Those recommendations are the result of the infernal algorithms which are at the heart of the business models of Facebook and YouTube and are probably more responsible for the breakdown in civil society in the US and the world than anything else invented.

“We thought the internet would give us this accelerated society of science and information,” says Lenny Pozner, whose son Noah was one of the Sandy Hook victims. But “really, we’ve gone back to flat earth”."

Friday, December 31, 2021

Americans widely distrust Facebook, TikTok and Instagram with their data, poll finds; The Washington Post, December 22, 2021

The Washington Post; Americans widely distrust Facebook, TikTok and Instagram with their data, poll finds

"According to the survey, 72 percent of Internet users trust Facebook “not much” or “not at all” to responsibly handle their personal information and data on their Internet activity. About 6 in 10 distrust TikTok and Instagram, while slight majorities distrust WhatsApp and YouTube. Google, Apple and Microsoft receive mixed marks for trust, while Amazon is slightly positive with 53 percent trusting the company at least “a good amount.” (Amazon founder Jeff Bezos owns The Washington Post.)

Only 10 percent say Facebook has a positive impact on society, while 56 percent say it has a negative impact and 33 percent say its impact is neither positive nor negative. Even among those who use Facebook daily, more than three times as many say the social network has a negative rather than a positive impact."

Friday, December 3, 2021

Congress Takes Aim at the Algorithms; Wired, December 2, 2021

Gilad Edelman, Wired; Congress Takes Aim at the Algorithms

"“I agree in principle that there should be liability, but I don’t think we’ve found the right set of terms to describe the processes we’re concerned about,” said Jonathan Stray, a visiting scholar at the Berkeley Center for Human-Compatible AI who studies recommendation algorithms. “What’s amplification, what’s enhancement, what’s personalization, what’s recommendation?”...

[Mary Anne] Franks proposes something both simpler and more sweeping: that Section 230 not apply to any company that “manifests deliberate indifference to unlawful material or conduct.” Her collaborator Danielle Citron has argued that companies should have to prove they took reasonable steps to prevent a certain type of harm before being granted immunity. If something like that became law, engagement-based algorithms wouldn’t go away—but the change could still be significant. The Facebook Papers revealed by Haugen, for example, show that Facebook very recently had little or no content-moderation infrastructure in regions like the Middle East and Africa, where hundreds of millions of its users live. Currently Section 230 largely protects US companies even in foreign markets. But imagine if someone defamed or targeted for harassment by an Instagram post in Afghanistan, where as of 2020 Facebook hadn’t even fully translated its forms for reporting hate speech, could sue under an “indifference” standard. The company would suddenly have a much stronger incentive to make sure its algorithms aren’t favoring material that could land it in court."

Monday, October 25, 2021

How Facebook neglected the rest of the world, fueling hate speech and violence in India; The Washington Post, October 24, 2021

The Washington Post; How Facebook neglected the rest of the world, fueling hate speech and violence in India

A trove of internal documents shows Facebook didn’t invest in key safety protocols in the company’s largest market.

"In February 2019, not long before India’s general election, a pair of Facebook employees set up a dummy account to better understand the experience of a new user in the company’s largest market. They made a profile of a 21-year-old woman, a resident of North India, and began to track what Facebook showed her.

At first, her feed filled with soft-core porn and other, more harmless, fare. Then violence flared in Kashmir, the site of a long-running territorial dispute between India and Pakistan. Indian Prime Minister Narendra Modi, campaigning for reelection as a nationalist strongman, unleashed retaliatory airstrikes that India claimed hit a terrorist training camp.

Soon, without any direction from the user, the Facebook account was flooded with pro-Modi propaganda and anti-Muslim hate speech. “300 dogs died now say long live India, death to Pakistan,” one post said, over a background of laughing emoji faces. “These are pakistani dogs,” said the translated caption of one photo of dead bodies lined-up on stretchers, hosted in the News Feed.

An internal Facebook memo, reviewed by The Washington Post, called the dummy account test an “integrity nightmare” that underscored the vast difference between the experience of Facebook in India and what U.S. users typically encounter. One Facebook worker noted the staggering number of dead bodies."

Wednesday, October 6, 2021

FACEBOOK EXEC: WE'RE NOT LIKE BIG TOBACCO BECAUSE SO MANY PEOPLE USE OUR PRODUCT; Vanity Fair, October 4, 2021

Eric Lutz, Vanity Fair; FACEBOOK EXEC: WE'RE NOT LIKE BIG TOBACCO BECAUSE SO MANY PEOPLE USE OUR PRODUCT

"“No one at Facebook is malevolent,” Haugen added, “but the incentives are misaligned.”

That, of course, speaks to the big issue facing Mark Zuckerberg: Though he insists that his platform is a force for good that is occasionally corrupted by the uglier parts of humanity, it may in fact be the case that the platform is corrupt by its very nature—and that talk of a safer Facebook, as Clegg suggested the company was working to deliver, is a bit like the “safer cigarettes” tobacco companies began marketing in response to health concerns more than half a century ago. That comparison, between Big Tech and Big Tobacco, has been made a lot recently, including by yours truly. But, asked by CNN’s Brian Stelter Sunday about the parallels, Clegg dismissed them out of hand as “misleading.”

“A part of me feels like I’m interviewing the head of a tobacco company right now,” Stelter said. “Part of me feels like I’m interviewing the head of a giant casino that gets rich by tricking its customers and making them addicted.”

“I think they’re profoundly false,” Clegg said of the analogies. “I don’t think it’s remotely like tobacco. I mean, social media apps, they’re apps. People download them on their phones, and why do they do that? I mean, there has to be a reason why a third of the world’s population enjoys using these apps.” 

His point about free will is well-taken; Zuckerberg obviously isn’t forcing anyone to scroll. But rejecting comparisons to an addictive product by pointing out how many people around the world use it hardly seems like a great defense; in fact, as NPR’s David Gura pointed out, the line actually made the parallels more pronounced."

Facebook runs the coward’s playbook to smear the whistleblower; The Verge, October 5, 2021

The Verge; Facebook runs the coward’s playbook to smear the whistleblower

"Facebook has chosen to respond to whistleblower Frances Haugen in the most cowardly way possible: by hiding Mark Zuckerberg, the man ultimately responsible for Facebook’s decisions, and beginning the process of trying to smear and discredit Haugen.

This is some Big Tobacco bullshit — precisely what sleazeball PR guru John Scanlon was hired to do when Jeffrey Wigand blew the whistle on tobacco company Brown and Williamson. Scanlon’s task was to change “the story of B&W to a narrative about Wigand’s personality.”

Of course, that strategy “backfired completely,” Vanity Fair reported in 2004. It probably won’t work here, either. One senator, Edward Markey of Massachusetts, has already called Haugen “a 21st-century American hero,” adding that “our nation owes you a huge debt of gratitude.”...

But the funniest part is the absence of Mark Zuckerberg, Facebook’s CEO and the only shareholder with the power to replace himself. Zuckerberg started Facebook as a Hot-or-Not clone — which almost certainly would negatively affect teen girls’ self-esteem. (At least he is consistent, I guess.) The decisions Haugen alleges, which put profits ahead of morals, have also enriched him more than anyone else. The buck stops, quite literally, with him. So where is he?"

Facebook whistleblower: The company knows it’s harming people and the buck stops with Zuckerberg; CNBC, October 5, 2021

Lauren Feiner, CNBC; Facebook whistleblower: The company knows it’s harming people and the buck stops with Zuckerberg

"[Frances Haugen] also said she believes a healthy social media platform is possible to achieve and that Facebook presents “false choices ... between connecting with those you love online and your personal privacy.”...

‘Big Tobacco moment’

Opening the hearing Tuesday, Blumenthal called on Zuckerberg to come before the committee to explain the company’s actions. He called the company “morally bankrupt” for rejecting reforms offered by its own researchers.

Haugen said Zuckerberg’s unique position as CEO and founder with a majority of voting shares in the company makes him accountable only to himself.

There are “no similarly powerful companies that are as unilaterally controlled,” Haugen said.

Blumenthal said the disclosures by Haugen ushered in a “Big Tobacco moment,” a comparison Haugen echoed in her own testimony. Blumenthal recalled his own work suing tobacco companies as Connecticut’s attorney general, remembering a similar time when enforcers learned those companies had conducted research that showed the harmful effects of their products.

Sen. Roger Wicker, R-Miss., chairman of the Commerce Committee, called the hearing “part of the process of demystifying Big Tech.”"

Here are 4 key points from the Facebook whistleblower's testimony on Capitol Hill; NPR, October 5, 2021

Bobby Allyn, NPR; Here are 4 key points from the Facebook whistleblower's testimony on Capitol Hill

"Research shows Facebook coveted young users, despite health concerns.

Of particular concern to lawmakers on Tuesday was Instagram's impact on young children.

Haugen has leaked one Facebook study that found that 13.5 percent of U.K. teen girls in one survey say their suicidal thoughts became more frequent.

Another leaked study found 17% of teen girls say their eating disorders got worse after using Instagram.

About 32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse, Facebook's researchers found, which was first reported by the Journal. 

Sen. Marsha Blackburn, R-Tenn., accused Facebook of intentionally targeting children under age 13 with an "addictive" product — despite the app requiring users be 13 years or older. 

"It is clear that Facebook prioritizes profit over the well-being of children and all users," she said. 

Blumenthal echoed this concern. 

"Facebook exploited teens using powerful algorithms that amplified their insecurities," Blumenthal said. "I hope we will discuss as to whether there is such a thing as a safe algorithm.""

Whistleblower says Facebook is a US 'national security issue'; Fox News, October 5, 2021

Caitlin McFall, Fox News; Whistleblower says Facebook is a US 'national security issue'

"Haugen said her testimony was not an attempt to shut down Facebook, but rather to push Congress to dive into the complex arena of regulating social media giants.

Democrats and Republicans applauded her testimony and in rare bipartisan fashion agreed more needs to be done to address growing concerns surrounding the social media network."

Facebook whistleblower revealed on '60 Minutes,' says the company prioritized profit over public good; CNN, October 4, 2021

Clare Duffy, CNN; Facebook whistleblower revealed on '60 Minutes,' says the company prioritized profit over public good

"The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook, and Facebook over and over again chose to optimize for its own interests, like making more money," Haugen told "60 Minutes." 

"60 Minutes" correspondent Scott Pelly quoted one internal Facebook (FB) document as saying: "We have evidence from a variety of sources that hate speech, divisive political speech and misinformation on Facebook and the family of apps are affecting societies around the world.""

Whistleblower: Facebook is misleading the public on progress against hate speech, violence, misinformation; 60 Minutes, October 4, 2021

Scott Pelley, 60 Minutes; Whistleblower: Facebook is misleading the public on progress against hate speech, violence, misinformation

"Her name is Frances Haugen. That is a fact that Facebook has been anxious to know since last month when an anonymous former employee filed complaints with federal law enforcement. The complaints say Facebook's own research shows that it amplifies hate, misinformation and political unrest—but the company hides what it knows. One complaint alleges that Facebook's Instagram harms teenage girls. What makes Haugen's complaints unprecedented is the trove of private Facebook research she took when she quit in May. The documents appeared first, last month, in the Wall Street Journal. But tonight, Frances Haugen is revealing her identity to explain why she became the Facebook whistleblower.

Frances Haugen: The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money. 

Frances Haugen is 37, a data scientist from Iowa with a degree in computer engineering and a Harvard master's degree in business. For 15 years she's worked for companies including Google and Pinterest.

Frances Haugen: I've seen a bunch of social networks and it was substantially worse at Facebook than anything I'd seen before."

Thursday, May 6, 2021

Facebook Ban On Donald Trump Will Hold, Social Network's Oversight Board Rules; NPR, May 5, 2021

NPR; Facebook Ban On Donald Trump Will Hold, Social Network's Oversight Board Rules

"Facebook was justified in its decision to suspend then-President Donald Trump after the Jan. 6 insurrection at the U.S. Capitol, the company's Oversight Board said on Wednesday.

That means the company does not have to reinstate Trump's access to Facebook and Instagram immediately. But the panel said the company was wrong to impose an indefinite ban and said Facebook has six months to either restore Trump's account, make his suspension permanent, or suspend him for a specific period of time."

Thursday, July 30, 2020

Congress forced Silicon Valley to answer for its misdeeds. It was a glorious sight; The Guardian, July 30, 2020

The Guardian; Congress forced Silicon Valley to answer for its misdeeds. It was a glorious sight

"As David Cicilline put it: “These companies as they exist today have monopoly power. Some need to be broken up, all need to be properly regulated and held accountable.” And then he quoted Louis Brandeis, who said, “We can have democracy in this country, or we can have great wealth concentrated in the hands of a few, but we can’t have both.”"

Sunday, May 31, 2020

Think Outside the Box, Jack; The New York Times, May 30, 2020

The New York Times; Think Outside the Box, Jack: Trump, Twitter and the society-crushing pursuit of monetized rage.

"The Wall Street Journal had a chilling report a few days ago that Facebook’s own research in 2018 revealed that “our algorithms exploit the human brain’s attraction to divisiveness. If left unchecked,” Facebook would feed users “more and more divisive content in an effort to gain user attention & increase time on the platform.”

Mark Zuckerberg shelved the research...

“The shareholders of Facebook decided, ‘If you can increase my stock tenfold, we can put up with a lot of rage and hate,’” says Scott Galloway, professor of marketing at New York University’s Stern School of Business.

“These platforms have very dangerous profit motives. When you monetize rage at such an exponential rate, it’s bad for the world. These guys don’t look left or right; they just look down. They’re willing to promote white nationalism if there’s money in it. The rise of social media will be seen as directly correlating to the decline of Western civilization.”"

Thursday, January 30, 2020

Facebook pays $550m settlement for breaking Illinois data protection law; The Guardian, January 30, 2020

Alex Hern, The Guardian; Facebook pays $550m settlement for breaking Illinois data protection law

"Facebook has settled a lawsuit over facial recognition technology, agreeing to pay $550m (£419m) over accusations it had broken an Illinois state law regulating the use of biometric details...

It is one of the largest payouts for a privacy breach in US history, a marker of the strength of Illinois’s nation-leading privacy laws. The New York Times, which first reported the settlement, noted that the sum “dwarfed” the $380m penalty the credit bureau Equifax agreed to pay over a much larger customer data breach in 2017."

Tuesday, January 14, 2020

Digital Threats On 2020 Elections; Texas Public Radio, January 11, 2020

Michel Martin, Texas Public Radio; Digital Threats On 2020 Elections

"Siva Vaidhyanathan has been writing about these concerns. He is a professor of media studies at the University of Virginia. We spoke earlier about why he thinks digital democracy will face its greatest test in 2020."

"MARTIN: Do you have a sense of, you know, based on your research and that of others, whether there is some throughline to these groups that are engaging in these disinformation campaigns around the world? Like, what's their end goal? Do we - is there, say, a single source or a few sources - is there anything - you know, what do we know?

VAIDHYANATHAN: There doesn't seem to be a single source, but there seems to be thematic coherence. In other words, if there is an extreme authoritarian political force in the Philippines - and there is - and there's an extreme authoritarian political force in Ukraine - let's say trying to be imposed from across the border in Russia - those forces are going to learn from each other. It's very easy to mimic the strategy of another one. So what it means is if you're of that ilk, if you want to disrupt democracy and undermine any form of governance that might support the rule of law and limit corruption, et cetera, you are going to try to flood the political sphere with nonsense, with stuff that will divide society, stuff that will turn people against each other, especially against minorities or against immigrants."

Monday, November 4, 2019

Facebook and Twitter spread Trump’s lies, so we must break them up; The Guardian, November 3, 2019

Robert Reich, The Guardian; Facebook and Twitter spread Trump’s lies, so we must break them up 

"The reason 45% of Americans rely on Facebook for news and Trump’s tweets reach 66 million is because these platforms are near monopolies, dominating the information marketplace. No TV network, cable giant or newspaper even comes close. Fox News’ viewership rarely exceeds 3 million. The New York Times has 4.7 million subscribers.

Facebook and Twitter aren’t just participants in the information marketplace. They’re quickly becoming the information marketplace."

Saturday, October 19, 2019

Mark Zuckerberg doesn’t understand free speech in the 21st century; The Guardian, October 18, 2019

Siva Vaidhyanathan, The Guardian; Mark Zuckerberg doesn’t understand free speech in the 21st century

"The problem of the 21st century is cacophony. Too many people are yelling at the same time. Attentions fracture. Passions erupt. Facts crumble. It’s increasingly hard to deliberate deeply about complex crucial issues with an informed public. We have access to more knowledge yet we can’t think and talk like adults about serious things...

The thing is, a thriving democracy needs more than motivation, the ability to find and organize like-minded people. Democracies also need deliberation. We have let the institutions that foster discussion among well informed, differently-minded people crumble. Soon all we will have left is Facebook. Look at Myanmar to see how well that works."