Showing posts with label Facebook. Show all posts

Thursday, December 20, 2018

Why Should Anyone Believe Facebook Anymore?; Wired, December 19, 2018

Fred Vogelstein, Wired;

Why Should Anyone Believe Facebook Anymore?


"Americans are weird about their tycoons. We have a soft spot for success, especially success from people as young as Zuckerberg was when he started Facebook. But we hate it when they become as super-rich and powerful as he is now and seem accountable to no one. We'll tolerate rogues like Larry Ellison, founder and CEO of Oracle, who once happily admitted to hiring investigators to search Bill Gates' trash. Ellison makes no effort to hide the fact that he's in it for the money and the power. But what people despise more than anything is what we have now with tech companies in Silicon Valley, especially with Facebook: greed falsely wrapped in sanctimony.

Facebook gave the world a great new tool for staying connected. Zuckerberg even pitched it as a better internet—a safe space away from the anonymous trolls lurking everywhere else online. But it’s now rather debatable whether Facebook is really a better internet that is making the world a better place, or just another big powerful corporation out to make as much money as possible. Perhaps the world would be happier with Zuckerberg and Facebook, and the rest of their Silicon Valley brethren, if they stopped pretending to be people and businesses they are not."

Facebook Didn’t Sell Your Data; It Gave It Away: In exchange for even more data about you from Amazon, Netflix, Spotify, Microsoft, and others; The Atlantic, December 19, 2018

Alexis C. Madrigal, The Atlantic;

Facebook Didn’t Sell Your Data; It Gave It Away


"By the looks of it, other tech players have been happy to let Facebook get beaten up while their practices went unexamined. And then, in this one story, the radioactivity of Facebook’s data hoard spread basically across the industry. There is a data-industrial complex, and this is what it looked like."

How You Can Help Fight the Information Wars: Silicon Valley won’t save us. We’re on our own.; The New York Times, December 18, 2018

Kara Swisher, The New York Times;

How You Can Help Fight the Information Wars:

Silicon Valley won’t save us. We’re on our own.

[Kip Currier: A rallying cry to all persons passionate about and/or working on issues related to information literacy and evaluating information...]

"For now, it’s not clear what we can do, except take control of our own individual news consumption. Back in July, in fact, Ms. DiResta advised consumer restraint as the first line of defense, especially when encountering information that any passably intelligent person could guess might have been placed by a group seeking to manufacture discord.

“They’re preying on your confirmation bias,” she said. “When content is being pushed to you, that’s something that you want to see. So, take the extra second to do the fact-check, even if it confirms your worst impulses about something you absolutely hate — before you hit the retweet button, before you hit the share button, just take the extra second.”

If we really are on our own in this age of information warfare, as the Senate reports suggest, there’s only one rule that can help us win it: See something, say nothing."

Facebook’s Data Sharing and Privacy Rules: 5 Takeaways From Our Investigation; The New York Times, December 18, 2018

Nicholas Confessore, Michael LaForgia and Gabriel J.X. Dance, The New York Times;

Facebook’s Data Sharing and Privacy Rules: 5 Takeaways From Our Investigation

 

"You are the product: That is the deal many Silicon Valley companies offer to consumers. The users get free search engines, social media accounts and smartphone apps, and the companies use the personal data they collect — your searches, “likes,” phone numbers and friends — to target and sell advertising.

But an investigation by The New York Times, based on hundreds of pages of internal Facebook documents and interviews with about 50 former employees of Facebook and its partners, reveals that the marketplace for that data is even bigger than many consumers suspected. And Facebook, which collects more information on more people than almost any other private corporation in history, is a central player.

Here are five takeaways from our investigation."

As Facebook Raised a Privacy Wall, It Carved an Opening for Tech Giants; The New York Times, December 18, 2018

Gabriel J.X. Dance, Michael LaForgia and Nicholas Confessore, The New York Times; As Facebook Raised a Privacy Wall, It Carved an Opening for Tech Giants

"For years, Facebook gave some of the world’s largest technology companies more intrusive access to users’ personal data than it has disclosed, effectively exempting those business partners from its usual privacy rules, according to internal records and interviews.

The special arrangements are detailed in hundreds of pages of Facebook documents obtained by The New York Times. The records, generated in 2017 by the company’s internal system for tracking partnerships, provide the most complete picture yet of the social network’s data-sharing practices. They also underscore how personal data has become the most prized commodity of the digital age, traded on a vast scale by some of the most powerful companies in Silicon Valley and beyond."

Sunday, December 9, 2018

The Empress of Facebook: My Befuddling Dinner With Sheryl Sandberg; Wired, December 7, 2018

Virginia Heffernan, Wired; The Empress of Facebook: My Befuddling Dinner With Sheryl Sandberg

"When you’re making money hand over fist, and your company seems to be on the right side of history, it’s natural to think you’re a very moral and whole person, who has made some lovely decisions, and who has a lot to teach other women about work and families. But what about … when the company founders?...

“You know, when I was a girl, the idea that the British Empire could ever end was absolutely inconceivable,” Doris Lessing once said. “And it just disappeared, like all the other empires.”

Empires vanish. The memes that kept them glued together for a short time—from "Dieu et mon droit" to "Bring the world closer together"—are exposed as fictions of state. And the leaders are, surprise, mortals with Napoleon complexes."

Sunday, November 25, 2018

Do You Have a Moral Duty to Leave Facebook?; The New York Times, November 24, 2018

S. Matthew Liao, The New York Times; Do You Have a Moral Duty to Leave Facebook?


"I joined Facebook in 2008, and for the most part, I have benefited from being on it. Lately, however, I have wondered whether I should delete my Facebook account. As a philosopher with a special interest in ethics, I am using “should” in the moral sense. That is, in light of recent events implicating Facebook in objectionable behavior, is there a duty to leave it?"

Tuesday, November 20, 2018

Embattled and in over his head, Mark Zuckerberg should — at least — step down as Facebook chairman; The Washington Post, November 19, 2018

Margaret Sullivan, The Washington Post; Embattled and in over his head, Mark Zuckerberg should — at least — step down as Facebook chairman

"Facebook founder Mark Zuckerberg once set out a bit of digital-world wisdom that became his company’s informal motto: “Move fast and break things.”

After the past week’s developments, the 34-year-old should declare mission accomplished — and find something else to do for the next few decades.

Because he’s shown that he’s incapable of leading the broken behemoth that is Facebook.

Leaders — capable leaders — don’t do what Zuckerberg has done in the face of disaster that they themselves have presided over.

They don’t hide and deny.

They don’t blame-shift."

Monday, November 19, 2018

Yes, Facebook made mistakes in 2016. But we weren’t the only ones.; The Washington Post, November 17, 2018

Alex Stamos, The Washington Post; Yes, Facebook made mistakes in 2016. But we weren’t the only ones.

"Alex Stamos is a Hoover fellow and adjunct professor at Stanford University. He served as the chief security officer at Facebook until August...

It is time for us to come together to protect our society from future information operations. While it appears Russia and other U.S. adversaries sat out the 2018 midterms, our good fortune is unlikely to extend through a contentious Democratic presidential primary season and raucous 2020 election.

First, Congress needs to codify standards around political advertising. The current rules restricting the use of powerful online advertising platforms have been adopted voluntarily and by only a handful of companies. Congress needs to update Nixon-era laws to require transparency and limit the ability of all players, including legitimate domestic actors, to micro-target tiny segments of the population with divisive political narratives. It would be great to see Facebook, Google and Twitter propose helpful additions to legislation instead of quietly opposing it.

Second, we need to draw a thoughtful line between the responsibilities of government and the large technology companies. The latter group will always need to act in a quasi-governmental manner, making judgments on political speech and operating teams in parallel to the U.S. intelligence community, but we need more clarity on how these companies make decisions and what powers we want to reserve to our duly elected government. Many areas of cybersecurity demand cooperation between government and corporations, and our allies in France and Germany provide models of how competent defensive cybersecurity responsibility can be built in a democracy."

 

Facebook deserves criticism. The country deserves solutions.; The Washington Post, November 18, 2018

Editorial Board, The Washington Post; Facebook deserves criticism. The country deserves solutions.

"WHAT HAPPENS now? That is the essential question following the New York Times’s troubling investigation into Facebook’s response to Russian interference on its platform. The article has prompted sharp criticism of the company from all quarters, and Facebook deserves the blowback. But Americans deserve solutions. There are a few places to start."

Sunday, November 18, 2018

To regulate AI we need new laws, not just a code of ethics; The Guardian, October 28, 2018

Paul Chadwick, The Guardian; To regulate AI we need new laws, not just a code of ethics

"For a sense of Facebook’s possible future EU operating environment, Zuckerberg should read the Royal Society’s new publication about the ethical and legal challenges of governing artificial intelligence. One contribution is by a senior European commission official, Paul Nemitz, principal adviser, one of the architects of the EU’s far-reaching General Data Protection Regulation, which took effect in May this year.

Nemitz makes clear the views are his own and not necessarily those of the European commission, but the big tech companies might reasonably see his article, entitled “Constitutional democracy and technology in the age of artificial intelligence”, as a declaration of intent.

“We need a new culture of technology and business development for the age of AI which we call ‘rule of law, democracy and human rights by design’,” Nemitz writes. These core ideas should be baked into AI, because we are entering “a world in which technologies like AI become all pervasive and are actually incorporating and executing the rules according to which we live in large part”.

To Nemitz, “the absence of such framing for the internet economy has already led to a widespread culture of disregard of the law and put democracy in danger, the Facebook Cambridge Analytica scandal being only the latest wake-up call”."

Facebook and the Fires; The New York Times, November 15, 2018

Kara Swisher, The New York Times; Facebook and the Fires


"Don’t Be Afraid of Self-Reflection

That man in the mirror is typically a man, and a young, white, privileged one, whose capacity for self-reflection is about as big as Donald Trump’s ability to stop hate-tweeting. But self-reflection is the hallmark of maturity and good decision-making. Of all the interviews I have done in Silicon Valley, I keep coming back to the one I did with Mr. Zuckerberg this summer, in which I pressed him to reflect on how his invention had caused deaths in places like India and Myanmar.

After trying several times to get an answer from him, I got frustrated: “What kind of responsibility do you feel?” I said I would feel sick to my stomach to know that people died possibly “because of something I invented. What does that make you feel like? What do you do when you see that? What do you do yourself? What’s your emotion?”

Mr. Zuckerberg’s answer left me cold. And also more than a little worried for the future of his company. It’s bad enough not to be able to anticipate disaster; it’s worse, after disaster strikes, to not be able to reflect on how it happened.

“I mean, my emotion is feeling a deep sense of responsibility to try to fix the problem,” he said. “I don’t know, that’s a … that’s the most productive stance.”

But it’s not the most productive stance. As with those California fires, putting out the flames is important. But understanding how they got started in the first place, to stop it from happening again, is what actually keeps us from hurtling over the edge."

Saturday, November 17, 2018

Delay, Deny and Deflect: How Facebook’s Leaders Fought Through Crisis; The New York Times, November 14, 2018

Sheera Frenkel, Nicholas Confessore, Cecilia Kang, Matthew Rosenberg and Jack Nicas, The New York Times; Delay, Deny and Deflect: How Facebook’s Leaders Fought Through Crisis

"Like other technology executives, Mr. Zuckerberg and Ms. Sandberg cast their company as a force for social good. Facebook’s lofty aims were emblazoned even on securities filings: “Our mission is to make the world more open and connected.”

But as Facebook grew, so did the hate speech, bullying and other toxic content on the platform. When researchers and activists in Myanmar, India, Germany and elsewhere warned that Facebook had become an instrument of government propaganda and ethnic cleansing, the company largely ignored them. Facebook had positioned itself as a platform, not a publisher. Taking responsibility for what users posted, or acting to censor it, was expensive and complicated. Many Facebook executives worried that any such efforts would backfire."

Wednesday, August 8, 2018

What Does It Mean to Ban Alex Jones?; The Atlantic, August 7, 2018

Alexis C. Madrigal, The Atlantic; What Does It Mean to Ban Alex Jones?

"In banning the Infowars page, Facebook took the next logical step in restricting access to Infowars content, but it still hasn’t outright banned the domain, and it has not disclosed how the News Feed algorithm is dealing with URLs from Infowars.com.  

All of which is to say: There are many kinds of bans, and they each represent a different tool technology companies can use to police speech. Platforms can weaken the distribution of content they don’t like. They can ban the discovery of content they don’t like, as Apple has with Jones’s podcasts. Platforms can decline to host content they don’t like, as YouTube and Facebook have with InfoWars videos and pages, respectively. Or platforms can ban the presence of content they don’t like, regardless of where it is hosted or discovered."

Twitter will not ban InfoWars conspiracy theorist Alex Jones; BBC, August 8, 2018

BBC; Twitter will not ban InfoWars conspiracy theorist Alex Jones

"In a series of tweets on Tuesday, Twitter CEO and co-founder Jack Dorsey explained the platform's decision, confirming it would not be following in the footsteps of others like Apple and Spotify and removing Mr Jones' and InfoWars' content...

Mr Dorsey said the accounts had not violated the platform's rules, but vowed to suspend them if they ever did so.

In his explanation, Mr Dorsey said it would be wrong to "succumb and simply react to outside pressure" instead of sticking to the company's codified principles.

He also implied one-off actions risked fuelling new conspiracy theories in the long-run, and said it was critical for journalists to "document, validate and refute" unsubstantiated rumours like the ones spread by Mr Jones "so people can form their own opinions"."

Tuesday, August 7, 2018

India asks telcos to find ways to block Facebook, WhatsApp in case of misuse; Reuters, August 7, 2018

Reuters; India asks telcos to find ways to block Facebook, WhatsApp in case of misuse

"India has asked its telecom operators to find ways of blocking applications such as Facebook and messaging app WhatsApp in the case of misuse, according to a document seen by Reuters.

India has in recent months intensified efforts to crack down on mass message forwards after it found that people were using social media and messaging apps to spread rumors and stoke public anger.

WhatsApp in particular has faced the wrath of Indian regulators after false messages circulated on the messaging platform led to a series of lynchings and mob beatings across the country."

Thursday, August 2, 2018

The Shape of Mis- and Disinformation; Slate, July 26, 2018

[Podcast] April Glaser and Will Oremus, Slate; The Shape of Mis- and Disinformation

"In recent weeks, Facebook and YouTube have strained to explain why they won’t ban Alex Jones’ Infowars, which has used its verified accounts to spread false news and dangerous conspiracy theories on the platforms. Meanwhile, the midterms are approaching, and Facebook won’t say definitively whether the company has found any efforts by foreign actors to disrupt the elections. Facebook did recently say that it will start to remove misinformation if it may lead to violence, a response to worrisome trends in Myanmar, India, and other countries. The social media platforms are being called on to explain how they deal with information that is wrong—a question made even more complicated because the problem takes so many forms.

To understand the many forms of misinformation and disinformation on social media, we recently spoke with Claire Wardle, the executive director of First Draft, a nonprofit news-literacy and fact-checking outfit based at Harvard University’s Kennedy School, for Slate’s tech podcast If Then. We discussed how fake news spreads on different platforms, where it’s coming from, and how journalists might think—or rethink—their role in covering it."

Monday, July 23, 2018

Facebook's pledge to eliminate misinformation is itself fake news; The Guardian, July 20, 2018

Judd Legum, The Guardian; Facebook's pledge to eliminate misinformation is itself fake news

"The production values are high and the message is compelling. In an 11-minute mini-documentary, Facebook acknowledges its mistakes and pledges to “fight against misinformation”.

“With connecting people, particularly at our scale, comes an immense amount of responsibility,” an unidentified Facebook executive in the film solemnly tells a nodding audience of new company employees.

An outdoor ad campaign by Facebook strikes a similar note, plastering slogans like “Fake news is not your friend” at bus stops around the country.

But the reality of what’s happening on the Facebook platform belies its gauzy public relations campaign."

Friday, June 15, 2018

University of Central Florida fraternity members accused of posting revenge porn on Facebook; CNN, June 14, 2018

Sara O'Brien, CNN; University of Central Florida fraternity members accused of posting revenge porn on Facebook

"The use of a private Facebook page and other online sites to spread nonconsensual pornography -- which is also commonly referred to as "revenge porn" -- isn't a new phenomenon. Closed Facebook groups were at the center of a Penn State fraternity case, in which men were allegedly posting compromising pictures of women on a private Facebook page, as well as a nude photo scandal involving the Marines.

While Facebook has been working to help combat the spread of revenge porn, it also is grappling with hidden groups that share content that could violate its standards...

One in eight American social media users has been a target of nonconsensual pornography, according to a 2017 study conducted by the Cyber Civil Rights Initiative."

Tuesday, May 1, 2018

Westworld Spoilers Club season 2, episode 2: Reunion: The second episode of the season drops subtle clues with big ramifications; The Verge, April 30, 2018

Bryan Bishop, The Verge; Westworld Spoilers Club season 2, episode 2: Reunion

The second episode of the season drops subtle clues with big ramifications


[SPOILERS BELOW]





"...[O]n the matter of the true agenda of the parks themselves, the episode’s revelations raise questions that the show will almost certainly have to engage. For 30 years, Delos parks have been secretly gathering data on their guests. How is that data being used? Have guests been blackmailed, extorted, or otherwise had the records of their trips used against them as futuristic, Wild West kompromat? And what would the corporate consequences be if the existence of such a project was made public? Given that Bernard was not given proper access to the drone host lab, it seems evident that only people at the highest levels are aware of the data collection initiative, with non-networked hosts used in the facility to help cut down on the chance of leaks.

Given all that, Peter Abernathy — and the data he’s carrying in his head — becomes much more than just a moving plot device. He is quite literally the future of Delos, Inc. itself. Should he fall into the wrong hands, with the data collection initiative made public, it could take down the entire company. It’s a timely storyline, coming right at the time that online services like Facebook are facing more public scrutiny than ever. And no doubt that’s exactly what Joy and Nolan are aiming for."