Showing posts with label trust.

Thursday, September 12, 2019

The misinformation age; Axios, September 12, 2019

Scott Rosenberg, David Nather, Axios; The misinformation age


"Hostile powers undermining elections. Deepfake video and audio. Bots and trolls, phishing and fake news — plus of course old-fashioned spin and lies. 

Why it matters: The sheer volume of assaults on fact and truth is undermining trust not just in politics and government, but in business, tech, science and health care as well.
  • Beginning with this article, Axios is launching a series to help you navigate this new avalanche of misinformation, and illuminate its impact on America and the globe, through 2020 and beyond.
Our culture now broadly distrusts most claims to truth. Majorities of Americans say they've lost trust in the federal government and each other — and think that lack of trust gets in the way of solving problems, according to a Pew Research Center survey."

Monday, April 22, 2019

Wary of Chinese Espionage, Houston Cancer Center Chose to Fire 3 Scientists; The New York Times, April 22, 2019

Mihir Zaveri, The New York Times; Wary of Chinese Espionage, Houston Cancer Center Chose to Fire 3 Scientists

"“A small but significant number of individuals are working with government sponsorship to exfiltrate intellectual property that has been created with the support of U.S. taxpayers, private donors and industry collaborators,” Dr. Peter Pisters, the center’s president, said in a statement on Sunday.

“At risk is America’s internationally acclaimed system of funding biomedical research, which is based on the principles of trust, integrity and merit.”

The N.I.H. had also flagged two other researchers at MD Anderson. One investigation is proceeding, the center said, and the evidence did not warrant firing the other researcher.

The news of the firings was first reported by The Houston Chronicle and Science magazine.

The investigations began after Francis S. Collins, the director of the National Institutes of Health, sent a letter in August to more than 10,000 institutions the agency funds, warning of “threats to the integrity of U.S. biomedical research.”"

Friday, March 15, 2019

Review: 'The Inventor' is a coolly appalling portrait of Elizabeth Holmes and the Theranos scandal; The Los Angeles Times, March 14, 2019

Justin Chang, The Los Angeles Times;

Review: 'The Inventor' is a coolly appalling portrait of Elizabeth Holmes and the Theranos scandal


"As a quick glance at this week’s headlines will remind you — a staggering college admissions scandal, a wave of indictments in the cases of Paul Manafort and Jussie Smollett — we are living in deeply fraudulent times. But if there are few people or institutions worthy of our trust anymore, perhaps we can still trust that, eventually, Alex Gibney will get around to making sense of it all. Over the course of his unflagging, indispensable career he has churned out documentaries on Scientology and Enron, Lance Armstrong and Casino Jack — individual case studies in a rich and fascinating investigation of the American hustler at work.
 
Gibney approaches his subjects with the air of an appalled moralist and, increasingly, a grudging connoisseur. His clean, straightforward style, which usually combines smart talking heads, slick graphics and reams of meticulous data, is clearly galvanized by these charismatic individuals, who are pathological in their dishonesty and riveting in their chutzpah. And he is equally fascinated by the reactions, ranging from unquestioning belief to conflicted loyalty, that they foster among their followers and associates, who in many cases shielded them, at least for a while, from public discovery and censure.
 
“The Inventor: Out for Blood in Silicon Valley,” Gibney’s latest exercise in coolly measured outrage, is an engrossing companion piece to his other works in this vein. The subject of this HBO documentary is Elizabeth Holmes, the self-styled biotech visionary who dropped out of Stanford at age 19 and founded a company called Theranos, which promised to bring about a revolution in preventive medicine and personal healthcare. Its top-secret weapon was a compact machine called the Edison, which could purportedly run more than 200 individual tests from just a few drops of blood, obtained with just a prick of the finger.
 
Holmes’ vision of a brave new world — one in which anyone could stop by Walgreens and obtain a comprehensive, potentially life-saving snapshot of their health — proved tantalizing enough to raise more than $400 million and earned her a reputation as possibly the greatest inventor since, well, Thomas Edison. Her investors included Betsy DeVos, Rupert Murdoch and the Waltons; Henry Kissinger, George Shultz and James Mattis sat on her board of directors. But that was all before the Wall Street Journal’s John Carreyrou and other investigative journalists exposed glaring faults in the Edison’s design and sent the company’s $10-billion valuation spiraling down to nothing. Theranos dissolved in 2018, and Holmes and former company president Sunny Balwani were charged with conspiracy and fraud.
 
Full disclosure: As the son of a retired medical technologist who spent more than 30 years testing blood the traditional way, I approached “The Inventor” with great fascination and more than a little schadenfreude. The movie, for its part, seems both magnetized and repelled by its subject, a reaction that it will likely share with its audience. Gibney is perhaps overly fond of deploying intense, lingering close-ups of Holmes’ face and peering deep into her unnerving blue eyes (“She didn’t blink,” a former employee recalls). If the eyes are the windows to the soul, “The Inventor” just keeps looking and looking, as though uncertain whether or not its subject has one."

Tuesday, January 29, 2019

FaceTime Is Eroding Trust in Tech: Privacy paranoiacs have been totally vindicated; The Atlantic, January 29, 2019

Ian Bogost, The Atlantic;

FaceTime Is Eroding Trust in Tech

Privacy paranoiacs have been totally vindicated.

"Trustworthy is hardly a word many people use to characterize big tech these days. Facebook’s careless infrastructure upended democracy. Abuse is so rampant on Twitter and Instagram that those services feel designed to deliver harassment rather than updates from friends. Hacks, leaks, and other breaches of privacy, at companies from Facebook to Equifax, have become so common that it’s hard to believe that any digital information is secure. The tech economy seems designed to steal and resell data."

Thursday, January 10, 2019

The 20 Best TV Dramas Since ‘The Sopranos’; The New York Times, January 10, 2019

The New York Times; The 20 Best TV Dramas Since ‘The Sopranos’

"2014-2017

The Leftovers

Because it pondered the big questions without feeling ponderous.

Damon Lindelof, creator: ...

One question the show was always asking was, “How can you emotionally invest in anyone, if you think that they could just slip out of existence in a second?”

Obviously that’s something we contend with in a nondeparture world, because people die. But that feeling of, “I now have an excuse to not emotionally connect to anyone” gets magnified in a world where 2 percent of the world’s population just slipped out."

Saturday, November 17, 2018

How Plato Foresaw Facebook’s Folly; The New York Times, November 16, 2018

Bret Stephens, Opinion Columnist, The New York Times; How Plato Foresaw Facebook’s Folly

[Kip Currier: A must-read opinion piece by Bret Stephens. Bookmark and pass on to others! 

Facebook's interminable ethics failures and catastrophic abdication of any semblance of moral leadership offer glaring case studies for the essential role of ethical decision-making and accountability in organizations--not only in the technology sector but in ALL areas of civic life.

Moreover, where is Facebook’s Board amidst this moral morass? If corporate leaders will not “do the right things”, it is ethically incumbent upon Boards of Trustees to exercise the moral oversight and fiduciary responsibility with which they have been entrusted.]

"The story of the wildly exaggerated promises and damaging unintended consequences of technology isn’t exactly a new one. The real marvel is that it constantly seems to surprise us. Why? 

Part of the reason is that we tend to forget that technology is only as good as the people who use it. We want it to elevate us; we tend to degrade it. In a better world, Twitter might have been a digital billboard of ideas and conversation ennobling the public square. We’ve turned it into the open cesspool of the American mind. Facebook was supposed to serve as a platform for enhanced human interaction, not a tool for the lonely to burrow more deeply into their own isolation.

It’s also true that Facebook and other Silicon Valley giants have sold themselves not so much as profit-seeking companies but as ideal-pursuing movements. Facebook’s mission is “to make the world more open and connected.” Tesla’s goal is “to accelerate the world’s transition to sustainable energy.” Google’s mantra was “Don’t Be Evil,” at least until it quietly dropped the slogan earlier this year. 

But the deeper reason that technology so often disappoints and betrays us is that it promises to make easy things that, by their intrinsic nature, have to be hard...

Start over, Facebook. Do the basics. Stop pretending that you’re about transforming the state of the world. Work harder to operate ethically, openly and responsibly. Accept that the work will take time. Log off Facebook for a weekend. Read an ancient book instead."

Thursday, November 1, 2018

He Promised to Restore Damaged Hearts. Harvard Says His Lab Fabricated Research.; The New York Times, October 29, 2018

Gina Kolata, The New York Times; 
He Promised to Restore Damaged Hearts. Harvard Says His Lab Fabricated Research. 

"For Dr. Piero Anversa, the fall from scientific grace has been long, and the landing hard.

Researchers worldwide once hailed his research as revolutionary, promising the seemingly impossible: a way to grow new heart cells to replace those lost in heart attacks and heart failure, leading killers in the United States.

But Harvard Medical School and Brigham and Women’s Hospital in Boston, his former employers, this month accused Dr. Anversa and his laboratory of extensive scientific malpractice. More than 30 research studies produced over more than a decade contain falsified or fabricated data, officials concluded, and should be retracted. Last year the hospital paid a $10 million settlement to the federal government after the Department of Justice alleged that Dr. Anversa and two members of his team were responsible for fraudulently obtaining research funding from the National Institutes of Health.

“The number of papers is extraordinary,” said Dr. Jeffrey Flier, until 2016 the dean of Harvard Medical School. “I can’t recall another case like this.”"

Monday, July 23, 2018

Embracing the privacy-first mindset in the post-GDPR world; AdNovum Singapore via Enterprise Innovation, July 23, 2018

Leonard Cheong, Managing Director, AdNovum Singapore via Enterprise Innovation; Embracing the privacy-first mindset in the post-GDPR world

"Privacy is a fundamental human right.

This is the proclamation that Apple made when updating their App Store policies to ensure that application developers can’t access consumer data without consent, in a bid to demonstrate their commitment to data privacy.

As the world becomes more digital, privacy has indeed become more sought after and consumers today are only willing to share data with companies they trust. On 25 May 2018 the European Union’s General Data Protection Regulation or EU-GDPR came into effect, sparking numerous conversations on privacy, ethics and compliance."

We Need Transparency in Algorithms, But Too Much Can Backfire; Harvard Business Review, July 23, 2018

Kartik Hosanagar and Vivian Jair, Harvard Business Review; We Need Transparency in Algorithms, But Too Much Can Backfire

"Companies and governments increasingly rely upon algorithms to make decisions that affect people’s lives and livelihoods – from loan approvals, to recruiting, legal sentencing, and college admissions. Less vital decisions, too, are being delegated to machines, from internet search results to product recommendations, dating matches, and what content goes up on our social media feeds. In response, many experts have called for rules and regulations that would make the inner workings of these algorithms transparent. But as Nass’s experience makes clear, transparency can backfire if not implemented carefully. Fortunately, there is a smart way forward."

Friday, April 20, 2018

Why Tech Companies Need a Code of Ethics for Software Development; Entrepreneur, April 19, 2018

Dave West, Entrepreneur; Why Tech Companies Need a Code of Ethics for Software Development
With so much potential for software to go bad, it's important that developers commit to doing good.

"As the race heats up among companies looking to be first-to-market with the next best product or service, considerations about the implications these systems and gadgets may have on society often are overlooked...

Academically, this movement is already in the works. Harvard University and the Massachusetts Institute of Technology (MIT) are jointly offering a new course on the ethics and regulation of artificial intelligence, the University of Texas at Austin recently introduced its Ethical Foundations of Computer Science course and Stanford University is developing a computer science ethics course for next year...

With the absence of an international standardized code of ethics, one solution organizations can implement immediately is to foster a culture among their delivery teams that places ethics in high regard...

One of the most effective ways organizations can achieve transparency is to create their own internal code of ethics. A baseline organizations can use to develop their code of ethics are the five values of Scrum...

The popular Spiderman phrase "with great power comes great responsibility" could not be more applicable to the organizations who are creating and releasing the products that define society. After all, these products are influencing the way people live and interact with each other, every day. This is why big tech companies must take the lead and create their own code of ethics."

Monday, April 9, 2018

Conspiracy videos? Fake news? Enter Wikipedia, the ‘good cop’ of the Internet; The Washington Post, April 6, 2018

Noam Cohen, The Washington Post; Conspiracy videos? Fake news? Enter Wikipedia, the ‘good cop’ of the Internet

"Although it is hard to argue today that the Internet lacks for self-expression, what with self-publishing tools such as Twitter, Facebook and, yes, YouTube at the ready, it still betrays its roots as a passive, non-collaborative medium. What you create with those easy-to-use publishing tools is automatically licensed for use by for-profit companies, which retain a copy, and the emphasis is on personal expression, not collaboration. There is no YouTube community, but rather a Wild West where harassment and fever-dream conspiracies use up much of the oxygen. (The woman who shot three people at YouTube’s headquartersbefore killing herself on Tuesday was a prolific producer of videos, including ones that accused YouTube of a conspiracy to censor her work and deny her advertising revenue.)

Wikipedia, with its millions of articles created by hundreds of thousands of editors, is the exception. In the past 15 years, Wikipedia has built a system of collaboration and governance that, although hardly perfect, has been robust enough to endure these polarized times."

Wednesday, April 4, 2018

The Guardian view on Grindr and data protection: don’t trade our privacy; Guardian, April 3, 2018

Editorial, Guardian; The Guardian view on Grindr and data protection: don’t trade our privacy

"Whether the users were at fault for excessive trust, or lack of imagination, or even whether they were at fault at all for submitting information that would let their potential partners make a better informed choice, as liberal ethics would demand, the next thing to scrutinise is the role of the company itself. Grindr has now said that it will no longer hand over the information, which is an admission that it was wrong to do so in the first place. It also says that the information was always anonymised, and that its policy was perfectly standard practice among digital businesses. This last is perfectly true, and perhaps the most worrying part of the whole story.

We now live in a world where the valuations of giant companies are determined by the amount of personal data they hold on third parties, who frequently have no idea how much there is, nor how revealing it is. As well as the HIV status, and last test date, Grindr collected and passed on to third parties its users’ locations, their phone identification numbers, and emails. These went to two companies that promise to make it easier to deliver personalised advertisements to phones based on the users’ locations and to increase the amount of time they spend looking at apps on their phones. The data was in theory anonymised, although repeated experiments have shown that the anonymity of personal information on the internet is pretty easily cracked in most cases."

After the Facebook scandal it’s time to base the digital economy on public v private ownership of data; Guardian, March 31, 2018

Evgeny Morozov, Guardian; After the Facebook scandal it’s time to base the digital economy on public v private ownership of data

"Finally, we can use the recent data controversies to articulate a truly decentralised, emancipatory politics, whereby the institutions of the state (from the national to the municipal level) will be deployed to recognise, create, and foster the creation of social rights to data. These institutions will organise various data sets into pools with differentiated access conditions. They will also ensure that those with good ideas that have little commercial viability but promise major social impact would receive venture funding and realise those ideas on top of those data pools.

Rethinking many of the existing institutions in which citizens seem to have lost trust along such lines would go a long way towards addressing the profound sense of alienation from public and political life felt across the globe. It won’t be easy but it can still be done. This, however, might not be the case 10 or even five years from now, as the long-term political and economic costs of data extractivism come to the surface. The data wells inside ourselves, like all those other drilling sites, won’t last for ever either."

Monday, March 19, 2018

Data scandal is huge blow for Facebook – and efforts to study its impact on society; Guardian, March 18, 2018

Olivia Solon, Guardian; Data scandal is huge blow for Facebook – and efforts to study its impact on society

"The revelation that 50 million people had their Facebook profiles harvested so Cambridge Analytica could target them with political ads is a huge blow to the social network that raises questions about its approach to data protection and disclosure.


As Facebook executives wrangle on Twitter over the semantics of whether this constitutes a “breach”, the result for users is the same: personal data extracted from the platform and used for a purpose to which they did not consent.

Facebook has a complicated track record on privacy. Its business model is built on gathering data. It knows your real name, who your friends are, your likes and interests, where you have been, what websites you have visited, what you look like and how you speak."

Wednesday, March 7, 2018

Top priest shares ‘The Ten Commandments of A.I.’ for ethical computing; internet of business, February 28, 2018

Chris Middleton, internet of business; Top priest shares ‘The Ten Commandments of A.I.’ for ethical computing

"A senior clergyman and government advisor has written what he calls “the Ten Commandments of AI”, to ensure the technology is applied ethically and for social good.

AI has been put forward as the saviour of businesses and national economies, but how to ensure that the technology isn’t abused? The Rt Rev the Lord Bishop of Oxford (pictured below), a Member of the House of Lords Select Committee on Artificial Intelligence, set out his proposals at a policy debate in London, attended by representatives of government, academia, and the business world.

Speaking on 27 February at a Westminster eForum Keynote Seminar, Artificial Intelligence and Robotics: Innovation, Funding and Policy Priorities, the Bishop set out his ten-point plan, after chairing a debate on trust, ethics, and cybersecurity."

Ethics and AI conference launched by CMU, K&L Gates; Pittsburgh Business Times, March 6, 2018

Pittsburgh Business Times; Ethics and AI conference launched by CMU, K&L Gates

"The inaugural Carnegie Mellon University-K&L Gates Conference on Ethics and Artificial Intelligence is slated for April 9-10.

Leaders from industry, academia and government will explore ethical issues surrounding emerging technologies at the two-day event in Pittsburgh."

Monday, February 5, 2018

It’s Time to End the Scam of Flying Pets; New York Times, February 4, 2018

David Leonhardt, New York Times; It’s Time to End the Scam of Flying Pets

"The whole bizarre situation is a reminder of why trust matters so much to a well-functioning society. The best solution, of course, would be based not on some Transportation Department regulation but on simple trust. People who really needed service animals could then bring on them planes without having to carry documents.

Maybe a trust-based system will return at some point. But it won’t return automatically. When trust breaks down and small bits of dishonesty become normal, people need to make a conscious effort to restore basic decency."

Monday, January 22, 2018

As technology develops, so must journalists’ codes of ethics; Guardian, January 21, 2018

Paul Chadwick, Guardian; 

As technology develops, so must journalists’ codes of ethics


"AI collaboration poses ethical issues for, among others, courts that use it in sentencing, for operators of weapons systems, and for medical specialists. The potential benefits of AI, together with the widespread recognition that the accountability of AI decision-making matters greatly, give me confidence that the challenge of making AI accountable to humans will be met. Until it is, each collaboration requires attention. In journalism, the long-unchanging codes of ethics need to be revisited to make specific provision for this transitional era. A clause something like: “When using artificial intelligence to augment your journalism, consider its compatibility with the values of this code. Software that ‘thinks’ is increasingly useful, but it does not necessarily gather or process information ethically.”"

Wednesday, June 14, 2017

National Geographic Traveler Used My Photo for a Cover and Never Paid Me; PetaPixel, June 12, 2017

Mustafa Turgut, PetaPixel; National Geographic Traveler Used My Photo for a Cover and Never Paid Me

"After a couple of months of receiving no payment, I emailed them again asking them when they would be paying for the use of my photo on their cover.

They never responded to my email, and they have not responded to any contact attempt since then.

Frustrated, I began emailing the global National Geographic headquarters with my story. Although I have tried contacting headquarters over and over, I have yet to receive a single response.

I then began posting on National Geographic social media pages in 2013, but all of my posts were deleted shortly after I wrote them."

Friday, June 9, 2017

Security, Privacy, Trust Remain Challenges For The Internet Of Things; Intellectual Property Watch, June 7, 2017

Elise De Geyter, Intellectual Property Watch; Security, Privacy, Trust Remain Challenges For The Internet Of Things

"It is “amazing” what can be done via the internet and the Internet of Things is a “game changer,” a speaker said during the Internet of Things Week currently taking place in Geneva. Ninety percent of the data in the world has been created in the last two years. And the speed of data creation is still increasing, another speaker said."