Showing posts with label tech companies.
Friday, March 20, 2020
Jennifer Rankin, The Guardian; Russian media ‘spreading Covid-19 disinformation’
Leaked EU report says pro-Kremlin outlets seeking to aggravate public health crisis
"“Whoever is spreading the disinformation is essentially playing with people’s lives,” Stano said. “Every responsible social media or media user should be aware of this: that there is a lot of misinformation circulating around … Double check, triple check, go to a media you really trust and look at the sources.”"
Monday, October 28, 2019
The biggest lie tech people tell themselves — and the rest of us; Vox, October 8, 2019
Rose Eveleth, Vox; The biggest lie tech people tell themselves — and the rest of us
They see facial recognition, smart diapers, and surveillance devices as inevitable evolutions. They’re not.
"With great power comes great responsibility
Often consumers don’t have much power of selection at all. Those who run small businesses find it nearly impossible to walk away from Facebook, Instagram, Yelp, Etsy, even Amazon. Employers often mandate that their workers use certain apps or systems like Zoom, Slack, and Google Docs. “It is only the hyper-privileged who are now saying, ‘I’m not going to give my kids this,’ or, ‘I’m not on social media,’” says Rumman Chowdhury, a data scientist at Accenture. “You actually have to be so comfortable in your privilege that you can opt out of things.”
And so we’re left with a tech world claiming to be driven by our desires when those decisions aren’t ones that most consumers feel good about. There’s a growing chasm between how everyday users feel about the technology around them and how companies decide what to make. And yet, these companies say they have our best interests in mind. We can’t go back, they say. We can’t stop the “natural evolution of technology.” But the “natural evolution of technology” was never a thing to begin with, and it’s time to question what “progress” actually means."
Thursday, October 24, 2019
‘Don’t leave campus’: Parents are now using tracking apps to watch their kids at college; The Washington Post, October 22, 2019
Abby Ohlheiser, The Washington Post; ‘Don’t leave campus’: Parents are now using tracking apps to watch their kids at college
"Many parents install tracking apps with good intentions, said Stacey Steinberg, a law professor at the University of Florida who has studied how technology impacts raising families and privacy. “We don’t want our kids to screw up,” she said. “We don’t want them to get hurt. Technology offers us new ways to feel like we are protecting them — both from others and from themselves.
“But kids need autonomy from their parents, especially when they reach adulthood,” Steinberg added. “If we want our kids to trust us, if we want our kids to believe they are capable of making wise decisions, then our actions need to show it. Valuing their privacy is one way to do so.”"
The Black-and-White World of Big Tech; The New York Times, October 24, 2019
Kara Swisher, The New York Times; The Black-and-White World of Big Tech
Mark Zuckerberg presented us with an either-or choice of free speech — either we have it or we’re China. Tech leaders have a duty to admit it’s much more complicated.
"Mr. Zuckerberg presented us with an either-or choice of free speech — either we have free speech or we’re China. “Whether you like Facebook or not, I think we need to come together and stand for voice and free expression,” he said with an isn’t-this-obvious tone.
But, as anyone who has lived in the real world knows, it’s much more complex. And that was the main problem with his speech — and it’s also what is at the crux of the myriad concerns we have with tech these days: Big Tech’s leaders frame the debate in binary terms.
Mr. Zuckerberg missed an opportunity to recognize that there has been a hidden and high price of all the dazzling tech of the last decade. In offering us a binary view for considering the impact of their inventions, many digital leaders avoid thinking harder about the costs of technological progress."
Tuesday, October 22, 2019
Under digital surveillance: how American schools spy on millions of kids; The Guardian, October 22, 2019
Lois Beckett, The Guardian; Under digital surveillance: how American schools spy on millions of kids
"The new school surveillance technology doesn’t turn off when the school day is over: anything students type in official school email accounts, chats or documents is monitored 24 hours a day, whether students are in their classrooms or their bedrooms.
Tech companies are also working with schools to monitor students’ web searches and internet usage, and, in some cases, to track what they are writing on public social media accounts.
Parents and students are still largely unaware of the scope and intensity of school surveillance, privacy experts say, even as the market for these technologies has grown rapidly, fueled by fears of school shootings, particularly in the wake of the Parkland shooting in February 2018, which left 17 people dead."
Saturday, October 19, 2019
Mark Zuckerberg doesn’t understand free speech in the 21st century; The Guardian, October 18, 2019
Siva Vaidhyanathan, The Guardian; Mark Zuckerberg doesn’t understand free speech in the 21st century
"The problem of the 21st century is cacophony. Too many people are yelling at the same time. Attentions fracture. Passions erupt. Facts crumble. It’s increasingly hard to deliberate deeply about complex crucial issues with an informed public. We have access to more knowledge yet we can’t think and talk like adults about serious things...
The thing is, a thriving democracy needs more than motivation, the ability to find and organize like-minded people. Democracies also need deliberation. We have let the institutions that foster discussion among well informed, differently-minded people crumble. Soon all we will have left is Facebook. Look at Myanmar to see how well that works."
Thursday, October 17, 2019
How to Stop the Abuse of Location Data; The New York Times, October 16, 2019
Jeff Glueck, The New York Times; How to Stop the Abuse of Location Data
There are no formal rules for what is ethical — or even legal — in the location data business. That needs to change.
"Companies should have to maintain data with adequate security protections, including encryption at rest and in transit. Employees at companies that collect data on millions of consumers should undergo privacy and ethics training. Companies should require clients and other people who use the data to promise that they will not use the tech and data for unethical or discriminatory practices — and should penalize those that act unethically. Regulation should force companies to create ethics committees where management and employees must discuss their privacy and ethical data use policies regularly."
Wednesday, October 2, 2019
Congress and Trump Agreed They Want a National Privacy Law. It Is Nowhere in Sight.; The New York Times, October 1, 2019
David McCabe, The New York Times; Congress and Trump Agreed They Want a National Privacy Law. It Is Nowhere in Sight.
"But after months of talks, a national privacy law is nowhere in sight...
The struggle to regulate consumer data shows how lawmakers have largely been unable to turn rage at Silicon Valley’s practices into concrete action...
But the fervor to crack down on Silicon Valley has produced only a single new law, a bill to prevent sex trafficking online...
The United States has some laws that protect consumers’ privacy, like medical information collected by a doctor. But Congress has never set an overarching national standard for how most companies gather and use data. Regulators in Europe, in contrast, put strict new privacy rules into effect last year.
Many tech companies built lucrative businesses off their users’ personal information, often by offering a “free” product in return."
Saturday, September 14, 2019
Orwellabama? Crimson Tide Track Locations to Keep Students at Games; The New York Times, September 12, 2019
Billy Witz, The New York Times; Orwellabama? Crimson Tide Track Locations to Keep Students at Games
Coach Nick Saban gets peeved at students leaving routs early. An app ties sticking around to playoff tickets, but also prompts concern from students and privacy watchdogs.
"Greg Byrne, Alabama’s athletic director, said privacy concerns rarely came up when the program was being discussed with other departments and student groups. Students who download the Tide Loyalty Points app will be tracked only inside the stadium, he said, and they can close the app — or delete it — once they leave the stadium. “If anybody has a phone, unless you’re in airplane mode or have it off, the cellular companies know where you are,” he said."
Wednesday, September 4, 2019
The Ethics of Hiding Your Data From the Machines; Wired, August 22, 2019
Molly Wood, Wired; The Ethics of Hiding Your Data From the Machines
"In the case of the company I met with, the data collection they’re doing is all good. They want every participant in their longitudinal labor study to opt in, and to be fully informed about what’s going to happen with the data about this most precious and scary and personal time in their lives.
But when I ask what’s going to happen if their company is ever sold, they go a little quiet."
Thursday, April 18, 2019
'Disastrous' lack of diversity in AI industry perpetuates bias, study finds; The Guardian, April 16, 2019
Kari Paul, The Guardian; 'Disastrous' lack of diversity in AI industry perpetuates bias, study finds
"Lack of diversity in the artificial intelligence field has reached “a moment of reckoning”, according to new findings published by a New York University research center. A “diversity disaster” has contributed to flawed systems that perpetuate gender and racial biases, found the survey, published by the AI Now Institute, of more than 150 studies and reports.
The AI field, which is overwhelmingly white and male, is at risk of replicating or perpetuating historical biases and power imbalances, the report said. Examples cited include image recognition services making offensive classifications of minorities, chatbots adopting hate speech, and Amazon technology failing to recognize users with darker skin colors. The biases of systems built by the AI industry can be largely attributed to the lack of diversity within the field itself, the report said...
The report released on Tuesday cautioned against addressing diversity in the tech industry by fixing the “pipeline” problem, or the makeup of who is hired, alone. Men currently make up 71% of the applicant pool for AI jobs in the US, according to the 2018 AI Index, an independent report on the industry released annually. The AI institute suggested additional measures, including publishing compensation levels for workers publicly, sharing harassment and discrimination transparency reports, and changing hiring practices to increase the number of underrepresented groups at all levels."
Monday, April 15, 2019
EU approves tougher EU copyright rules in blow to Google, Facebook; Reuters, April 15, 2019
Foo Yun Chee, Reuters; EU approves tougher EU copyright rules in blow to Google, Facebook
"Under the new rules, Google and other online platforms will have to sign licensing agreements with musicians, performers, authors, news publishers and journalists to use their work.
The European Parliament gave a green light last month to a proposal that has pitted Europe’s creative industry against tech companies, internet activists and consumer groups."
Tuesday, April 9, 2019
Real or artificial? Tech titans declare AI ethics concerns; AP, April 7, 2019
Matt O'Brien and Rachel Lerman, AP; Real or artificial? Tech titans declare AI ethics concerns
"The biggest tech companies want you to know that they’re taking special care to ensure that their use of artificial intelligence to sift through mountains of data, analyze faces or build virtual assistants doesn’t spill over to the dark side.
But their efforts to assuage concerns that their machines may be used for nefarious ends have not been universally embraced. Some skeptics see it as mere window dressing by corporations more interested in profit than what’s in society’s best interests.
“Ethical AI” has become a new corporate buzz phrase, slapped on internal review committees, fancy job titles, research projects and philanthropic initiatives. The moves are meant to address concerns over racial and gender bias emerging in facial recognition and other AI systems, as well as address anxieties about job losses to the technology and its use by law enforcement and the military.
But how much substance lies behind the increasingly public ethics campaigns? And who gets to decide which technological pursuits do no harm?"
Monday, April 8, 2019
Are big tech’s efforts to show it cares about data ethics another diversion?; The Guardian, April 7, 2019
John Naughton, The Guardian; Are big tech’s efforts to show it cares about data ethics another diversion?
"No less a source than Gartner, the technology analysis company, for example, has also sussed it and indeed has logged “data ethics” as one of its top 10 strategic trends for 2019...
Google’s half-baked “ethical” initiative is par for the tech course at the moment. Which is only to be expected, given that it’s not really about morality at all. What’s going on here is ethics theatre modelled on airport-security theatre – ie security measures that make people feel more secure without doing anything to actually improve their security.
The tech companies see their newfound piety about ethics as a way of persuading governments that they don’t really need the legal regulation that is coming their way. Nice try, boys (and they’re still mostly boys), but it won’t wash.
Postscript: Since this column was written, Google has announced that it is disbanding its ethics advisory council – the likely explanation is that the body collapsed under the weight of its own manifest absurdity."
Thursday, April 4, 2019
THE PROBLEM WITH AI ETHICS; The Verge, April 3, 2019
James Vincent, The Verge; THE PROBLEM WITH AI ETHICS
Is Big Tech’s embrace of AI ethics boards actually helping anyone?
"Part of the problem is that Silicon Valley is convinced that it can police itself, says Chowdhury.
“It’s just ingrained in the thinking there that, ‘We’re the good guys, we’re trying to help,’” she says. The cultural influences of libertarianism and cyberutopianism have made many engineers distrustful of government intervention. But now these companies have as much power as nation states without the checks and balances to match. “This is not about technology; this is about systems of democracy and governance,” says Chowdhury. “And when you have technologists, VCs, and business people thinking they know what democracy is, that is a problem.”
The solution many experts suggest is government regulation. It’s the only way to ensure real oversight and accountability. In a political climate where breaking up big tech companies has become a presidential platform, the timing seems right."
Tuesday, March 19, 2019
Myspace loses all content uploaded before 2016; The Guardian, March 18, 2019
Alex Hern, The Guardian; Myspace loses all content uploaded before 2016
Faulty server migration blamed for mass deletion of songs, photos and video
"Myspace, the once mighty social network, has lost every single piece of content uploaded to its site before 2016, including millions of songs, photos and videos with no other home on the internet.
The company is blaming a faulty server migration for the mass deletion, which appears to have happened more than a year ago, when the first reports appeared of users unable to access older content. The company has confirmed to online archivists that music has been lost permanently, dashing hopes that a backup could be used to permanently protect the collection for future generations...
Some have questioned how the embattled company, which was purchased by Time Inc in 2016, could make such a blunder."
Friday, March 15, 2019
I Almost Died Riding an E-Scooter; Slate, March 14, 2019
Rachel Withers, Slate; I Almost Died Riding an E-Scooter
Like 99 percent of users, I wasn’t wearing a helmet.
"I’ve been rather flippant with friends about what happened because it’s the only way I know how to deal. It’s laughable that you’d get seriously injured scooting. But this isn’t particularly funny. People are always going to be idiots, yes, but idiot people are currently getting seriously injured, in ways that might have been prevented, because tech companies flippantly dumped their product all over cities, without an adequate helmet solution. Facebook’s “move fast and break things” mantra can be applied to many tech companies, but in the case of e-scooters, it might just be “move fast and break skulls.”"
Wednesday, March 13, 2019
Mark Zuckerberg And The Tech World Still Do Not Understand Ethics; Forbes, March 11, 2019
Derek Lidow, Forbes; Mark Zuckerberg And The Tech World Still Do Not Understand Ethics
"Why the widespread blindness to the ethical and social dangers of tech startups specifically? Here are five of the principal causes:
Tech startups see themselves as saviors of the world...
Complex technology and tech business models deflect investor due diligence...
Expectations for technology startups encourage expedient, not ethical, decision making...
We’ve fetishized disruption...
Tech promises founders and investors vast—vast—amounts of money."
Tuesday, February 19, 2019
Drones and big data: the next frontier in the fight against wildlife extinction; The Guardian, February 18, 2019
Anthea Lipsett, The Guardian; Drones and big data: the next frontier in the fight against wildlife extinction
"Yet it’s not more widely used because few researchers have the skills to use this type of technology. In biology, where many people are starting to use drones, few can code an algorithm specifically for their conservation or research problem, Wich says. “There’s a lot that needs to be done to bridge those two worlds and to make the AI more user-friendly so that people who can’t code can still use the technology.”
The solutions are more support from tech companies, better teaching in universities to help students overcome their fears of coding, and finding ways to link technologies together in an internet-of-things concept where all the different sensors, including GPS, drones, cameras and sensors, work together."
Thursday, February 14, 2019
Parkland school turns to experimental surveillance software that can flag students as threats; The Washington Post, February 13, 2019
Drew Harwell, The Washington Post; Parkland school turns to experimental surveillance software that can flag students as threats
"The specter of student violence is pushing school leaders across the country to turn their campuses into surveillance testing grounds on the hope it’ll help them detect dangerous people they’d otherwise miss. The supporters and designers of Avigilon, the AI service bought for $1 billion last year by tech giant Motorola Solutions, say its security algorithms could spot risky behavior with superhuman speed and precision, potentially preventing another attack.
But the advanced monitoring technologies ensure that the daily lives of American schoolchildren are subjected to close scrutiny from systems that will automatically flag certain students as suspicious, potentially spurring a response from security or police forces, based on the work of algorithms that are hidden from public view.
The camera software has no proven track record for preventing school violence, some technology and civil liberties experts argue. And the testing of their algorithms for bias and accuracy — how confident the systems are in identifying possible threats — has largely been conducted by the companies themselves."