Showing posts with label killer robots. Show all posts

Tuesday, July 11, 2023

Unknown: Killer Robots review – the future of AI will fill you with unholy terror; The Guardian, July 10, 2023

The Guardian; Unknown: Killer Robots review – the future of AI will fill you with unholy terror

"The dilemma surrounding almost all military inventions – perhaps almost all inventions full stop – is what is slightly grandly called “the dual use problem”. On the one hand, you’ve got drones and robots who can clear buildings without risking soldiers’ lives. On the other, you can weaponise them, autonomise them and use them to take out entire villages without anyone getting their hands dirty. What is that sense of detachment likely to do to the level of carnage in a war overall? Former US defense secretary Bob Work doesn’t think “human intervention in kill decisions” will ever change. I cannot help but pause for a moment to suggest, respectfully, that either the good colonel has never met humanity or that he is the programme’s equivalent of the flight attendant urging people to keep calm as the passenger jet plummets to its fiery doom."

Monday, December 5, 2022

Opinion I’ll say it: I do not think killer robots are a good idea; The Washington Post, December 3, 2022

The Washington Post; Opinion: I’ll say it: I do not think killer robots are a good idea

"I am just going to go ahead and say it: I do not think that killer robots are a good idea.

I know that the San Francisco Police Department wants to have killer robots. But I think, sometimes, you do not need to give people what they want. Especially if what they want is killer robots.

I understand that this remark is controversial. But what are columnists for, if not to take these bold stances? So I will say it again: I, for one, think that killer robots are bad. I do not think the robots should kill. I think if you are going to draw a line someplace, killer robots should be on the other side of the line...

Nobody in anything that I have read or seen in real life or fiction sees a killer robot and says, “Ah, good! The killer robot is here, to kill!” And I think there is a reason for that. I would venture that reason is: Killer robots are bad."

Can police use robots to kill? San Francisco voted yes.; The Washington Post, November 30, 2022

The Washington Post; Can police use robots to kill? San Francisco voted yes.

"Adam Bercovici, a law enforcement expert and former Los Angeles Police Department lieutenant, told The Post that while policies for robotic lethal force must be carefully written, they could be useful in rare situations. He referenced an active-shooter scenario like the one Dallas officers encountered.

“If I was in charge, and I had that capability, it wouldn’t be the first on my menu,” he said. “But it would be an option if things were really bad.”

Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project, worried that San Francisco could instead end up setting a dangerous precedent.

“In my knowledge, this would be the first city to take this step of passing a law authorizing killer robots,” Cahn told The Post.

Cahn expressed concern that the legislation would lead other departments to push for similar provisions, or even to the development of more weaponized robots. In the aftermath of the school shooting in Uvalde, Tex., the police equipment company Axon announced plans to develop drones equipped with Tasers to incapacitate school shooters but canned the idea after nine members of the company’s artificial-intelligence ethics advisory board resigned in protest."

Wednesday, January 5, 2022

Killer Robots Aren’t Science Fiction. A Push to Ban Them Is Growing.; The New York Times, December 17, 2021

Adam Satariano and Nick Cumming-Bruce, The New York Times; Killer Robots Aren’t Science Fiction. A Push to Ban Them Is Growing.

A U.N. conference made little headway this week on limiting development and use of killer robots, prompting stepped-up calls to outlaw such weapons with a new treaty.

"It may have seemed like an obscure United Nations conclave, but a meeting this week in Geneva was followed intently by experts in artificial intelligence, military strategy, disarmament and humanitarian law.

The reason for the interest? Killer robots — drones, guns and bombs that decide on their own, with artificial brains, whether to attack and kill — and what should be done, if anything, to regulate or ban them.

Once the domain of science fiction films like the “Terminator” series and “RoboCop,” killer robots, more technically known as Lethal Autonomous Weapons Systems, have been invented and tested at an accelerated pace with little oversight. Some prototypes have even been used in actual conflicts.

The evolution of these machines is considered a potentially seismic event in warfare, akin to the invention of gunpowder and nuclear bombs."

Friday, December 18, 2015

Ethics on the near-future battlefield; Bulletin of the Atomic Scientists, December 17, 2015

Michael L. Gross, Bulletin of the Atomic Scientists; Ethics on the near-future battlefield:
"The US Army’s recent report “Visualizing the Tactical Ground Battlefield in the Year 2050” describes a number of future war scenarios that raise vexing ethical dilemmas. Among the many tactical developments envisioned by the authors, a group of experts brought together by the US Army Research laboratory, three stand out as both plausible and fraught with moral challenges: augmented humans, directed-energy weapons, and autonomous killer robots. The first two technologies affect humans directly, and therefore present both military and medical ethical challenges. The third development, robots, would replace humans, and thus poses hard questions about implementing the law of war without any attending sense of justice...
As we search for answers to these questions, we must remain wary of placing too much stock in technology. Contemporary armed conflict amply demonstrates how relatively weak guerrillas, insurgents, and terrorists find novel ways to overcome advanced technologies through such relatively low-tech tactics as suicide bombings, improvised explosive devices, human shields, hostage taking, and propaganda. There is little doubt that these tactics gain purchase because many state armies endeavor to embrace the “laws of humanity and the requirements of the public conscience,” and, as democracies, often choose to fight with one hand tied behind their backs. The emerging technologies that will accompany future warfare only sharpen this dilemma, particularly as asymmetric war intensifies and some inevitably ask whether killer robots lacking a sense of justice might not be such a bad thing after all."