
Sunday, November 24, 2019

Data leak reveals how China 'brainwashes' Uighurs in prison camps; BBC, November 24, 2019

BBC; Data leak reveals how China 'brainwashes' Uighurs in prison camps

"The leaked Chinese government documents, which the ICIJ have labelled "The China Cables", include a nine-page memo sent out in 2017 by Zhu Hailun, then deputy-secretary of Xinjiang's Communist Party and the region's top security official, to those who run the camps...

The memo includes orders to:
  • "Never allow escapes"
  • "Increase discipline and punishment of behavioural violations"
  • "Promote repentance and confession"
  • "Make remedial Mandarin studies the top priority"
  • "Encourage students to truly transform"
  • "[Ensure] full video surveillance coverage of dormitories and classrooms free of blind spots"
The documents reveal how every aspect of a detainee's life is monitored and controlled: "The students should have a fixed bed position, fixed queue position, fixed classroom seat, and fixed station during skills work, and it is strictly forbidden for this to be changed.

"Implement behavioural norms and discipline requirements for getting up, roll call, washing, going to the toilet, organising and housekeeping, eating, studying, sleeping, closing the door and so forth."...

The leaked documents also reveal how the Chinese government uses mass surveillance and a predictive-policing programme that analyses personal data."

Friday, February 16, 2018

Congress is worried about AI bias and diversity; Quartz, February 15, 2018

Dave Gershgorn, Quartz; Congress is worried about AI bias and diversity

"Recent research from the MIT Media Lab maintains that facial recognition is still significantly worse for people of color, however.
“This is not a small thing,” Isbell said of his experience. “It can be quite subtle, and you can go years and years and decades without even understanding you are injecting these kinds of biases, just in the questions that you’re asking, the data you’re given, and the problems you’re trying to solve.”
In his opening statement, Isbell talked about biased data in artificial intelligence systems today, including predictive policing and biased algorithms used in predicting recidivism rates.
“It does not take much imagination to see how being from a heavily policed area raises the chances of being arrested again, being convicted again, and in aggregate leads to even more policing of the same areas, creating a feedback loop,” he said. “One can imagine similar issues with determining fit for a job, or credit-worthiness, or even face recognition and automated driving.”"
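
The feedback loop Isbell describes can be made concrete with a small simulation. The sketch below is purely illustrative and not from the article: it assumes two areas with identical true crime rates, allocates patrols in proportion to past recorded arrests, and lets each patrol record crime only where it is sent. All parameter values (the crime rate, patrol count, and starting 55/45 split) are arbitrary assumptions.

    # Illustrative toy model of a predictive-policing feedback loop.
    # Assumptions (not from the article): two areas with IDENTICAL true
    # crime rates; patrols are allocated in proportion to past recorded
    # arrests; arrests recorded in an area scale with patrols sent there.
    import random

    random.seed(0)

    TRUE_CRIME_RATE = 0.1   # same in both areas, by assumption
    TOTAL_PATROLS = 100
    ROUNDS = 50

    arrests = [55, 45]      # arbitrary initial imbalance in recorded arrests

    for _ in range(ROUNDS):
        total = sum(arrests)
        # "Prediction": send patrols where past arrests were recorded.
        patrols = [round(TOTAL_PATROLS * a / total) for a in arrests]
        # Each patrol can only record crime where it is actually sent,
        # so the area with more patrols generates more arrest records.
        for i in range(len(arrests)):
            arrests[i] += sum(random.random() < TRUE_CRIME_RATE
                              for _ in range(patrols[i]))

    share = arrests[0] / sum(arrests)
    print(f"Area 0 share of recorded arrests after {ROUNDS} rounds: {share:.0%}")
    # Both areas have the same underlying crime rate, yet the recorded
    # data never converges back to 50/50: patrol allocation keeps
    # reproducing the historical imbalance it was trained on.

Because patrols follow recorded arrests rather than true crime, the system in this sketch has no way to discover that the two areas are identical; the early imbalance is locked in and can drift further apart, which is the feedback loop Isbell warns about.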