Privacy (134)

  • 7 min
  • Kinolab
  • 2002
Retinal Scans and Immediate Identification

In the year 2054, the PreCrime police program is about to go national. At PreCrime, three clairvoyant humans known as “PreCogs” forecast future murders by streaming audiovisual data that reveals the surrounding details of each crime, including the names of the victims and perpetrators. John Anderton, the head of the PreCrime policing program, is himself named as a future perpetrator and must flee from his former employer. Because retinal-scanning biometric technology is ubiquitous, he is found quickly and must undergo an eye transplant to evade it. While he recovers in a run-down apartment, PreCrime officers deploy spider-shaped drones to scan the retinas of everyone in the building.

  • 10 min
  • New York Times
  • 2019
As Cameras Track Detroit’s Residents, a Debate Ensues Over Racial Bias

Racial bias in the facial recognition software used for government civil surveillance in Detroit: the racially biased technology diminishes the agency of minority groups and amplifies latent human bias.

  • 7 min
  • Vice
  • 2019
Academics Confirm Major Predictive Policing Algorithm is Fundamentally Flawed

An academic perspective on an algorithm created by PredPol to “predict crime.” Unless every single crime is reported, and unless police pursue all types of crimes committed by all people equally, it is impossible for a reinforcement learning system to predict crime itself. Rather, police find crimes in the same places they have been told to look for them, feeding the algorithm skewed data and allowing the unjust targeting of communities of color to continue on the strength of trust in the algorithm.
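
A minimal sketch of the feedback loop described above, with hypothetical numbers rather than anything from PredPol's actual model: two districts share the same true crime rate, but a crime only enters the records when a patrol observes it, and each day's patrol goes wherever the records show the most crime.

    import random

    random.seed(0)
    TRUE_CRIME_RATE = 0.5   # identical in both districts by construction
    recorded = [10, 5]      # district 1 starts with slightly fewer recorded crimes
    patrol_days = [0, 0]

    for _ in range(1000):
        # "Prediction" step: patrol the district with the most recorded
        # crime -- a stand-in for the algorithm's hotspot choice.
        target = 0 if recorded[0] >= recorded[1] else 1
        patrol_days[target] += 1
        # Crime occurs at the same rate in both districts, but only the
        # patrolled district's crime is observed and fed back into the data.
        if random.random() < TRUE_CRIME_RATE:
            recorded[target] += 1

    print(recorded)      # lopsided records despite identical true rates
    print(patrol_days)   # every patrol went to district 0

Because the records diverge permanently after the first comparison, the system ends up predicting where police have looked rather than where crime occurs.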

  • 3 min
  • CNET
  • 2019
Thanks to Equifax breach, 4 US agencies don’t properly verify your data, GAO finds

US government agencies rely on outdated identity-verification methods, increasing the risk of identity theft.

  • 7 min
  • The New York Times
  • 2019
She Was Arrested at 14. Then Her Photo Went to a Biometrics Database

Biometric facial recognition software used by the NYPD makes extensive use of children’s arrest photos, despite the technology’s far lower accuracy rate on juvenile faces.

  • 5 min
  • MIT Technology Review
  • 2019
Police across the US are training crime-predicting AIs on falsified data

In New Orleans, as in other cities, the data used to train predictive crime algorithms was inconsistent and “dirty” to begin with, so the resulting predictions disproportionately target disadvantaged communities.
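
A minimal sketch of how such dirty data propagates, again with hypothetical numbers: two neighborhoods have the same underlying offense rate, but one was historically subject to three times the enforcement, so its offenses appear three times as often in the training records.

    population = 10_000
    true_offense_rate = 0.10                 # identical in both neighborhoods
    recording_rate = {"A": 0.9, "B": 0.3}    # historical enforcement skew

    # The counts a model would be trained on mix true behavior with
    # recording bias, and the two are inseparable after the fact.
    training_counts = {
        hood: round(population * true_offense_rate * rate)
        for hood, rate in recording_rate.items()
    }
    risk_scores = {hood: n / population for hood, n in training_counts.items()}

    print(training_counts)  # {'A': 900, 'B': 300}
    print(risk_scores)      # A scored three times riskier than B, purely an
                            # artifact of how the data was collected

No amount of model tuning can correct the threefold gap, because the bias lives in the data rather than in the algorithm.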
