Biometrics (35)
Find narratives by ethical themes or by technologies.
Preventative Policing and Surveillance Information
- 13 min
- Kinolab
- 2002
In the year 2054, the PreCrime police program is about to go national. At PreCrime, three clairvoyant humans known as “PreCogs” forecast future murders by streaming audiovisual data that reveals the surrounding details of each crime, including the names of the victims and perpetrators. Although there are no cameras, the implication is that anyone can be under constant surveillance by this program. Once the “algorithm” has gleaned enough data about the future crime, officers move out to stop the murder before it happens.
How should predicted crime be prosecuted? Should it be prosecuted at all? How could technologies such as the ones shown here be distorted by human bias? Could these devices entrench racist policing practices, and would certain communities be disproportionately targeted? Is there ever any justification for constant civilian surveillance?
-
Retinal Scans and Immediate Identification
- 7 min
- Kinolab
- 2002
In the year 2054, the PreCrime police program is about to go national. At PreCrime, three clairvoyant humans known as “PreCogs” forecast future murders by streaming audiovisual data that reveals the surrounding details of each crime, including the names of the victims and perpetrators. Joe Anderson, the former head of the PreCrime policing program, is named as a future perpetrator and must flee from his former employer. Because retinal scanning biometric technology is widespread, he is found quickly and must undergo an eye transplant. While he recovers in a run-down apartment, PreCrime officers deploy spider-shaped drones to scan the retinas of everyone in the building.
Is it possible that people would consent to having their retinas scanned in public places if it meant a more personalized experience of those spaces? Should governments be able to deceive people into giving up their private data, as social media companies already do? How can people protect themselves from retinal scanning and other biometric identification technologies, on both small and large scales?
-
Personal Statistics Tracking
- 3 min
- Kinolab
- 2017
Eleanor Shellstrop, a selfish woman, ended up in the utopian afterlife The Good Place by mistake after her death. She spins an elaborate web of lies to ensure that she is not sent to be tortured in The Bad Place. In this narrative, she tracks her personal ethical point total with a technology compared to a Fitbit: in theory, the more good actions she completes, the higher her score gets. For another narrative on personal ratings and point tracking, see “Lacie Parts I and II” from the Black Mirror episode “Nosedive.”
Do corrupt motivations spoil moral deeds? Should digital technologies be used to track personal data more abstract than health statistics or step counts? What would be the consequences if such ratings were public?
-
Limitations of Biometrics
- 1 min
- Kinolab
- 2019
In an imagined future London, citizens across the globe are connected to the Feed, a device and network accessed constantly through a brain-computer interface. Eric uses biometrics to hold Evelyn and Max hostage and gain high-level access to the Feed hub. This illustrates how computerized security systems may fail to detect hostage situations or coerced activity: the biometric system can recognize their faces, but cannot register the distress visible on Max and Evelyn’s faces that indicates they are in trouble.
Should biometrics be fully trusted with security measures? What shortfalls of this approach does this narrative demonstrate?
-
Judge approves $650m settlement for Facebook users in privacy, biometrics lawsuit
- ZDNet
- 2021
Facebook’s use of biometrics to develop facial recognition came under scrutiny from those skeptical of the company’s protection of users’ privacy. Facebook has agreed to a $650 million settlement to close the lawsuit over this issue.
What role should the government play in establishing precedent for privacy violations by technology companies?