Computer Vision (40)
Taser Maker Says It Won't Use Biometrics in Bodycams
- 5 min
- Wired
- 2019
Axon's novel use of an ethics committee led to a decision not to deploy facial recognition on the body cameras it provides to police departments, on the basis of latent racial bias and privacy concerns. While this is a beneficial step, companies and government offices at multiple levels continue to debate when and how facial recognition should be deployed and limited.
Should facial recognition ever be used in police body cameras, even if the technology theoretically evolves to eliminate bias? How can citizens and governments gain more power to limit facial recognition and to enforce more widespread use of ethics boards?
Artificial Intelligence Is Coming For Our Faces
- 2 min
- Wired
- 2019
Synthetic human faces can now be generated by machine learning algorithms.
What are some consequences of AI being able to render fake yet believable human faces?
Stealing Ur Feelings
- 7 min
- n/a
- 2018
An exploration of how, through facial and emotion recognition, digital products make decisions about what we may want or need, and what they can do with this data.
Did you feel your results truly reflected your perception of yourself? What are the consequences of a machine missing all sorts of nuances and labelling you the wrong way?
Deep Reckonings
- 15 min
- Deep Reckonings
- 2018
A repository of explicitly marked deepfake videos in which controversial public figures own up to past mistakes, aggressions, or crimes.
Can you tell a fake video from a real one? Read the “About” tab to learn more about the motivations behind this project. What is your response to its guiding question: “how might we use our synthetic selves to elicit our better angels”?
In Event of Moon Disaster
- 10 min
- n/a
- 2018
Misinformation techniques are used to create a film depicting an alternative history in which the Apollo 11 mission failed and the astronauts were stranded on the moon.
How do you foresee this type of misinformation feeding into conspiracy theories? Do you believe you could have spotted the deepfake if you were not specifically looking for it? Are we approaching a future in which we must watch all media with such scrutiny?
Amazon’s Ring is the largest civilian surveillance network the US has ever seen
- 5 min
- The Guardian
- 2021
Amazon’s Ring devices are creating a private network of video surveillance that governments and other public entities can access without a warrant.
How might home security devices impact citizenship? What are the risks of ubiquitous deployment of home surveillance systems? How does this narrative demonstrate the compounding of human and machine biases?