Privacy (137)
Find narratives by ethical themes or by technologies.
She Was Arrested at 14. Then Her Photo Went to a Biometrics Database
- 7 min
- The New York Times
- 2019
The NYPD's facial recognition system makes extensive use of children's arrest photos, even though the software is far less accurate on juvenile faces.
How can machine learning algorithms cause inequality to compound? Would it be better practice to try to make facial recognition equitable across all populations, or to abandon its use in law enforcement altogether, as some cities like Oakland have done?
Police across the US are training crime-predicting AIs on falsified data
- 5 min
- MIT Technology Review
- 2019
In New Orleans and other cities, the data used to train predictive-policing algorithms was inconsistent and "dirty" to begin with, so the resulting predictions disproportionately target already disadvantaged communities.
If the data we train algorithms on is inherently biased, can we ever truly get a "fair" algorithm? Can AI programs ever solve or remove human bias? What might happen if machines make important criminal justice decisions, such as sentence lengths?
In Event of Moon Disaster
- 10 min
- n/a
- 2018
Misinformation techniques, including deepfake video, are used to create a film depicting an alternative history in which the Apollo 11 mission failed and the astronauts were stranded on the Moon.
How do you foresee this type of misinformation feeding conspiracy theories? Do you believe you could have spotted the deepfake if you were not specifically looking for it? Are we approaching a future in which we must watch all media with such scrutiny?
Choose Your Own Fake News
- 15 min
- n/a
- 2018
A choose-your-own-adventure game in which you experience data fraud and fake news firsthand by playing through the perspectives of a cast of characters.
How can you be less vulnerable to fake news and fake advertising online?
Stealing Ur Feelings
- 7 min
- n/a
- 2018
An exploration of how digital products use facial and emotion recognition to decide what we may want or need, and what they can do with that data.
Did you feel your results truly reflected your perception of yourself? What are the consequences of a machine missing nuance and labeling you the wrong way?
Project X: Field of Vision
- 10 min
- Field of Vision
- 2017
A video criticizing the AT&T and NSA partnership, which allowed the NSA to spy on the UN, the World Bank, and other organizations by installing its surveillance equipment in AT&T hubs.
How do you know that your conversations are not being spied on right now? Would you like to have more reassurance?