Privacy (134)
Find narratives by ethical themes or by technologies.
In Event of Moon Disaster
- 10 min
- n/a
- 2018
Techniques of misinformation are used to make a film about an alternative history in which the Apollo 11 mission failed and the astronauts became stranded on the moon.
How do you foresee this type of misinformation being related to conspiracy theories? Do you believe you could have spotted the deepfake were you not specifically looking out for it? Are we approaching a future where we may have to watch all media with such scrutiny?
Choose Your Own Fake News
- 15 min
- n/a
- 2018
A choose-your-own-adventure game in which you experience data fraud by playing through the roles of a cast of characters.
How can you be less vulnerable to fake news and fake advertising online?
Stealing Ur Feelings
- 7 min
- n/a
- 2018
An exploration of how, through facial and emotion recognition, digital artifacts make decisions about what we may want or need, and what they are able to do with this data.
Did you feel your results truly reflected your perception of yourself? What are the consequences of a machine missing all sorts of nuances and labelling you in the wrong way?
Project X: Field of Vision
- 10 min
- Field of Vision
- 2017
A video criticising the AT&T and NSA partnership, which allowed the NSA to spy on the UN, the World Bank, and other organisations by installing its surveillance equipment in AT&T hubs.
How do you know that your conversations are not being spied on at the moment? Would you like to have more reassurance?
Public by Default: The Tales of Venmo
- 15 min
- n/a
- 2017
An exploration of the Venmo transactions of five randomly selected users. On Venmo, all transactions are public by default, and users can take certain steps to make this information private.
If you have Venmo, do you have it on private or public mode? Have you ever thought about what making your transactions public could lead to? Who could use information such as what you are hungry for or your relationships with other people, and how?
Justice in the Age of Big Data
- 7 min
- TED
- 2017
Predictive policing software such as PredPol may claim to be objective through mathematical, “colorblind” analyses of geographical crime areas, yet this supposed objectivity is not free of human bias and is in fact used as a justification for the further targeting of oppressed groups, such as poor communities or racial and ethnic minorities. Further, the balance between fairness and efficacy in the justice system must be considered, since algorithms tend more toward the latter than the former.
Should we leave policing to algorithms? Can any “perfect” algorithm for policing be created? How can police departments and software companies be held accountable for masquerading bias as the objectivity of an algorithm?