Themes (326)
Find narratives by ethical themes or by technologies.
Stealing Ur Feelings
- 7 min
- n/a
- 2018
Exploration of how, through facial and emotion recognition, digital artifacts make decisions about what we may want or need, and what they are able to do with this data.
Did you feel your results truly reflected your perception of yourself? What are the consequences of a machine missing all sorts of nuances and labelling you in the wrong way?
Project X: Field of Vision
- 10 min
- Field of Vision
- 2017
Video criticising the AT&T and NSA partnership, which allowed the NSA to spy on the UN, the World Bank, and others by installing its surveillance equipment in AT&T hubs.
How do you know that your conversations are not being spied on at this moment? Would you like to have more reassurance?
Public by Default: The Tales of Venmo
- 15 min
- n/a
- 2017
An exploration of Venmo transactions for five randomly selected users. On Venmo, all transactions are public by default, and users must take certain steps to make this information private.
If you have Venmo, do you have it on private or public mode? Have you ever thought about what having your transactions publicised could lead to? Who could use information like what you are hungry for or your relationships with other people, and how?
Justice in the Age of Big Data
- 7 min
- TED
- 2017
Predictive policing software such as PredPol may claim to be objective through mathematical, “colorblind” analyses of geographical crime areas, yet this supposed objectivity is not free of human bias and is in fact used as a justification for the further targeting of oppressed groups, such as poor communities or racial and ethnic minorities. Further, the balance between fairness and efficacy in the justice system must be considered, since algorithms tend more toward the latter than the former.
Should we leave policing to algorithms? Can any “perfect” algorithm for policing be created? How can police departments and software companies be held accountable for masquerading bias as the objectivity of an algorithm?
It’s Time to Switch to a Privacy Browser
- 7 min
- Wired
- 2019
Internet users should start considering private browsers such as DuckDuckGo to promote privacy and prevent personalized search results and ads. Many different pieces of software, including browsers by larger tech companies, are beginning to take this approach of erasing data, blocking outside tracking, or preventing cookies.
Consider if the privacy-oriented browsers described in the article were the default. Whose interests would this work towards? Whose interests would this work against?
How Biometrics Makes You Safer
- 5 min
- The New York Times
- 2019
In New York City, biometrics were used as one step in the investigation process, combined with human oversight, to help identify criminals and victims alike.
How does facial recognition technology facilitate challenging investigations? Do you believe police use of facial recognition is as transparent and pure as this article makes it seem? Where could bias enter this system of using facial recognition technology?