AI (138)
Find narratives by ethical themes or by technologies.

Choose Your Own Fake News
- 15 min
- n/a
- 2018
A choose-your-own-adventure game in which you experience various forms of online fraud by taking on the roles of a cast of characters.
How can you be less vulnerable to fake news and fake advertising online?

Bot or Not?
- 6 min
- n/a
- 2018
Through a series of chat interactions and a truth-or-dare-style game, the user guesses whether they are chatting with a bot or a human.
Are you able to tell the difference between interacting with a bot and a human? How? What indicators did you rely on to make your decision?

Monster Match
- 15 min
- Hidden Switch
- 2018
A hands-on learning experience about the algorithms used in dating apps, seen through the perspective of a monster avatar you create.
How do algorithms in dating apps work? What gaps seemed most prominent to you? What upset you most about the way this algorithm defined you and the choices it offered you?

Survival of the Best Fit
- 10 min
- Survival of the Best Fit
- 2018
A game that explores AI hiring bias by putting you in the role of the hiring manager.
How does it feel to be in a situation in which you have introduced bias into the algorithm? What steps do you feel must be taken to ensure algorithms are trained in a less hasty manner?

Stealing Ur Feelings
- 7 min
- n/a
- 2018
An exploration of how, through facial and emotion recognition, digital artifacts make decisions about what we may want or need, and what they can do with this data.
Did you feel your results truly reflected your perception of yourself? What are the consequences of a machine missing all sorts of nuances and labelling you the wrong way?

Justice in the Age of Big Data
- 7 min
- TED
- 2017
Predictive policing software such as PredPol may claim to be objective through mathematical, "colorblind" analyses of geographical crime areas, yet this supposed objectivity is not free of human bias and is in fact used as a justification for the further targeting of oppressed groups, such as poor communities or racial and ethnic minorities. Further, the balance between fairness and efficacy in the justice system must be considered, since algorithms tend more toward the latter than the former.
Should we leave policing to algorithms? Can any "perfect" algorithm for policing be created? How can police departments and software companies be held accountable for masquerading bias as the objectivity of an algorithm?