AI (124)
Find narratives by ethical themes or by technologies.
Digital Performers and the Gift of Choice
- 7 min
- Kinolab
- 2013
In this film, Robin Wright plays a fictionalized version of herself, an actress whose popularity is declining. Her agent Al introduces her to deepfake technology that creates a virtual version of an actor to play roles in any number of scenarios or films. These “actors” are AI-driven 3D holograms trained to replicate the real people they imitate. However, Robin is disconcerted by the lack of agency she would have in deciding how her image and identity appeared in these movies.
What problems arise from the ability to manipulate another person’s body and likeness in a piece of media without their consent? Does technology like this actually have the potential to free actors from some of the constraints of the film industry, as Al says? How would acting be valued as an art, and actors paid accordingly and properly, if this technology became the norm?
-
Digital Memory Erasure and Brain Mapping
- 16 min
- Kinolab
- 2004
Joel Barish recently broke up with Clementine, his girlfriend of two years, in a brutal argument. After discovering that she has used a procedure known as Lacuna to erase him from her memories, Joel decides to undergo the same procedure to forget that he ever knew Clementine. The procedure uses a brain-computer interface to map the areas of Joel’s brain that are active whenever he has a memory of Clementine: first while he is awake, using associated objects to perform active recall, and then while he is asleep and subconsciously remembering her. Despite Joel’s eventual regrets and desperate attempts to remember Clementine, the procedure succeeds, and he forgets her. However, Joel and Clementine reunite in the real world after their respective procedures, and just as they get a fresh start, they end up listening to Clementine’s tape from before the procedure, in which she dissects all of the flaws of Joel and their relationship.
Is it possible to completely forget an event or a person in the digital age, or is there always the possibility that traces will remain? Do digital technologies hold memories well enough, or is there something more abstract about these memories that they cannot capture? How could the technology displayed here be abused? Does pervasive digital memory of people and events ever allow us to feel completely neutral about another person, and is this a departure from the pre-digital age? Do humans have an over-reliance on digital memory? How have relationships changed with the advent of digital memory?
-
As Cameras Track Detroit’s Residents, a Debate Ensues Over Racial Bias
- 10 min
- New York Times
- 2019
An examination of racial bias in facial recognition software used for government civil surveillance in Detroit. The racially biased technology diminishes the agency of minority groups and amplifies latent human bias.
What are the consequences of employing biased technologies to surveil citizens? Who loses agency, and who gains agency?
-
Quantifying Workers
- 27 min
- Cornell Tech
- 2019
A podcast about worker quantification in areas such as hiring, productivity, and more. It dives into why we should strive to make algorithms fair, and warns specifically that algorithms can find “proxy variables” that approximate characteristics like race or gender even when the algorithm is supposedly controlled for these factors.
What are the dangers of having an algorithm involved in the hiring process? Is efficiency worth the cost in this scenario? Can humans ever be placed in a binary context?
-
Survival of the Best Fit
- 10 min
- Survival of the Best Fit
- 2018
Explores AI hiring bias through a game in which you play the hiring manager.
How does it feel to be the one who has inserted bias into the algorithm? What steps do you feel must be taken to ensure algorithms are trained less hastily?
-
Stealing Ur Feelings
- 7 min
- n/a
- 2018
An exploration of how, through facial and emotion recognition, digital artifacts make decisions about what we may want or need, and what they are able to do with this data.
Did you feel your results truly reflected your perception of yourself? What are the consequences of a machine missing all sorts of nuances and labelling you in the wrong way?