AI (124)
Find narratives by ethical themes or by technologies.
Martha and Ash Part II: Digital Revival and Human Likeness in Hardware
- 9 min
- Kinolab
- 2013
At some point in the near future, Martha’s husband Ash dies in a car accident. To help Martha through the grieving process, her friend Sara gives Ash’s data to a company that creates an artificial intelligence program to simulate text and phone conversations between Martha and Ash. Eventually, this program is uploaded onto a robot with the exact likeness of the deceased Ash. Unsettled by the humanoid robot and its imperfect imitation of Ash’s personality, Martha wants nothing more than to keep it out of her sight.
How can memories be kept pure when robots are able to impersonate deceased loved ones? If programs and robots such as this can be created, do we truly own our own existence? How can artificial intelligence fail as therapy or companionship? Can artificial intelligence and robotics help comfort people who never even met the deceased? How should an artificial companion be handled by its administrator? Can an animated or robotic humanoid likeness of a person who seemingly has feelings be relegated to the attic as easily as other mementos?
-
Deep Reckonings
- 15 min
- Deep Reckonings
- 2018
A repository of explicitly marked deepfake videos in which controversial public figures own up to past mistakes, aggressions, or crimes.
Can you distinguish a fake video from a real one? Read the “About” tab to learn more about the motivations for this project. What is your response to their guiding question: “how might we use our synthetic selves to elicit our better angels”?
-
Martha and Ash Part I: Digital Revival and Human Likeness in Software
- 7 min
- Kinolab
- 2013
At some point in the near future, Martha’s husband Ash dies in a car accident. To help Martha through the grieving process, her friend Sara gives Ash’s data to a company that creates an artificial intelligence program to simulate text and phone conversations between Martha and Ash. Through the chatbot, Ash essentially goes on living: he can respond to Martha and grow as more memories are shared with the program.
How should programs like this be deployed, and who should be in charge of them? Do our online interactions capture our entire personality? Could such software validly be used for therapy, or is its very existence dangerous? Is it ethical to provide such a tangible way of disconnecting from reality, and are these interactions truly so different from interactions on social media?
-
Coded Bias: How Ignorance Enters Computer Vision
- 3 min
- Vimeo: Shalini Kantayya
- 2020
A brief visual example of computer vision applied to facial recognition: how these algorithms can be trained to recognize faces, and the dangers of biased data sets, such as those containing a disproportionate share of white men.
When thinking about computer vision in relation to projects such as the Aspire Mirror, what sorts of individual and systemic consequences arise for those whose faces biased computer vision programs do not easily recognize?
-
New bill would ban autoplay videos and endless scrolling
- 2 min
- The Verge
- 2019
In this very short narrative, the Social Media Addiction Reduction Technology (SMART) Act is presented in the context of social networks and concerns about digital addiction.
How do digital addiction mechanisms work, and what risks do they pose? How should digital content that can foster addiction be regulated?
-
These creepy fake humans herald a new age in AI
- 5 min
- MIT Technology Review
- 2021
The company Datagen serves as an example of a business that sells synthetic human faces (based on real scans) to other companies for use as AI training data.
Does it seem likely that synthetic human data can combat bias, or could it simply introduce more? Does this represent putting too much trust in machines?