Themes (326)
Find narratives by ethical themes or by technologies.
- 15 min
- Kinolab
- 2019
Relationships and Exploration of Identity in Virtual Worlds
Danny, Karl, and Theo are a trio of friends who once lived together. In their adult years, after Danny and Theo have married, Karl gives Danny the latest release of their favorite fighting video game, Striking Vipers X. In this virtual reality simulation, Danny and Karl become their avatars, Lance and Roxette respectively, and feel pain and pleasure in the virtual world through them. After the avatars form an intimate connection within the game, Danny, Theo, and Karl must negotiate new terms of their relationships with one another, struggling to determine whether connections in the virtual world can coexist with connections in the real world.
How do virtual reality worlds allow humans to explore aspects of their sexual or gender identities that they may not have the opportunity to discover in the real world? Do the seemingly limitless possibilities for connection in digital technologies and virtual realities inherently threaten the landscape of long-term relationships? Should concepts such as sex be built into virtual realities? If the reality is assumed to be fake and the avatars distinct from their controllers, do Danny’s actions count as infidelity?
-
- 15 min
- Kinolab
- 2017
Online Dating Algorithms
In a world in which the program Coach determines the pairing and duration of romantic matches, Frank and Amy are matched more than once and eventually fall in love after failed matches with other people. After Frank breaks a promise to Amy by checking the expiry date automatically assigned to every relationship, they temporarily break up. After reuniting, they set out to discover the truth of their reality and the meaning of their match.
Should machine learning algorithms, even the most sophisticated ones, be trusted with deeply emotional matters like love? Can simulations and algorithms account for everything in a person’s experience of love? How could the algorithmic bias present in real-life matching programs enter the virtual reality system shown here? How can advanced simulations be distinguished from reality? Has the digital age moved the dating experience firmly past the “old days” of falling in love, and should this be embraced?
-
- 11 min
- Kinolab
- 2017
Technological Tortures and Traps
Museum curator Rolo shows off his exhibition of Clayton, a former death row inmate whose consciousness was digitized during one of Rolo’s experiments. Despite evidence of his innocence, Clayton was put to death, and his digitally immortal consciousness was subjected to torture inside Rolo’s museum: guests could simulate the electric chair shock on the holographic Clayton, eventually leaving him in a conscious but vegetative state. Clayton’s daughter Nish shows up to settle the score, trapping Rolo in an eternal state of torture inside a small digital device.
How can people protect their digital consciousness after they pass away? Can anyone ever be fully trusted to handle code or programs that represent someone else’s existence or consciousness? How does the existence of racial bias and violence make the concept of eternal digital consciousness far more harrowing?
-
- 6 min
- Kinolab
- 2017
Digitally Immortal Vessels and Eternity
After his wife Carrie dies, Jack first has her consciousness uploaded to his own brain as code. Once this arrangement proves unworkable, he has the code of her consciousness transferred into a digital monkey toy, which is given to their son Parker so that Carrie can continue to spend time with him. However, Carrie can communicate only in a binary manner, with access to just two phrases expressing happiness or unhappiness.
How can developers of digital immortality technology ensure that it is ethical from the outset? Can such technology ever be “piloted” when lives are at stake? How can people ensure that digital lives do not last for a true eternity, especially if those existences are mundane? How can humans keep control of their existence in scenarios such as this?
-
- 9 min
- Kinolab
- 2017
Consequences of Digital Hyperempathy
In a short vignette told by a museum curator, a doctor named Dawson devises a brain-computer interface that allows him to feel the physical sensations of patients in order to deliver quicker diagnoses. However, his ownership of this technology bizarrely warps his psychology, putting himself and others in danger.
How should technology that allows us to feel the sensations of others be regulated? What are the pros and cons of hyperempathy technology? How can autonomy over one’s own body be ensured when technology like this exists?
-
- 12 min
- Kinolab
- 1968
HAL Part II: Vengeful AI, Digital Murder, and System Failures
See HAL Part I for further context. In this narrative, astronauts Dave and Frank begin to suspect that HAL, the AI which runs their ship, is malfunctioning and must be shut down. Although they try to hide this conversation from HAL, he becomes aware of their plan anyway and attempts to protect himself so that the Discovery mission is not jeopardized. He does so by causing chaos on the ship, leveraging his connection to its internet of things to place the crew in danger. Eventually, Dave proceeds with his plan to shut HAL down, despite HAL’s protestations and desire to stay alive.
Can AIs have lives of their own which humans should respect? Is it “murder” if a human deactivates an AI against its will, even if this will to live was programmed by another human? What are the ethical implications of removing an AI’s “high brain function” and leaving just the rote task programming? Is this a form of murder too? How can secrets be kept private from an AI, especially if people fail to understand all of the machine’s capabilities?