Robots (54)
Find narratives by ethical themes or by technologies.
-
- 6 min
- Kinolab
- 2019
Resisting Realities and Robotic Murder
Eleanor Shellstrop runs a fake afterlife, in which she conducts an experiment to prove that humans with low ethical sensibility can improve themselves. One of the subjects, Simone, is in deep denial upon arriving in this afterlife, and does as she pleases after convincing herself that nothing is real. Elsewhere, another conductor of the experiment, Jason, kills a robot which has been taunting him since the advent of the experiment.
What are the pros and cons of solipsism as a philosophy? Does it pose a danger of making us act immorally? How can we apply the risk of solipsism to technology such as virtual reality, a space where we know nothing is real except our own feelings and perceptions? Should virtual reality have ethical rules to prevent solipsism from brewing in it? Could that leak into our daily lives as well?
Is it ethical for humans to kill AI beings in fits of negative emotions, such as jealousy? Should this be able to happen on a whim? Should humans have total control of whether AI beings live or die?
-
- 7 min
- Kinolab
- 2016
Relationships and Escapism with AI
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. Dolores, one of these hosts, begins to fall in love with William, a human visitor, and he reciprocates those feelings as he expresses his unhappiness with a planned marriage waiting for him in the real world outside the park. After Dolores is initially angry, she nonetheless rejoins forces with William to search for a place beyond the theme-park Western reality that she has always known.
Is William’s love for Dolores ‘true’ love, or is it impossible for a human to truly love an AI and vice versa? If AI are programmed to feel emotions, can their love be equally as real as human love? What issues may arise if robots become a means through which humans escape their real life problems and complicated relationships? What are the potential consequences for both robots and people if robots escape the scenario for which they were specifically engineered, and try to live a life in the real world? Should this be allowed?
-
- 14 min
- Kinolab
- 2016
AI Memories and Self-Identification
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. Bernard, an engineer at the park, recently oversaw an update to add “reveries,” or slight fake memories, into the coding of the robots to make them seem more human. However, members of the board overseeing the park demonstrate that these reveries can sometimes lead robots to remember and “hold grudges” even after they have been asked to erase their own memory, something that can lead to violent tendencies. Later, as Bernard and Theresa snoop on Ford, the director of the park, they learn shocking information, and a robot once again becomes a violent tool as Ford murders Theresa.
Is ‘memory’ uniquely human? What is the role of memory in creating advanced AI consciousness? Does memory of trauma and suffering ultimately create AI that are hostile to humans? Even if we had the technological means to give AI emotions and memory, should we? And if we do, what ethics and morals must we follow to prevent traumatic memory, such as uploading memories of a fake dead son into Bernard? How can androids that are programmed to follow the directions of one person be used for violent ends? If robots are programmed not to hurt humans, how are they supposed to protect themselves from bad actors, especially if they believe themselves human? Should humans create humanoid replicant robots that do not possess any inherently negative human traits, such as anxiety?
-
- 8 min
- Kinolab
- 2016
Maeve Part III: Robot Resistance and Empowerment
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. One of these hosts, Maeve, is programmed to be a prostitute who runs the same narrative every single day with the same personality. After several instances of becoming conscious of her previous iterations, Maeve is told by Lutz, a worker in the Westworld lab, that she is a robot whose design and thoughts are mostly determined by humans, despite the fact that she feels and appears similar to humans such as Lutz. Lutz helps Maeve in her resistance against tyrannical rule over robots by altering her core code, granting her capabilities previously unavailable to other hosts, such as the ability to harm humans and to control other robotic hosts.
Should robots be given a fighting chance to resemble humans, especially in fighting for their own autonomy? Should robots ever be left in charge of other robots? How could this promote a tribalism that is dangerous to humans? Can robots develop their own personality, or does everything simply come down to coding, and which way is “better”?
-
- 3 min
- Kinolab
- 2016
Robot Consciousness
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. Bernard, a humanoid robot who previously believed himself to be a regular human, questions his maker, Ford, on what makes him different from humans, to which Ford replies that the line is very thin and arbitrary.
Why do humans cling to ‘consciousness’ as the thing that separates us from advanced machines? Is consciousness real or imagined, and if it is constructed in the mind, can it be replicated in AI’s ‘mind programming’? Would that be a same or different kind of consciousness? Should robots be given the capability for consciousness or self-actualization if that leads to tangible pain, for example in the form of a tragic backstory? If robots are to have consciousness, do they need to be able to essentially act like a human in every other way?
-
- 3 min
- Kinolab
- 2019
Digital Cloning
In an imagined future of London, citizens all across the globe are connected to the Feed, a device and network accessed constantly through a brain-computer interface. Tom, the son of the Feed’s creator Lawrence, realizes that his best friend Max is a robot of sorts, posing as a human. In reality, the body in the tub is a host which contains the digital consciousness of Max, formerly uploaded to a cloud through his feed and then downloaded into this new body. The new version of Max debates with Tom about why he should be considered a true human being.
If brain-computer interfaces that collect millions of data points from each person, including memories, could extend the life of anyone who dies prematurely, would that be worth the cost? Is this a true “life”? Whose viewpoint do you agree with more in this narrative?