Possibility of technologies such as AI developing human emotions and questions of AI rights
AI Emotions and Rights (37)
- 8 min
- Kinolab
- 2016
Maeve Part I: Sex, Sentience, and Subservience of Humanoid Entertainment Robots
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts,” which are designed in a lab and constantly updated to seem as real and organic as possible. One of these hosts, Maeve, is programmed to be a prostitute who runs the same narrative every single day with the same personality. However, during one loop, she has a “flashback” to a prior memory from a time when her programming contained a different narrative role. After the flashback glitch, Maeve is taken to the lab and reprogrammed before being returned to her role as a prostitute to fulfill the desires of the guests of the park. During this reprogramming, it is revealed that robots can conceptualise dreams and nightmares, cobbled together from old memories. During a maintenance check, Maeve is accidentally left on, and she escapes the operating room to discover the lab outside of Westworld and the other robots.
Could AI develop a subconscious, or should it be limited to doing only what humans instruct it to do? What if robots developed a sense of repulsion toward the way humans treat them? How does the nature of ‘sex’ change when robots are used instead of humans? What are the ethical issues of creating a humanoid robot that is designed to give consent regardless of treatment? Can AI contemplate their own mortality? At that point, do they become ‘living beings’? If robots are only programmed for certain scenarios or certain personalities, what happens if they break free of that false reality?
-
- 9 min
- Kinolab
- 2016
Humanity and Consciousness of Humanoid Robots
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. Bernard, an engineer at Westworld, performs a diagnostic test on the robotic host Dolores, in which she analyses a passage from Alice in Wonderland before asking Bernard about his son. Later, the park director, Dr. Ford, tells Bernard of his former partner Arnold, who wished to program consciousness into the robots, essentially using their code as a means to spur their own thoughts. This was ultimately decided against so that the hosts could perform their narratives for the service of the guests and their desires.
Can humans and AI develop significant, intimate relationships? If AI can learn the rules of human interaction indistinguishably well, does it matter that we can’t tell the difference between AI and human? What does it mean to have ‘consciousness’? Is programmed consciousness significantly different from biological consciousness? Is it morally acceptable to create humanoid robots denied any chance at consciousness, all in the name of making humans feel more powerful?
-
- 5 min
- Kinolab
- 2016
Expendability vs. Emotional Connection in Humanoid Robots
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. One of the robotic hosts, Dolores, recounts to the human guest William a memory of old times with her father and her attachment to lost cattle throughout the years. Directly afterward, she has a flashback to a former iteration of herself, which was killed in another narrative before being restored by the lab team. Later on in the narrative which William and his sadistic future brother-in-law Logan follow, Logan reveals his darker nature by shooting one of the robots and telling Dolores that she is a robot, a choice which disgusts William. Logan argues that his actions do not morally matter because this is a fake world full of fake people.
Can AI be programmed to feel ‘human’? If AI can form attachments to things or people through programming, is that attachment “real”? What are the ethical questions involved in how humans treat advanced AI? Does human morality apply to ‘virtual’ experiences and games? Do human actions in digital realms, or against digital beings, reveal something about their humanity? Is it important to program consequences into digital environments, even if they come at the expense of total “freedom” for users?
-
- 8 min
- Kinolab
- 2016
Maeve Part II: Robot Consciousness and Parameters of Robotic Life
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts,” which are designed in a lab and constantly updated to seem as real and organic as possible. One of these hosts, Maeve, is programmed to be a prostitute who runs the same narrative every single day with the same personality. After several incidents of becoming conscious of her previous iterations, Maeve is told by Lutz, a worker in the Westworld lab, that she is a robot whose design and thoughts are mostly determined by humans, despite the fact that she feels and appears similar to humans such as Lutz. Once Lutz restores Maeve, she asks to be shown the “upstairs” where the robots are created to follow certain roles in the false reality of Westworld to immerse the real human guests. After seeing a trailer for the park, she begins to question the authenticity of her life. For more context, see the Maeve Part I narrative.
What should the relationship be like between advanced AI and their human creators? Can advanced AI be considered independent agents? Are human thoughts any more abstract or improvised than the visualisation of Maeve’s memories? What is the fundamental difference between being born and being made? Should AI/robots be able to ‘know’ about their own creation and existence? Should robots have the ability to “live without limits” like humans can, and do they even have the capability to be programmed in such a way?
-
- 7 min
- Kinolab
- 2014
Repetitive Code as a Menial Laborer
Matt tells Joe Potter about how he used to train uploaded consciousnesses to take care of people’s homes. After somebody’s brain is copied and uploaded onto a “cookie,” the copy is often unwilling to perform the menial tasks asked of it. However, once the consciousness is inside the cookie, the real people can manipulate its experience of time however they see fit in order to coerce cooperation from the coded digital consciousness.
Can we upload consciousness in order to make our lives easier? How do we treat a digital consciousness ethically? How can digital beings be put to good use in our lives? As AI becomes more humanoid, is it justifiable to continue assigning it long, repetitive tasks?
-
- 6 min
- Kinolab
- 2017
Virtual Vindictiveness and Simulated Clones Part I: Daly and Walton
Robert Daly is a programmer at the company Callister, which developed the immersive virtual reality game Infinity and its community for the entertainment of users. Daly is typically seen in the shadow of the company’s charismatic co-founder, James Walton. Unbeknownst to anyone else, Daly possesses a personal modification of the Infinity game in which he can upload sentient digital clones of his co-workers and take out his frustrations upon them, as he does with Walton in this narrative.
What should the ethical boundaries be on creating digital copies of real-life people to manipulate in virtual realities? How would this alter perceptions of autonomy or entitlement? Should the capability to create exact digital likenesses of real people exist for any reason? If so, how should those copies’ autonomy be ensured, given that they are technically pieces of programming? Are digital copies of a person entitled to the same rights as their corporeal selves?