Human Control of Technology (67)
Find narratives by ethical themes or by technologies.
Humanity and Consciousness of Humanoid Robots
- 9 min
- Kinolab
- 2016
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. Bernard, an engineer at Westworld, performs a diagnostic test on the robotic host Dolores, in which she analyses a passage from Alice in Wonderland before asking Bernard about his son. Later, the park director, Dr. Ford, tells Bernard of his former partner Arnold, who wished to program consciousness into the robots, essentially using their code as a means to spur their own thoughts. Arnold’s plan was ultimately abandoned so that the hosts could continue to perform their narratives in service of the guests and their desires.
Can humans and AI develop significant, intimate relationships? If AI can learn the rules of human interaction indistinguishably well, does it matter if we can’t tell the difference between AI and human? What does it mean to have ‘consciousness’? Is programmed consciousness significantly different from biological consciousness? Is it morally acceptable to create humanoid robots that are denied consciousness, all in the name of making humans feel more powerful?
Expendability vs. Emotional Connection in Humanoid Robots
- 5 min
- Kinolab
- 2016
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. One of the robotic hosts, Dolores, recounts to the human guest William a memory of old times with her father and her attachment to lost cattle throughout the years. Directly afterward, she has a flashback to a former iteration of herself, which was killed in another narrative before being restored by the lab team. Later, in the narrative that William and his sadistic future brother-in-law Logan follow, Logan reveals his darker nature by shooting one of the robots and telling Dolores that she herself is a robot, a choice that disgusts William. Logan argues that his actions do not morally matter because this is a fake world full of fake people.
Can AI be programmed to feel like it is ‘human’? If AI can form attachments to things or people through programming, is that attachment considered “real”? What are the ethical questions involved in how humans treat advanced AI? Does human morality apply to ‘virtual’ experiences and games? Do human actions in digital realms, or against digital beings, reveal something about their humanity? Is it important to program consequences into digital environments, even if they come at the expense of total “freedom” for users?
Maeve Part II: Robot Consciousness and Parameters of Robotic Life
- 8 min
- Kinolab
- 2016
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. One of these hosts, Maeve, is programmed to be a prostitute who runs the same narrative every single day with the same personality. After several instances of becoming conscious of her previous iterations, Maeve is told by Lutz, a worker in the Westworld lab, that she is a robot whose design and thoughts are mostly determined by humans, despite the fact that she feels and appears similar to humans such as Lutz. Once Lutz restores Maeve, she asks to be shown the “upstairs” where the robots are created to follow certain roles in the false reality of Westworld to immerse the real human guests. After seeing a trailer for the park, she begins to question the authenticity of her life. For more context, see the Maeve Part I narrative.
What should the relationship be like between advanced AI and their human creators? Can advanced AI be considered independent agents? Are human thoughts any more abstract or improvised than the visualisation of Maeve’s memories? What is the fundamental difference between being born and being made? Should AI/robots be able to ‘know’ about their own creation and existence? Should robots have the ability to “live without limits” as humans do, and can they even be programmed in such a way?
Digital Memory, Stored Interactions, and the Inability to Forget
- 12 min
- Kinolab
- 2011
In the 2050s, humans are able to connect their brains to an implanted digital device known as a “grain,” which stores all of their individual audiovisual memories and allows for instant replays or closer analysis of any stored memories. Liam Foxwell, one such user, discusses these devices with some friends at dinner, and later uses the data collected at this party to scrutinize his wife’s interactions with Jonas, a crude man who uses the grain for contemptible purposes. With these memories, he confronts his wife and demands objective truth from her.
What are the consequences of combining the fallibility of human memory with the precision of digital technology? How does over-analysis of digitally stored memories or interactions lead to anxiety or conflict in the real world? What are the dangers of placing our personal memories into a context where they can be stolen, hacked, or sold? In the digital age, is anyone truly able to forget anything? How are human judgement and agency affected by digital memory?
Repetitive Code as a Menial Laborer
- 7 min
- Kinolab
- 2014
Matt tells Joe Potter about how he used to train uploaded consciousnesses to take care of people’s homes. After somebody’s brain is copied and uploaded onto a “cookie,” the copy is often unwilling to perform the menial tasks asked of it. However, once the consciousness is inside the cookie, time can be manipulated however the real people see fit in order to coerce cooperation from the coded digital consciousness.
Can we upload consciousness in order to make our lives easier? How do we ethically treat a digital consciousness? How can digital beings be put to good use in our lives? As AI potentially becomes more humanoid, is it justifiable to continue assigning it long, repetitive tasks?
Lacie Part II: Everyday Influencers and “Keep Instagram Casual”
- 16 min
- Kinolab
- 2016
In a world in which social media is constantly visible, and in which each person’s averaged five-star rating, based on every single one of their interactions with others, is displayed, Lacie tries to move into the higher echelons of society. She does this by consistently keeping up saccharine appearances in real life and on her social media feed, because everyone is constantly connected to this technology. En route to an important wedding, she loses several points in her rating, yet still finds solace with a truck driver who offers her a ride. After releasing her true emotions at the wedding (from which she was ultimately disinvited for her low score), she is jailed and continues to release her pent-up emotions. For further reading and real-life connections, see the narrative “Inside China’s Vast New Experiment in Social Ranking.”
Are shallow interactions, and the improbability of someone saying what they truly mean on a platform, inherent to the design of digital social networks? How does social media put pressure on people to change events or relationships in their real lives to keep up positive appearances? Consider movements such as “Keep Instagram Casual,” which implore users to post whatever they like, whenever they like, rather than being beholden to strict societal rules about what is acceptable to post. Can this shift occur through a user-centric push, or does something about the design of the platforms need to change? Does “digital niceness” actually benefit anyone? How do figures such as influencers take advantage of digital platforms to set social norms online and offline?