Mixed Reality (25)
Find narratives by ethical themes or by technologies.
Technological Immersion, Digital Underclasses, and Attention Economies
- 13 min
- Kinolab
- 2011

In this episode, Bing Madsen is one of many citizens who power the digital world by spending each day on a stationary bike, which earns him “merits” to spend on both leisure activities and necessities. These laborers, along with all other classes, are constantly surrounded by screens on which their digital avatars can participate in virtual activities, such as biking on a road or sitting in a “live” studio audience. The reality competition show “Hot Shot” is one program streamed on these screens. In this narrative, Bing conspires to grab the world’s attention on stage, proclaiming that the entire digital world is fake and has brainwashed the laborers into providing power while the upper classes enjoy more leisure. This eventually lands him his own talk show, where he recreates his suicide threats as sensational content in exchange for a more lucrative lifestyle.

How can technology be used to pacify the masses? What connections can you draw between the society depicted here and the way that social media and other digital companies use data to make profits? How can digital technologies become a breeding ground for sensational content, and can this problem be fixed? Can anyone be “unplugged” and successful in our reality? How do internet communities commodify authenticity?
-
Technology and the Tangibility of Human Memory
- 4 min
- Kinolab
- 2016

Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. Bernard, a worker at the park, stumbles across robotic hosts that are monitored and directed solely by the park’s director, Dr. Ford. Ford reveals that these robots are an enactment of nostalgic memories from his younger days that he wishes to preserve, especially since they were built by an old friend of his. The narrative relates to digital memory and the delaying of forgetting.

How can advanced technology and robotics be used to preserve human memories? Should we be able to re-create the past using such software? How do we value humans and memories if they can easily be replaced with robotic or digital versions?
-
Maeve Part II: Robot Consciousness and Parameters of Robotic Life
- 8 min
- Kinolab
- 2016

Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. One of these hosts, Maeve, is programmed to be a prostitute who runs the same narrative every single day with the same personality. After several instances of becoming conscious of her previous iterations, Maeve is told by Lutz, a worker in the Westworld lab, that she is a robot whose design and thoughts are mostly determined by humans, despite the fact that she feels and appears similar to humans such as Lutz. Once Lutz restores Maeve, she asks to be shown the “upstairs” where the robots are created and assigned roles in the false reality of Westworld, designed to immerse the real human guests. After seeing a trailer for the park, she begins to question the authenticity of her life. For more context, see the Maeve Part I narrative.

What should the relationship be like between advanced AI and their human creators? Can advanced AI be considered independent agents? Are human thoughts any more abstract or improvised than the visualisation of Maeve’s memories? What is the fundamental difference between being born and being made? Should AI/robots be able to “know” about their own creation and existence? Should robots have the ability to “live without limits” as humans can, and can they even be programmed in such a way?
-
Humanity and Consciousness of Humanoid Robots
- 9 min
- Kinolab
- 2016

Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. Bernard, an engineer at Westworld, performs a diagnostic test on the robotic host Dolores, in which she analyses a passage from Alice in Wonderland before asking Bernard about his son. Later, the park director, Dr. Ford, tells Bernard of his former partner Arnold, who wished to program consciousness into the robots, essentially using their code as a means to spur their own thoughts. This plan was ultimately rejected so that the hosts could continue performing their narratives in service of the guests and their desires.

Can humans and AI develop significant, intimate relationships? If AI can learn the rules of human interaction indistinguishably well, does it matter if we can’t tell the difference between AI and human? What does it mean to have “consciousness”? Is programmed consciousness significantly different from biological consciousness? Is it morally acceptable to create humanoid robots denied any choice of consciousness, all in the name of making humans feel more powerful?
-
Maeve Part I: Sex, Sentience, and Subservience of Humanoid Entertainment Robots
- 8 min
- Kinolab
- 2016

Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. One of these hosts, Maeve, is programmed to be a prostitute who runs the same narrative every single day with the same personality. However, during one loop, she has a “flashback” to a prior memory from a time when her programming contained a different narrative role. After the flashback glitch, Maeve is taken to the lab and reprogrammed before being returned to her role as a prostitute to fulfill the desires of the guests of the park. During this reprogramming, it is revealed that robots can conceptualise dreams and nightmares, cobbled together from old memories. During a maintenance check, Maeve is accidentally left on and escapes the operating room, discovering the lab outside of Westworld and the other robots.

Could AI develop a subconscious, or should they be limited to doing only what humans instruct them to do? What if robots developed a repulsion to the way humans treat them? How does the nature of “sex” change when robots are used instead of humans? What are the ethical issues of creating a humanoid robot designed to give consent regardless of treatment? Can AI contemplate their own mortality? At that point, do they become “living beings”? If robots are programmed only for certain scenarios or personalities, what happens if they break free of that false reality?
-
Repetitious Robots and Programming for Human Pleasure
- 15 min
- Kinolab
- 2016

Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. These robots play out scripted “narratives” day after day with the guests of the park, their memories erased on each cycle, allowing customers to fulfill any dark desire they wish with the hosts. After a new update to the androids, one modeled after a sheriff malfunctions in front of customers while playing out his narrative, sparking a debate in the lab over whether ensuring the robots are functional is worth the inconvenience to the guests. Lee and Theresa, two workers from the lab, eventually discuss whether the updates should continue to make the robots more and more human. Lee is especially skeptical of how the profit motive drives innovation further and further toward incorporating completely human robots into the fantasies of high payers. When the lab team inspects the decommissioned host Peter Abernathy, he displays a deep and uncharacteristic concern for his daughter Dolores before going off script and speaking of his contempt for his creators. The lab team dismisses this as parts of his old programming resurfacing in a combination that made the emotions seem more realistic.

How can reality be “mixed” using AI and robotics for recreational purposes? What might be the consequences of tailoring life-like robots solely to human desires? What are the implications of programming humanoid robots to perform repetitious tasks tirelessly without a break? How does this differ from lines of code that direct our computers to complete one task in a loop? How “real” should a realistic robot be? Should a robot being “too realistic” scare or worry us? What determines the “realness” of AI emotions? Is it all just programming, or is there a point at which the emotions become indistinguishable from human emotions?