Film Clip (143)
Find narratives by ethical themes or by technologies.
- 4 min
- Kinolab
- 2018
Digitizing the Fictional and Copyright Claims
Wade Watts lives in an imagined future in which the OASIS, a limitless virtual reality world, acts as a constant distraction from the real world for the majority of citizens. In this scene, the avatars of Wade and his team search for a MacGuffin item inside a digital recreation of the film The Shining, where features of the film, such as the twin girls, make an appearance.
How can pre-existing media, like film, be made “new” through deepfake and VR technology? What are the copyright issues that arise? If digital technology can render fictional settings accessible to everyone, who truly “owns” a space or story? Do narrative artists lose creative power if people can digitally insert themselves into the worlds of a story?
- 15 min
- Kinolab
- 2016
Repetitious Robots and Programming for Human Pleasure
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts,” which are designed in a lab and constantly updated to seem as real and organic as possible. These robots play out scripted “narratives” with the guests of the park day after day, their memories erased on each cycle, allowing customers to fulfill any dark desire they wish with the hosts. After a new update to the androids, a host modeled after a sheriff malfunctions in front of customers while playing out his narrative, sparking a debate in the lab over whether keeping the robots fully functional is worth the inconvenience to the guests. Lee and Theresa, two workers from the lab, later discuss whether the updates should continue to make the robots more and more human. Lee is especially skeptical of how the profit motive drives the technology further and further toward incorporating completely human-seeming robots into the fantasies of high-paying guests. When the lab team inspects the decommissioned host Peter Abernathy, he displays a deep and uncharacteristic concern for his daughter Dolores before going off script and voicing contempt for his creators. The lab team dismisses this as fragments of his old programming resurfacing in a combination that made the emotions seem more realistic.
How can reality be “mixed” using AI and robotics for recreational purposes? What might be some consequences of tailoring lifelike robots solely to human desires? What are the implications of programming humanoid robots to perform repetitious tasks tirelessly, without a break? How does this differ from lines of code that direct our computers to complete one task in a loop? How ‘real’ should a realistic robot be? Should a robot that is ‘too realistic’ scare or worry us? What determines the ‘realness’ of AI emotions? Is it all just programming, or is there a point at which the emotions become indistinguishable from human emotions?
- 8 min
- Kinolab
- 2016
Maeve Part I: Sex, Sentience, and Subservience of Humanoid Entertainment Robots
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts,” which are designed in a lab and constantly updated to seem as real and organic as possible. One of these hosts, Maeve, is programmed to be a prostitute who runs the same narrative every single day with the same personality. During one loop, however, she has a “flashback” to a memory from a time when her programming contained a different narrative role. After this glitch, Maeve is taken to the lab and reprogrammed before being returned to her role as a prostitute to fulfill the desires of the guests of the park. During this reprogramming, it is revealed that robots can conceptualize dreams and nightmares, cobbled together from old memories. During a maintenance check, Maeve is accidentally left on and escapes the operating room to discover the lab outside of Westworld and the other robots.
Could AI develop a subconscious, or should they be limited to doing only what humans instruct them to do? What if robots gained an awareness of, and a repulsion toward, the way in which humans treat them? How does the nature of ‘sex’ change when robots are used instead of humans? What are the ethical issues of creating a humanoid robot that is designed to give consent regardless of treatment? Can AI contemplate their own mortality? At that point, do they become ‘living beings’? If robots are only programmed for certain scenarios or certain personalities, what happens if they break free of that false reality?
- 9 min
- Kinolab
- 2016
Humanity and Consciousness of Humanoid Robots
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts,” which are designed in a lab and constantly updated to seem as real and organic as possible. Bernard, an engineer at Westworld, performs a diagnostic test on the robotic host Dolores, in which she analyzes a passage from Alice in Wonderland before asking Bernard about his son. Later, the park director, Dr. Ford, tells Bernard about his former partner Arnold, who wished to program consciousness into the robots, essentially using their code as a means to spur their own thoughts. The idea was ultimately abandoned so that the hosts could continue to perform their narratives in service of the guests and their desires.
Can humans and AI develop significant, intimate relationships? If AI can learn the rules of human interaction indistinguishably well, does it matter that we can’t tell the difference between AI and human? What does it mean to have ‘consciousness’? Is programmed consciousness significantly different from biological consciousness? Is it morally acceptable to create humanoid robots that are denied any possibility of consciousness, all in the name of making humans feel more powerful?
- 8 min
- Kinolab
- 2016
Lacie Part I: Translating Online Interactions and Social Quantification
In a world in which social media is constantly visible, and in which each person’s averaged five-star rating, based on every single one of their interactions with others, is displayed, Lacie tries to move into the higher echelons of society. She does this by consistently keeping up saccharine appearances in real life and on her social media feed, since everyone is constantly connected to this technology. Spurred to raise her rating, Lacie gets an invitation to a high-profile wedding. However, after a few unfortunate events leave her seeming less desirable to others, thus lowering her rating, she finds her world far less accessible and kind. For further reading and real-life connections, see the narrative “Inside China’s Vast New Experiment in Social Ranking.”
How do digital platforms promote inauthenticity? Why do appearances matter more in the digital age? Can digital technologies ever perfectly mirror an in-person interaction? Do the shallower ways in which people communicate online translate well into the real world? How could digital social platforms do better at promoting lasting connection instead of the instant gratification of likes or ratings? Should social media platforms be so focused on quantifying interactions in terms of likes, comments, or followers? How can this quantification be de-emphasized?
- 10 min
- Kinolab
- 2016
Cyber Blackmailing and Compromising Data
In this episode, Kenny’s life is upended after hackers use malware to access a compromising video of him on his laptop. Under the threat of this humiliating video being sent to everyone in his contacts, Kenny becomes a puppet of the hackers, forced to keep his location services on so that he can be tracked and contacted through his smartphone wherever he goes. Along with the hackers’ other puppets, including a man named Hector who had an affair, he is forced to commit heinous acts such as a bank robbery and a fight to the death. Despite their compliance, the hackers release the puppets’ information anyway, leading to severe consequences in their personal lives.
Is anyone truly “alone” or “unwatched” when in the presence of their mobile computing devices? Whose responsibility is it to guard people against the dangers witnessed in this narrative? Do digital technologies need clearer and more thorough warnings about the possibilities of malware infecting a device? How can mobile computing devices and location tracking be manipulated to deprive people of autonomy? Are small individual steps such as covering up cameras enough to guard against these types of problems?