Ways in which technologies may bring different types of leisure experiences to a larger audience
Technology-Based Entertainment and Leisure (33)
Find narratives by ethical themes or by technologies.
- 12 min
- Kinolab
- 2016
Personalized and Occupational Dangers of Digital Realities
Cooper, a world traveller whose father recently died of Alzheimer’s disease, is paid to play-test a virtual reality game in which a brain-computer interface, inserted through his neck, places his consciousness into a horror scenario built from his deepest fears. After several terrifying vignettes, he begins to lose all of his memories, mirroring his ultimate fear of succumbing to Alzheimer’s like his father and of continuing to ignore or forget his mother. He then appears to be rescued by the game’s managers, but the truth of his real-life situation is later revealed to be far more gruesome.
Is it ethical to use human subjects to test digital games or realities involving personal psychological processes? What might some alternatives be? How can the safety of subjects be ensured? How can scientists ensure brain-computer interfaces are safe before trying them on human brains? Can this ever be done ethically? How could technology that mines one’s deepest psychological fears be used or abused outside of entertainment?
-
- 13 min
- Kinolab
- 2013
Fascination and Desensitization through Digital Technologies
In this episode, Victoria wakes up with no memory of who she is in a post-apocalyptic scenario. She is chased and hunted by weapon-toting masked figures and gets no help from bystanders, who instead record her horrific struggle for survival on their smartphones. Eventually, it is revealed that this scenario is an engineered reality. While the digital technologies present here are limited, the narrative stands as an effective metaphor for studying the phenomenon of “cancel culture” and other ways in which digital technologies strip away the humanity of others.
Do smartphones and their recording capabilities make people less sensitive to the events or phenomena they capture? How do digital news channels and platforms sensationalize criminals and other wrongdoers, inspiring collective hatred? How can digital technologies be designed to be more empathetic? Why is it so easy to criticize others over digital channels?
-
- 5 min
- Kinolab
- 2016
Expendability vs. Emotional Connection in Humanoid Robots
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. One of the hosts, Dolores, recounts to the human guest William a memory of old times with her father and her attachment over the years to lost cattle. Directly afterward, she has a flashback to a former iteration of herself, which was killed in another narrative before being restored by the lab team. Later, in the narrative that William and his sadistic future brother-in-law Logan are following, Logan reveals his darker nature by shooting one of the hosts and telling Dolores that she is a robot, a choice which disgusts William. Logan argues that his actions do not morally matter because this is a fake world full of fake people.
Can AI be programmed to feel like it is “human”? If AI can form attachments to things or people through programming, is that attachment considered “real”? What are the ethical questions involved with how humans treat advanced AI? Does human morality apply to “virtual” experiences and games? Do human actions in digital realms, or against digital beings, reveal something about their humanity? Is it important to program consequences into digital environments, even if they come at the expense of total “freedom” for users?
-
- 8 min
- Kinolab
- 2016
Maeve Part I: Sex, Sentience, and Subservience of Humanoid Entertainment Robots
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. One of these hosts, Maeve, is programmed to be a prostitute who runs the same narrative every single day with the same personality. However, during one loop, she has a “flashback” to a prior memory from a time when her programming contained a different narrative role. After the flashback glitch, Maeve is taken to the lab and reprogrammed before being returned to her role as a prostitute to fulfill the desires of the guests of the park. During this reprogramming, it is revealed that robots can conceptualise dreams and nightmares, cobbled together from old memories. During a maintenance check, Maeve is accidentally left on and escapes the operating room to discover the lab outside of Westworld and the other robots.
Could AI develop a subconscious, or should it be limited to doing only what humans instruct it to do? What if robots did gain a sense of repulsion toward the way humans treat them? How does the nature of sex change when robots are used instead of humans? What are the ethical issues of creating a humanoid robot designed to give consent regardless of treatment? Can AI contemplate its own mortality? At that point, do such robots become living beings? If robots are programmed only for certain scenarios or certain personalities, what happens if they break free of that false reality?
-
- 15 min
- Kinolab
- 2016
Repetitious Robots and Programming for Human Pleasure
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts” that are designed in a lab and constantly updated to seem as real and organic as possible. These robots play out scripted “narratives” day after day with the guests of the park, their memories erased on each cycle, allowing customers to fulfill any dark desire they wish with the hosts. After a new update to the androids, one modeled after a sheriff malfunctions in front of customers while playing out his narrative, sparking a debate in the lab over whether keeping the robots functional is worth the inconvenience to the guests. Lee and Theresa, two workers from the lab, eventually discuss whether the updates should continue to make the robots more and more human. Lee is especially skeptical of how the profit motive drives the innovation further and further toward incorporating completely human-seeming robots into the fantasies of high-paying guests. When the lab team inspects the decommissioned host Peter Abernathy, he displays a deep and uncharacteristic concern for his daughter Dolores before going off script and speaking of his contempt for his creators. The lab team dismisses this as parts of his old programming resurfacing in a combination that made the emotions seem more realistic.
How can reality be “mixed” using AI and robotics for recreational purposes? What might be some consequences of tailoring lifelike robots solely to human desires? What are the implications of programming humanoid robots to perform repetitious tasks tirelessly without a break? How does this differ from lines of code that direct our computers to complete one task in a loop? How “real” should a realistic robot be? Should a robot that is “too realistic” scare or worry us? What determines the “realness” of AI emotions? Is it all just programming, or is there a point at which the emotions become indistinguishable from human emotions?
-
- 4 min
- Kinolab
- 2018
Digitizing the Fictional and Copyright Claims
Wade Watts lives in an imagined future in which the OASIS, a limitless virtual reality world, acts as a constant distraction from the real world for the majority of citizens. In this scene, his virtual avatar and those of his team search for a MacGuffin item in a digitally rendered recreation of the film The Shining, with features of the film such as the twin girls making an appearance.
How can pre-existing media, like film, be made “new” through deepfake and VR technology? What copyright issues arise? If digital technology can render fictional settings accessible to everyone, who truly “owns” a space or story? Do narrative artists lose creative power if people can digitally insert themselves into the worlds of a story?