Ways in which technologies may bring different types of leisure experiences to a larger audience
Technology-Based Entertainment and Leisure (34)
Find narratives by ethical themes or by technologies.
- 5 min
- Kinolab
- 2016
Expendability vs. Emotional Connection in Humanoid Robots
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts,” which are designed in a lab and constantly updated to seem as real and organic as possible. One of the hosts, Dolores, recounts to the human guest William a memory of old times with her father and her attachment to cattle lost over the years. Directly afterward, she has a flashback to a former iteration of herself, which was killed in another narrative before being restored by the lab team. Later in the narrative that William follows with his sadistic future brother-in-law Logan, Logan reveals his darker nature by shooting one of the robots and telling Dolores that she is a robot, a choice that disgusts William. Logan argues that his actions do not morally matter because this is a fake world full of fake people.
Can AI be programmed to feel like it is ‘human’? If AI can form attachments to things or people through programming, is that attachment considered “real”? What are the ethical questions involved with how humans treat advanced AI? Does human morality apply to ‘virtual’ experiences/games? Do human actions in digital realms/against digital beings reveal something about their humanity? Is it important to program consequences into digital environments, even if they come at the expense of total “freedom” for users?
-
- 12 min
- Kinolab
- 2016
Personalized and Occupational Dangers of Digital Realities
Cooper, a world traveller whose father recently died of Alzheimer’s disease, is paid to play-test a virtual reality game in which a brain-computer interface is inserted through his neck to place his consciousness into a horror scenario where he is plagued by his deepest fears. After several terrifying vignettes, he begins to lose all of his memories, mirroring his ultimate fear of succumbing to Alzheimer’s like his father and continuing to ignore or forget his mother. He then appears to be rescued by the game’s managers, but the truth of his real-life situation is later revealed to be far more gruesome.
Is it ethical to use human subjects to test digital games or realities involving personal psychological processes? What might some alternatives be? How can the safety of subjects be ensured? How can scientists ensure brain-computer interfaces are safe before trying them out on human brains? Can this ever be done ethically? How could technology which mines one’s deepest psychological fears be used or abused outside of entertainment purposes?
-
- 4 min
- Kinolab
- 2019
Digital Withdrawal
In an imagined future of London, citizens all across the globe are connected to the Feed, a device and network accessed constantly through a brain-computer interface. Danny is a teenager who has become so addicted to the cacophony of entertainment coming through the Feed that he is unable to interact with people in the real world once everything in his Feed is turned off.
What are the potential consequences of getting teenagers addicted to virtual ways of interacting with content and with each other? How might this impact their ability to relate to other people in the real world? How do brain-computer interfaces which give constant, unbridled access to such entertainment and social networks exacerbate this problem? Will it become necessary in the future to “re-teach” young people how to interact offline?
-
- 8 min
- Kinolab
- 2016
Social Media Trends and Hive Mind Justice
In this extreme imagination of social media, detectives Karin Parke and Blue Coulson try to discover the connection between two recent deaths. They first interrogate a teacher who posted “#DeathTo @JoPowersWriter” along with a photo of controversial journalist Jo Powers on the day before Jo was found dead. The teacher discusses the popularity of this message and the hashtag, sharing that an entire online community split the cost of sending Jo a hateful message on a cake. Later, the detectives discover that these deaths were determined by bots and the trending of #DeathTo: whoever’s name had the most hits under the hashtag was hunted down and killed by a mysterious force.
How does this relate to the phenomenon of “cancel culture” in the real world? How can buzzwords commonly used online translate poorly into real life? How can digital social media be re-imagined so that users are less susceptible to “trends” started by bots? Is there a possibility that social media might give too much power or too high of a platform to the general population?
-
- 6 min
- Kinolab
- 2017
Virtual Vindictiveness and Simulated Clones Part I: Daly and Walton
Robert Daly is a programmer at the company Callister, which developed the immersive virtual reality game Infinity and its community for the entertainment of users. Daly is typically seen in the shadow of the company’s charismatic co-founder, James Walton. Unbeknownst to anyone else, Daly possesses a personal modification of the Infinity game, in which he can upload sentient digital clones of his co-workers and take out his frustrations on them, as he does with Walton in this narrative.
What should the ethical boundaries be in terms of creating digital copies of real-life people to manipulate in virtual realities? How would this alter the perception of autonomy or entitlement? Should the capability to create exact digital likenesses of real people be created for any reason? If so, how should their autonomy be ensured, since they are technically a piece of programming? Are digital copies of a person entitled to the same rights that their corporeal selves have?
-
- 15 min
- Kinolab
- 2017
Online Dating Algorithms
In a world in which the program Coach determines the pairing and duration of romantic matches, Frank and Amy are matched more than once and eventually fall in love after failed matches with other people. After Frank breaks a promise to Amy by checking the expiry date that is automatically assigned to all relationships, they temporarily break up. After a reunion, they set out to discover the truth of their reality and the meaning of their match.
Should machine learning algorithms, even the most sophisticated ones, be trusted when it comes to deeply emotional matters like love? Can simulations and algorithms account for everything when it comes to a person’s experience of love? How could algorithmic bias which is present in real-life matching programs enter the virtual reality system shown here? How can advanced simulations be distinguished from reality? Has the digital age moved the dating experience firmly past the “old days” of falling in love, and should this be embraced?