Film Clip (143)
Find narratives by ethical themes or by technologies.
Digital Media and the Commodification of Women’s Bodies
- 12 min
- Kinolab
- 2011
In this episode, Bing Madsen is one of many citizens who power the digital world by spending each day on a stationary bike, which earns him “merits” to spend on both leisure activities and necessities. These laborers, along with all other classes, are constantly surrounded by screens in which their digital avatars can participate in virtual activities such as biking on a road or sitting in a “live” studio audience. The reality competition show “Hot Shot” is one program streamed on these screens. In this narrative, Bing’s friend Abi auditions for the show as a singer, but is instead coerced by the mass audience into signing on as a porn star for a company owned by one of the judges.
In what ways have digital technologies made sexual harassment worse, and how can this be solved? How does being digitally separated from others make people less empathetic and push them to engage in mob mentality? How do digital technologies like social networks sometimes deprive people of autonomy over their bodies? How can users of digital technologies ensure that any sexually explicit content they produce remains in their ownership and control? What problems arise from the growth of the pornography industry in the digital age?
Digital Memory, Stored Interactions, and the Inability to Forget
- 12 min
- Kinolab
- 2011
In the 2050s, humans are able to connect their brains to an implanted digital device known as a “grain,” which stores all of their individual audiovisual memories and allows for instant replay or closer analysis of any stored memory. Liam Foxwell, one such user, discusses these devices with friends at a dinner party, and later uses the recordings from that party to scrutinize his wife’s interactions with Jonas, a crude man who uses the grain for contemptible purposes. With these memories, he confronts his wife and demands the objective truth from her.
What are the consequences of combining the fallibility of human memory with the precision of digital technology? How does over-analysis of digitally stored memories or interactions lead to anxiety or conflict in the real world? What are the dangers of placing our personal memories in a context where they can be stolen, hacked, or sold? In the digital age, is anyone truly able to forget anything? How are human judgement and agency affected by digital memory?
Martha and Ash Part II: Digital Revival and Human Likeness in Hardware
- 9 min
- Kinolab
- 2013
At some point in the near future, Martha’s husband Ash dies in a car accident. To help Martha through the grieving process, her friend Sara gives Ash’s data to a company that creates an artificial intelligence program to simulate text and phone conversations between Martha and Ash. Eventually, this program is uploaded into a robot with the exact likeness of the deceased Ash. Unsettled by the humanoid robot and its imprecise imitation of Ash’s personality, Martha wants nothing more than to keep it out of her sight.
How can memories be kept pure when robots are able to impersonate deceased loved ones? If programs and robots such as this can be created, do we truly own our own existence? How can artificial intelligence fail as therapy or companionship? Can artificial intelligence and robotics comfort people who never even met the deceased? How should an artificial companion be handled by its administrator? Can an animated or robotic likeness of a person, one that seemingly has feelings, be relegated to the attic as easily as other mementos?
Maeve Part II: Robot Consciousness and Parameters of Robotic Life
- 8 min
- Kinolab
- 2016
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts,” which are designed in a lab and constantly updated to seem as real and organic as possible. One of these hosts, Maeve, is programmed to be a prostitute who runs the same narrative every single day with the same personality. After several instances of becoming conscious of her previous iterations, Maeve is told by Lutz, a worker in the Westworld lab, that she is a robot whose design and thoughts are largely determined by humans, despite the fact that she feels and appears similar to humans such as Lutz. Once Lutz restores Maeve, she asks to be shown the “upstairs,” where hosts are created and assigned roles in the false reality of Westworld that immerses the real human guests. After seeing a trailer for the park, she begins to question the authenticity of her life. For more context, see the Maeve Part I narrative.
What should the relationship be between advanced AI and their human creators? Can advanced AI be considered independent agents? Are human thoughts any more abstract or improvised than the visualisation of Maeve’s memories? What is the fundamental difference between being born and being made? Should AI and robots be able to “know” about their own creation and existence? Should robots have the ability to “live without limits” as humans can, and can they even be programmed in such a way?
Politics and Digital Mouthpieces
- 11 min
- Kinolab
- 2013
Waldo, a CGI bear, is created using computer technology that captures the facial expressions of a comedian and renders them onto a screen in real time. He is able to insult politicians with little retribution, perhaps in part because he does not appear human. Executives harness this power to put Waldo up as a candidate in a political race, where he takes part in a debate with real people and does not seem beholden to the same standards. Eventually, Waldo’s “driver” Jamie reveals his own identity, but Waldo continues on as a public figure, voiced by another worker in the company.
How do digital media, specifically social media platforms, allow critical and political voices to hide behind a wall of anonymity? Can digital abstractions of real political figures be considered to fully embody the person or candidate themselves, especially when other staffers usually run their accounts? How do digital platforms change the nature of relationships between politicians and citizens in terms of direct communication? Does hiding behind digital platforms make it easier for anyone to make bold claims and statements?
Vicarious Digital Living
- 5 min
- Kinolab
- 2014
In this vignette, Matt describes his backstory as a member of an online community whose members used a technology called “Z-eyes” to walk each other through activities such as flirting with women at bars. The Z-eyes technology streams all of the audiovisual data that his friend Harry experiences directly to Matt’s screen, and Matt can additionally use facial recognition and information searches to offer background information that enhances Harry’s advances.
What privacy problems arise from technology such as this being invisible? Could it have legitimate therapeutic purposes, such as treating social anxiety? How should technology like Z-eyes be regulated?