Promotion of Human Values (142)
Find narratives by ethical themes or by technologies.
- 5 min
- Kinolab
- 2016
Expendability vs. Emotional Connection in Humanoid Robots
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts,” which are designed in a lab and constantly updated to seem as real and organic as possible. One of the hosts, Dolores, recounts to the human guest William a memory of old times with her father and of her attachment to the cattle she has lost over the years. Directly afterward, she has a flashback to a former iteration of herself, which was killed in another narrative before being restored by the lab team. Later in the storyline that William and his sadistic future brother-in-law Logan follow, Logan reveals his darker nature by shooting one of the hosts and telling Dolores that she is a robot, a choice that disgusts William. Logan argues that his actions do not morally matter because this is a fake world full of fake people.
Can AI be programmed to feel like it is ‘human’? If AI can form attachments to things or people through programming, is that attachment considered “real”? What are the ethical questions involved with how humans treat advanced AI? Does human morality apply to ‘virtual’ experiences/games? Do human actions in digital realms/against digital beings reveal something about their humanity? Is it important to program consequences into digital environments, even if they come at the expense of total “freedom” for users?
-
- 4 min
- Kinolab
- 2016
Technology and the Tangibility of Human Memory
Westworld, a western-themed amusement park, is populated by realistic robotic creatures known as “hosts,” which are designed in a lab and constantly updated to seem as real and organic as possible. Bernard, a worker at the park, stumbles across robotic hosts that are monitored and directed solely by the park’s director, Dr. Ford. Ford reveals that these robots are an enactment of nostalgic memories from his younger days that he wishes to preserve, especially since they were built by an old friend of his. This narrative relates to digital memory and the deliberate delaying of forgetting.
How can advanced technology and robotics be used to preserve human memories? Should we be able to re-create the past using such software? How do we value humans and memories if they can be easily replaced with robotic or digital versions?
-
- 15 min
- Kinolab
- 1997
Computational Genomics, Unnatural Selection, and Privilege
Vincent, an “invalid” from the near future, was born naturally and is therefore seen as lesser than people like his brother Anton, who was conceived through a computational genetic selection process meant to ensure that only the best traits are carried on. In this eugenic society, biometric technologies such as finger pricks and other scans are used to sort superior from inferior human genomes. Vincent, relegated to service jobs, assumes the identity of a genetically superior man named Eugene to pursue his goal of going on a space mission, yet is never able to let go of his sibling rivalry. In doing so, Vincent sets out to prove that humans engineered through this computational genomics project are not automatically superior to those naturally born.
How might computational genomics affect the construction of the workforce? Is it ethical to discriminate based on the quality of one’s genetic profile? Should the power of computers be used to help families partake in genetic selection for their children? How could bias enter supposedly beneficial uses of computational genomics? Are we past the point where it is possible to fool computers with faked genetic samples, as Vincent does here?
-
- 5 min
- Science Alert
- 2019
Newly Released Amazon Patent Shows Just How Much Creepier Alexa Can Get
A newly revealed patent application filed by Amazon raises privacy concerns over an upgrade to the virtual assistant Alexa, in which the system records everything a user says in rolling 10- to 30-second snippets and scans them for a command (a rough sketch of this rolling-buffer idea appears after the questions below).
Is having a virtual assistant that is constantly listening a form of “Big Brother”-like behavior? Can Amazon and other companies be trusted not to abuse this data? How much transparency would be needed to make this acceptable?
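For readers who want a concrete picture of the mechanism described in the patent, the sketch below is a minimal, hypothetical Python illustration of a rolling speech buffer that retains only the most recent 10–30 seconds of transcribed words and scans them for a command word. The class name, window length, and command list are assumptions made for illustration, not details taken from Amazon’s patent.

```python
import collections

# Hypothetical illustration (not Amazon's actual implementation): a rolling
# buffer keeps only the most recent 10-30 seconds of transcribed speech and
# is scanned for a command keyword.
BUFFER_SECONDS = 30
COMMAND_WORDS = {"play", "order", "remind", "call"}  # assumed examples


class RollingSpeechBuffer:
    def __init__(self, max_seconds: int = BUFFER_SECONDS):
        # Each entry is (timestamp_in_seconds, transcribed_word).
        self.words = collections.deque()
        self.max_seconds = max_seconds

    def add_word(self, timestamp: float, word: str) -> None:
        """Append a newly transcribed word and drop anything older than the window."""
        self.words.append((timestamp, word.lower()))
        while self.words and timestamp - self.words[0][0] > self.max_seconds:
            self.words.popleft()

    def find_command(self):
        """Return the buffered phrase if any retained word looks like a command."""
        for _, word in self.words:
            if word in COMMAND_WORDS:
                return " ".join(w for _, w in self.words)
        return None


if __name__ == "__main__":
    buffer = RollingSpeechBuffer()
    for t, w in enumerate(["could", "you", "play", "some", "jazz"]):
        buffer.add_word(float(t), w)
    print(buffer.find_command())  # -> "could you play some jazz"
```

Even in this toy form, the design choice is visible: whatever precedes the command word within the window is retained along with it, which is precisely the privacy concern the article raises.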
-
- 5 min
- Wired
- 2019
How Facial Recognition Is Fighting Child Sex Trafficking
Non-profit organizations such as Thorn and the Canadian Centre for Child Protection are repurposing existing software, particularly facial recognition algorithms, to fight child pornography and human trafficking on the dark web more proactively (a rough sketch of the matching step appears after the questions below).
How has technology facilitated underground illegal activities such as child trafficking? How has technology also facilitated fighting back against them? Where do you stand in the debate over whether law enforcement should have extensive access to facial recognition technology and machine learning algorithms?
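The matching step behind tools like these can be pictured as comparing face embeddings: a recognition model turns each image into a numeric vector, and a new image is searched against a database of vectors for known cases. The Python sketch below shows only that comparison step, with invented case IDs and an assumed similarity threshold; it is not the software that Thorn or the Canadian Centre for Child Protection actually use.

```python
import numpy as np

# Minimal sketch of the matching step behind facial recognition: images are
# reduced to numeric embeddings by a face-recognition model (not shown here),
# and a new image is compared against a database of embeddings for known
# cases. All names and thresholds are illustrative assumptions.


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two embedding vectors, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def find_matches(query_embedding: np.ndarray,
                 known_embeddings: dict[str, np.ndarray],
                 threshold: float = 0.8) -> list[str]:
    """Return the IDs of known faces whose embeddings resemble the query."""
    return [person_id
            for person_id, emb in known_embeddings.items()
            if cosine_similarity(query_embedding, emb) >= threshold]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    database = {"case-001": rng.normal(size=128), "case-002": rng.normal(size=128)}
    query = database["case-001"] + rng.normal(scale=0.05, size=128)  # a near-duplicate image
    print(find_matches(query, database))  # -> ['case-001']
```

The threshold trades false matches against missed matches, which is one reason broad law-enforcement access to such tools is debated.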
-
- 7 min
- Kinolab
- 2013
Martha and Ash Part I: Digital Revival and Human Likeness in Software
At some point in the near future, Martha’s husband Ash dies in a car accident. To help Martha through the grieving process, her friend Sara gives Ash’s data to a company that creates an artificial intelligence program to simulate text and phone conversations between Martha and Ash. Through the chatbot, Ash essentially goes on living: he is able to respond to Martha and to grow as more memories are shared with the program (a toy sketch of this idea appears after the questions below).
How should programs like this be deployed? Who should be in charge of them? Do our online interactions capture our entire personality? Could this be validly used for therapy purposes, or is any existence of such software dangerous? Is it ethical to provide such a tangible way of disconnecting from reality, and are these interactions truly all that different from something like social media interactions?
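As a rough picture of how a service like this could bootstrap a persona from someone’s message history, the toy sketch below answers a new message with the stored reply whose original prompt it most resembles, and “grows” as more memory pairs are added. Real systems would rely on trained language models; every name and the similarity measure here are assumptions made purely for illustration.

```python
# Toy, purely illustrative sketch: the deceased person's message history is
# mined for (prompt, reply) pairs, and an incoming message is answered with
# the reply whose original prompt it most resembles.


def jaccard(a: str, b: str) -> float:
    """Word-overlap similarity between two short texts."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0


class MemoryChatbot:
    def __init__(self):
        self.pairs: list[tuple[str, str]] = []  # (what was said to Ash, what Ash replied)

    def add_memory(self, prompt: str, reply: str) -> None:
        """Sharing more memories grows the pool of possible responses."""
        self.pairs.append((prompt, reply))

    def respond(self, message: str) -> str:
        if not self.pairs:
            return "I don't know what to say yet."
        _, best_reply = max(self.pairs, key=lambda p: jaccard(message, p[0]))
        return best_reply


if __name__ == "__main__":
    bot = MemoryChatbot()
    bot.add_memory("how was your day", "Long, but better now that I'm talking to you.")
    bot.add_memory("do you miss the sea", "Every single day.")
    print(bot.respond("how was your day today"))
```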