Bioinformatics (86)
Find narratives by ethical themes or by technologies.
- 4 min
- Kinolab
- 1995
Identity Through Memory and Data
In this world, a human consciousness (“ghost”) can inhabit an artificial body (“shell”), making augmented humans in largely robotic bodies commonplace. Major, a security officer, watches a garbage collector grieve upon learning that his ghost has been hacked and implanted with false memories of a family. The incident prompts Major’s own reflections on self-identity later in the film, especially as she begins to suspect that she may be entirely a cyborg with no knowledge of such an existence. Essentially, because the human body has become so thoroughly and routinely augmented with cybernetic parts and even computer brains, defining a real “human” becomes harder and harder.
If robots develop to the point where they can question their own existence as human, does the line between robot and human truly matter? For what reason? Is questioning one’s own existence a fundamentally human trait? Can fake memories contribute to an identity as much as real ones? Is this a dangerous concept, or might it have positive utility? Do you agree with the assessment that “all data is just fantasy,” that is, an inaccurate abstraction of real life? What kinds of data, then, make up the human identity?
-
- 15 min
- Kinolab
- 1993
Technological Revival of the Past
Dinosaurs, an extinct species, are revived and brought into the modern day in Jurassic Park. This is accomplished through a cloning process: dinosaur DNA is extracted from mosquitos preserved in amber, and computational genomics is used to engineer clones with chosen properties, such as an all-female population. Three scientists are sent to audit the park, and all three find problems inherent in using technology to attempt to control life itself. Eventually, the park’s founder, John Hammond, admits that his idea to create entertainment out of this dangerous technological revival was a failure, one seen in action during the subsequent dinosaur attack.
Is using computational genomics to alter the course of nature and natural selection itself inherently wrong? Are there contexts where this may be helpful or necessary? How should technology be used to tell the story of the past, and what limits should exist in this prospect? How can technological idealists like John Hammond be checked before their innovations lead to disaster?
-
- 14 min
- Kinolab
- 1973
Technology and Educational Inequalities
On a faraway planet, kidnapped humans known as Oms live as an inferior race to the Draggs, giant blue aliens who either keep the Oms as pets or banish them to the wilds to be consumed by extraterrestrial monsters. One of these Oms, Terr, is the pet of Tiwa and begins to acquire an education through a malfunction of Tiwa’s brain-computer interface, which beams knowledge directly into her head. Terr eventually uses this cutting-edge technology, to which Oms do not usually have access, to spread knowledge to other Oms and spark a revolt.
How can access to technology determine the quality of education that a certain person or group receives? How are people with less technological access or fluency somewhat at the mercy of those with more? How can educational technologies be made more equitable?
-
- 7 min
- Kinolab
- 2008
Selling Digitized Memories
Under threat of eviction, Luz must find a quick way to make money to pay rent. Through the company TruNode, she can digitize her memories and sell them on the internet for anyone to access and stream. While this seems convenient, the downsides become clear when the repository of her memories is used to help ruthless drone pilot Rudy Ramirez hunt down Memo, an innocent laborer branded a dangerous criminal. When Luz reveals this means of making money to Memo, he is less than enthused with the system.
How can the high market value of deeply personal data and digital memories be empowering in the right circumstances and disempowering in the wrong ones? What if people were able to sell all of their personal data, as is shown here? Is the complete digitization of memory a positive concept or a negative one? How can data or memory be purchased for nefarious purposes? How can people be unintentionally harmed by this system? Can the emotions of memories ever be paired well with a digital interface?
-
- 51 min
- TechCrunch
- 2020
Artificial Intelligence and Disability
In this podcast, several disability experts discuss the evolving relationship between disabled people, society, and technology. The main point of discussion is the difference between the medical and societal models of disability: the medical lens tends to spur technologies focused on remedying disability in individuals, whereas the societal lens could spur technologies that lead to a more accessible world. Artificial intelligence and machine learning are labelled inherently “normative,” since they are trained on data that comes from a biased society and are therefore less likely to work in favor of a social group as varied as disabled people. There is a clear need for institutional change in the technology industry to address these problems.
What are some problems with injecting even the most unbiased of technologies into a system biased against certain groups, including disabled people? How can developers aim to create technology which can actually put accessibility before profit? How can it be ensured that AI algorithms take into account more than just normative considerations? How can developers be forced to consider the myriad impacts that one technology may have on large heterogeneous communities such as the disabled community?
-
- 16 min
- Kinolab
- 2004
Digital Memory Erasure and Brain Mapping
Joel Barish recently broke up with Clementine, his girlfriend of two years, in a brutal argument. After discovering that she has used a procedure known as Lacuna to erase him from her memories, Joel decides to undergo the same procedure to forget that he ever knew Clementine. The procedure uses a brain-computer interface to map the areas of Joel’s brain that are active whenever he has a memory of Clementine, first while he is awake and using associated objects to perform active recall, and then while he is asleep and subconsciously remembering her. Despite Joel’s eventual regrets and desperate attempts to remember Clementine, the procedure is successful, and he forgets her. However, Joel and Clementine reunite in the real world after their respective procedures, and just as they get a fresh start, they end up listening to Clementine’s tape from before the procedure, in which she dissects all of the flaws of Joel and their relationship.
Is it possible to completely forget an event or a person in the digital age, or is there always the possibility that traces will remain? Do digital technologies hold memories well enough, or is there something more abstract about these memories that they cannot capture? How could the technology displayed here be abused? Does pervasive digital memory of people and events ever allow us to feel completely neutral about another person, and is this a departure from the pre-digital age? Do humans have an over-reliance on digital memory? How have relationships changed with the advent of digital memory?