Machine Learning (83)
Why face recognition isn’t scary — yet
- 5 min
- CNN
- 2010
Algorithms and machines can struggle with facial recognition and need ideal source images to perform it consistently. However, the technology’s potential use in monitoring and identifying citizens is concerning.
How have worries about facial recognition changed since 2010? Can we teach machines to identify human faces? How can facial recognition pose a danger when used for governmental purposes?
How Biometrics Makes You Safer
- 5 min
- The New York Times
- 2019
In New York City, biometrics were used as one step in the investigation process, combined with human oversight, to help identify criminals and victims alike.
How does facial recognition technology facilitate challenging investigations? Do you believe police use of facial recognition is as transparent and benign as this article makes it seem? Where could bias enter this system of using facial recognition technology?
Survival of the Best Fit
- 10 min
- Survival of the Best Fit
- 2018
Explores hiring bias in AI through a game in which you play the hiring manager.
How does it feel to be in the situation in which you have introduced the bias into the algorithm? What steps do you feel must be taken to ensure algorithms are trained in a less hasty manner?
Prototypes, Evolution, and Replacement with Robots
- 13 min
- Kinolab
- 2020
George Almore is an engineer at a company that hopes to achieve singularity with robots, making their artificial intelligence one step beyond real humans. To that end, he works with three prototypes, J1, J2, and J3, each more advanced than the last. Simultaneously, he plans to upload his dead wife’s consciousness into the J3 robot in order to extend her life. The narrative begins with him explaining his goal to J3 as the robot goes through taste and emotion tests. Eventually, J3 evolves into a humanoid robot who takes on the traits of George’s wife, leaving the earlier two versions, who share a sibling-like bond with each other, feeling neglected.
Are taste and emotion necessary elements of creating advanced AI? If so, why? What good do these abilities serve in terms of the AI’s relationship to the human world? Is it right to transfer consciousness, or elements of consciousness, from a deceased person into one or several AI? In the AI, how much similarity to a pre-existing person is too much? Can total similarity ever be achieved, and how? Can advanced AI feel negative human emotions and face mental health problems such as depression? Is it ethical to program AI to feel such emotions, knowing the risks associated with them, including bonding with former or flawed prototypes of itself? If an AI kills itself, does the onus fall on the machine or the human creator?
Martha and Ash Part II: Digital Revival and Human Likeness in Hardware
- 9 min
- Kinolab
- 2013
At some point in the near future, Martha’s husband Ash dies in a car accident. To help Martha through the grieving process, her friend Sara gives Ash’s data to a company that creates an artificial intelligence program to simulate text and phone conversations between Martha and Ash. Eventually, this program is uploaded onto a robot with the exact likeness of the deceased Ash. Creeped out by the humanoid robot and its imprecision in capturing Ash’s personality, Martha wants nothing more than to keep it out of her sight.
How can memories be kept pure when robots are able to impersonate deceased loved ones? If programs and robots such as this can be created, do we truly own our own existence? How can artificial intelligence fail as therapy or companionship? Can artificial intelligence and robotics help comfort people who never even met the deceased? How should an artificial companion be handled by its administrator? Can an animated or robotic humanoid likeness of a person who seemingly has feelings be relegated to the attic as easily as other mementos can?
Digitally Reproducing Humans and “Possession”
- 3 min
- Kinolab
- 2019
In an imagined future London, citizens across the globe are connected to the Feed, a device and network accessed constantly through a brain-computer interface. In this narrative, Tom and Ben discover that their father Lawrence, the creator of the Feed, harvested the Feeds of dead people and used the data stored therein to upload their consciousnesses, including memories and emotions, into a cloud. Using Lawrence’s creation of digital consciousnesses as “training data,” an AI was able to generate many more digital consciousnesses of non-real people. These consciousnesses can then “possess” human bodies by being uploaded to the Feed devices implanted in real people’s brains.
Do we as humans need the physical world and our bodies, or can we successfully transfer or upload consciousness and live only in the digital space? Is consciousness the same thing as a soul, or different? Are the people discussed in the clip human, AI, or something in between? What are the far-reaching consequences of AI potentially being able to create realistic consciousnesses? How can brain-computer interfaces implanted in a person lead to a complete loss of their autonomy? Can and should humans choose to donate their consciousnesses, memories, or emotions to science, knowing ultimately little about how these may be deployed?