Machine Learning (83)
Find narratives by ethical themes or by technologies.
Martha and Ash Part I: Digital Revival and Human Likeness in Software
- 7 min
- Kinolab
- 2013
At some point in the near future, Martha’s husband Ash dies in a car accident. To help Martha through the grieving process, her friend Sara gives Ash’s data to a company that creates an artificial intelligence program to simulate text and phone conversations between Martha and Ash. Through the chatbot, Ash essentially goes on living: he responds to Martha and grows as more memories are shared with the program.
How should programs like this be deployed? Who should be in charge of them? Do our online interactions abstract our entire personality? Could this be validly used for therapy purposes, or is any existence of such software dangerous? Is it ethical to provide such a tangible way of disconnecting from reality, and are these interactions truly all that different from something like social media interactions?
Monster Match
- 15 min
- Hidden Switch
- 2018
A hands-on learning experience about the algorithms used in dating apps, explored through the perspective of a monster avatar that the player creates.
How do algorithms in dating apps work? What gaps seemed most prominent to you? What upset you most about the way this algorithm defined you and the choices it offered to you?
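One technique often discussed in this context, and the one Monster Match dramatizes, is collaborative filtering: recommending profiles based on the swipes of users who behave like you. The sketch below is a hypothetical, minimal illustration of that idea, not Monster Match’s code or any real app’s system; the swipe matrix and scoring are invented for demonstration.

```python
import numpy as np

# Rows are users, columns are profiles: 1 = swiped right, 0 = swiped left or unseen.
swipes = np.array([
    [1, 0, 1, 1, 0],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 0, 1],
    [1, 0, 0, 1, 0],
], dtype=float)

def recommend(user_idx, swipes, top_k=2):
    """Rank unseen profiles by how strongly users similar to user_idx liked them."""
    target = swipes[user_idx]
    # Cosine similarity between the target user and every user (including self).
    norms = np.linalg.norm(swipes, axis=1) * np.linalg.norm(target) + 1e-9
    sims = swipes @ target / norms
    sims[user_idx] = 0.0                  # ignore the user's similarity to themselves
    scores = sims @ swipes                # weighted "votes" from similar users
    scores[target > 0] = -np.inf          # hide profiles the user already liked
    return np.argsort(scores)[::-1][:top_k]

print(recommend(1, swipes))  # profiles favored by users who swipe like user 1
```

Because every recommendation leans on what similar users already liked, widely liked profiles accumulate still more exposure, which is one of the gaps the game invites players to notice.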
Coded Bias: How Ignorance Enters Computer Vision
- 3 min
- Vimeo: Shalini Kantayya
- 2020
A brief visual example of an application of computer vision for facial recognition, how these algorithms can be trained to recognize faces, and the dangers that come with biased data sets, such as one composed disproportionately of white men.
When thinking about computer vision in relation to projects such as the Aspire Mirror, what sorts of individual and systemic consequences arise for those whose faces biased computer vision programs do not easily recognize?
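The mechanism the clip points to, a training set skewed toward one demographic producing software that fails on everyone else, can be reproduced in miniature. The sketch below is entirely hypothetical: the "embeddings," group labels, and classifier are synthetic stand-ins rather than anything from the film or a real face-recognition pipeline, and it only shows how an imbalanced training set turns into an accuracy gap.

```python
# Toy demonstration: train a classifier on data that is 95% "group A"
# and measure accuracy separately per group. All data here is invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, flip):
    """Fake 'face embeddings' whose labeling rule differs slightly by group."""
    X = rng.normal(size=(n, 5))
    w = np.array([1.0, -1.0, 0.5, 0.0, 0.0])
    if flip:  # group B's signal lives in different features
        w = np.array([0.0, 0.0, 0.5, -1.0, 1.0])
    y = (X @ w > 0).astype(int)
    return X, y

# Skewed training set: 1,900 samples from group A, only 100 from group B.
Xa, ya = make_group(1900, flip=False)
Xb, yb = make_group(100, flip=True)
model = LogisticRegression(max_iter=1000).fit(
    np.vstack([Xa, Xb]), np.concatenate([ya, yb])
)

# Balanced held-out evaluation exposes the gap the training mix created.
for name, flip in [("group A", False), ("group B", True)]:
    Xt, yt = make_group(1000, flip)
    print(f"{name} accuracy: {model.score(Xt, yt):.3f}")
# Expect group A to score far higher than group B.
```

Rebalancing the training mix is the standard first response, which is where the next entry’s question about synthetic data picks up.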
These creepy fake humans herald a new age in AI
- 5 min
- MIT Technology Review
- 2021
The company Datagen serves as an example of a business which sells synthetic human faces (based on real scans) to other companies to use as training data for AI.
Does it seem likely that synthetic human data has the power to combat bias, or could it just introduce more bias? Does this represent putting too much trust in machines?
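To make that question concrete, the sketch below shows the rebalancing arithmetic synthetic data is supposed to provide. Everything in it is hypothetical: the "embeddings" are random vectors and the synthesis step is simple jitter, a crude stand-in for, not a description of, Datagen’s scan-based generation.

```python
# Hypothetical sketch of rebalancing a training set with synthetic samples.
import numpy as np

rng = np.random.default_rng(1)

real_majority = rng.normal(size=(1900, 128))   # plenty of majority-group samples
real_minority = rng.normal(size=(100, 128))    # very few minority-group samples

def synthesize(samples, n_new, noise=0.05):
    """Create n_new synthetic samples by perturbing randomly chosen real ones."""
    base = samples[rng.integers(0, len(samples), size=n_new)]
    return base + rng.normal(scale=noise, size=base.shape)

synthetic_minority = synthesize(real_minority, n_new=1800)
training_minority = np.vstack([real_minority, synthetic_minority])

print("before:", len(real_majority), "vs", len(real_minority))      # 1900 vs 100
print("after: ", len(real_majority), "vs", len(training_minority))  # 1900 vs 1900
```

The counts come out even, but every synthetic sample is still derived from one of only 100 real scans, which is one concrete way an attempt to combat bias can quietly re-import it.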
Facial Recognition Tech is Growing Stronger, Thanks to Your Face
- 10 min
- The New York Times
- 2019
Databases of people’s faces are being compiled without their knowledge by companies and researchers (including social media companies and dating sites), with many shared around the world, fueling the advancement of facial recognition technology.
How comfortable would you feel knowing that your face is in various databases and is being used, in some cases, to fuel machine learning algorithms? As of right now, Google and Facebook, which are said to have the largest facial databases of all, do not share their information, but might they? And what would happen if they did?
She Was Arrested at 14. Then Her Photo Went to a Biometrics Database
- 7 min
- The New York Times
- 2019
Biometric facial recognition software, specifically the software the NYPD uses with arrest photos, makes extensive use of children’s arrest photos despite a far lower accuracy rate on juvenile faces.
How can machine learning algorithms cause inequality to compound? Would it be better practice to try to make facial recognition equitable across all populations, or to abandon its use in law enforcement altogether, as some cities like Oakland have done?