Networking, Capital, and Cloud Computing (58)
Find narratives by ethical themes or by technologies.
- 10 min
- New York Times
- 2019
As Cameras Track Detroit’s Residents, a Debate Ensues Over Racial Bias
Racial bias in facial recognition software used for government civil surveillance in Detroit. Racially biased technology diminishes the agency of minority groups and amplifies latent human bias.
What are the consequences of employing biased technologies to surveil citizens? Who loses agency, and who gains it?
-
- 11 min
- Kinolab
- 2015
Mars Rescue Part II: Global Alliances and Human Connection
During a manned mission to Mars, astronaut Mark Watney is presumed dead after a fierce storm and left behind by his crew. But Watney has survived and finds himself stranded, alone, on the hostile planet. With only meager supplies, he must draw upon his ingenuity, wit, and spirit to subsist and find a way to signal to Earth that he is alive. Communication between Earth and space happens primarily through data-streaming methods such as video chats and satellite broadcasts. In the second part of this narrative, countries across the globe, specifically the U.S. and China, work together to engineer a plan to get Mark Watney back aboard the Hermes. Despite complications, Mark is eventually reunited with his crew.
Does space travel and exploration seem like a good use of scientific and technological capital, or is it too dangerous a frontier to expend so many technological resources on? How is the development of long-distance data streaming depicted positively here? How does technological innovation have the potential to spur global alliances? Is spending significant time and money on technology and innovation worthwhile if it leads to global cooperation?
-
- 13 min
- Kinolab
- 2002
Preventative Policing and Surveillance Information
In the year 2054, the PreCrime police program is about to go national. At PreCrime, three clairvoyant humans known as “PreCogs” forecast future murders by streaming audiovisual data that reveals the surrounding details of each crime, including the names of the victims and perpetrators. Although there are no cameras, the implication is that anyone can be under constant surveillance by this program. Once the “algorithm” has gleaned enough data about the future crime, officers move out to stop the murder before it happens.
How will predicted crimes be prosecuted? Should they be prosecuted at all? How could technologies such as the ones shown here be corrupted by human bias? How might these devices make racist policing practices even worse? Would certain communities be targeted? Is there ever any justification for constant civil surveillance?
-
- 9 min
- Kinolab
- 2002
Trusting Machines and Variable Outcomes
In the year 2054, the PreCrime police program is about to go national. At PreCrime, three clairvoyant humans known as “PreCogs” forecast future murders by streaming audiovisual data that reveals the surrounding details of each crime, including the names of the victims and perpetrators. Although there are no cameras, the implication is that anyone can be under constant surveillance by this program. Once the “algorithm” has gleaned enough data about the future crime, officers move out to stop the murder before it happens. In this narrative, the PreCrime program is audited, and the officers must explain the ethics and philosophies behind their systems. After Captain John Anderton is accused of a future crime, he flees and learns of “minority reports”: instances of disagreement between the PreCogs that the department covers up to make the justice system seem infallible.
What are the problems with treating the results of computer algorithms as infallible or entirely objective? How are such systems prone to bias, especially when two different algorithms might make two different predictions? Is there any way that algorithms could make the justice system more fair? How might humans manipulate the results of a predictive crime algorithm to serve themselves? Should a technology such as a crime-prediction algorithm be made more transparent to its users and the general public, so that people do not trust it with a religious sort of fervor?
-
- 9 min
- Kinolab
- 2016
Community and Belonging
Eleanor Shellstrop, a selfish woman, ends up in the utopian afterlife, the Good Place, by mistake after her death. She spins an elaborate web of lies to ensure that she is not sent to be tortured in the Bad Place. In this narrative, the demons of the Bad Place try to wrest Eleanor’s soul away from the Good Place by convincing her that the Bad Place is where she truly belongs. This resonates with Eleanor, who was always a lone wolf and never found a community of people she liked. Ultimately, though, she fights to stay in the Good Place out of fondness for the community of people she has come to know there.
Can our desire to be better outweigh our past actions? How do digital technologies help people find communities where they feel they belong? Does the intention to improve as a person matter just as much as actually improving as a person?
-
- 6 min
- Kinolab
- 2019
Consent and Control with Personal Data
In an imagined future London, citizens across the globe are connected to the Feed, a device and network accessed constantly through a brain-computer interface. Kate Hatfield, a new mother, discovers that someone has hacked the device in her head and was thus able to access some of her lived memories. The culprit is later revealed to be her father-in-law Lawrence, who was attempting to implant the Feed into Bea, the new baby.
What are the dangers of “backing up” memory to a cloud account? What risks are posed by hackers and by the corporations that run such backup services? Is there something special about the transient, temporary nature of human memory that should remain as it is? How much of our privacy are we willing to sacrifice for safety and connectivity? How should consent work when installing a brain-computer interface in a person? Should a parent or other family member be able to decide this for a child?