Digital Censorship (5)
Government’s evolving ability to silence dissenting voices using technology.
- 7 min
- Kinolab
- 2017
Marie and Sara Part I: Helicopter Parenting and Child Development
Single mother Marie pays to have Arkangel, a brain-computer interface, installed in her daughter Sara. With this implant, Marie can not only track Sara’s location at all times but also access a feed of the audiovisual data Sara is experiencing at any moment. Marie also has the power to censor this sensory input, controlling what Sara sees and hiding stressful stimuli from her view. This eventually has negative impacts on Sara’s psychology and social life.
Should parents be allowed to use digital technology to censor or filter the world that surrounds their child? How can parental controls decrease exposure and affect children’s development? How would this technology negatively impact the psychology and societal expectations surrounding parenting? Should anyone ever have the power to edit someone’s brain, and thus their perception of reality? In what ways do digital technologies make it easier or harder to shelter children?
-
- 2 min
- Kinolab
- 2019
Personal Control over Memories
In an imagined future of London, citizens all across the globe are connected to the Feed, a device and network accessed constantly through a brain-computer interface. Tom, the son of the Feed’s creator Lawrence, realizes that his father had deleted some of his childhood memories from the device in his brain, and that he has therefore lost all access to them. For further insights into technology and the nature of parent-child relationships, see the narratives “Marie and Sara Parts I and II.”
What rights do parents have over the minds and bodies of their children? Should parents ever be able to alter the memories of their children, even if this is supposedly for their own good? What are the consequences of the externalisation of memory through digital technology? How should children be able to give consent for alterations to technological implants?
-
- 16 min
- Kinolab
- 2004
Digital Memory Erasure and Brain Mapping
Joel Barish recently broke up with Clementine, his girlfriend of two years, in a brutal argument. After discovering that she has used a procedure known as Lacuna to erase him from her memories, Joel decides to undergo the same procedure to forget that he ever knew Clementine. The procedure uses a brain-computer interface to map the areas of Joel’s brain that are active whenever he has a memory of Clementine, first when he is awake and using associated objects to perform active recall, and then when he is asleep and subconsciously remembering her. Despite Joel’s eventual regrets and desperate attempts to remember Clementine, the procedure is successful, and he forgets her. However, Joel and Clementine reunite in the real world after their respective procedures, and as they attempt a fresh start, they end up listening to the tape Clementine recorded before her procedure, in which she dissects all of Joel’s flaws and the flaws of their relationship.
Is it possible to completely forget an event or a person in the digital age, or is there always the possibility that traces will remain? Do digital technologies hold memories well enough, or is there something more abstract about these memories that they cannot capture? How could the technology displayed here be abused? Does pervasive digital memory of people and events ever allow us to feel completely neutral about another person, and is this a departure from the pre-digital age? Do humans have an over-reliance on digital memory? How have relationships changed with the advent of digital memory?
-
- 4 min
- Reuters
- 2020
From hate speech to nudity, Facebook’s oversight board picks its first cases
Facebook has a new independent Oversight Board to help moderate content on the site, selecting individual cases from the many submitted to it and deciding whether the content in question should be removed. The cases typically involve hate speech, “inappropriate visuals,” or misinformation.
How much oversight do algorithms or networks with a broad impact need? Who needs to be in the room when deciding what an algorithm or site should or should not allow? Can algorithms be designed to detect and remove hate speech? Should such an algorithm exist?
-
- Wired
- 2021
Why a YouTube Chat About Chess Got Flagged for Hate Speech
A YouTube algorithm’s struggle to distinguish chess-related terms from hate speech and abuse has revealed shortcomings in artificial intelligence’s ability to moderate online hate speech. The incident reflects the need to develop digital technologies capable of processing natural language with a sufficient degree of social sensitivity.
Where do you draw the line between freedom of speech and the conduct rules and regulations of online communities? What problems do you think AI will encounter in moderating hate speech, such as slurs?