Find narratives by ethical themes or by technologies.
-
- 4 min
- TechCrunch
- 2021
Social media allowed a shocked nation to watch a coup attempt in real time
On the day of the January 6th insurrection at the U.S. Capitol, social media proved to be a valuable tool for telling the narrative of the horrors taking place within the Capitol building. At the same time, social media plays a large role in political polarization, as users can end up on fringe sites where content is tailored to their beliefs and is not always true.
How can social media platforms be redesigned or regulated to crack down more harshly on misinformation and extremism? How much can social media be valued as a set of platforms that “help tell the true story of an event” when they also allow mass denial of objective fact? Who should be responsible for shutting down fringe sites, and how should this happen?
-
- 12 min
- Wired
- 2018
How Cops Are Using Algorithms to Predict Crimes
This video offers a basic introduction to the use of machine learning in predictive policing, and to how it disproportionately affects low-income communities and communities of color.
Should algorithms ever be used in a context where human bias is already rampant, such as in police departments? Why does the use of digital technologies to accomplish tasks in this age make a process seem more “efficient” or “objective”? What are the problems with police using algorithms whose inner workings they do not fully understand? Is the use of predictive policing algorithms ever justifiable?
-
- 3 min
- CNN
- 2021
Microsoft patented a chatbot that would let you talk to dead people. It was too disturbing for production
The wealth of social data about any given person, available through digital artifacts such as social media posts and text messages, can be used to train an algorithm patented by Microsoft to create a chatbot meant to imitate that specific person. The technology has not been released, however, due to its harrowing ethical implications of impersonation and dissonance. For the Black Mirror episode referenced in the article, see the narratives “Martha and Ash Parts I and II.”
How do humans control their identity when it can be replicated through machine learning? What sorts of quirks and mannerisms are unique to humans and cannot be replicated by an algorithm?
-
- 30 min
- UNIVERSITY OF WÜRZBURG GRADUATE SCHOOLS
- 1982
Cyberspace and Internet Imaginations: “Burning Chrome” by William Gibson
Hardware specialist Automatic Jack is roped into a dangerous hacking scheme by his partner Bobby Quine while the two compete for the affections of Rikki. Their plan is to use deadly malware to break through the defenses of Chrome, a mysterious overlord of cyberspace who hoards massive amounts of wealth. They enact the plan by entering cyberspace through a program and visualizing the data held within this digital network, which connects people all across the globe.
How can malware be used for good, and when should it be used for good? How do imaginations of the internet influence how people perceive this mysterious yet pervasive network? In what ways would making aspects of the internet into tangible images help people understand it better? How should the most powerful stakeholders in a given digital architecture be challenged? How might immersion into cyberspace give people more agency?
-
- 2 min
- azfamily.com
- 2018
Facial recognition technology now used in Phoenix area to locate lost dogs
Facial recognition technology has found a new application: reuniting dogs with their owners. A simple machine learning algorithm takes a photo of a dog and crawls through a database of photos of dogs in shelters in hopes of finding a match.
How could this beneficial use of recognition technology find even broader use?
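The article does not describe the matcher's internals, so the following is only a minimal sketch of the kind of photo-matching it reports: embed each photo as a feature vector, then return the shelter photo most similar to the query. The `embed` function here is a flattening stand-in, not the real system's model; a production matcher would use a network trained on dog faces.

```python
import numpy as np

def embed(photo: np.ndarray) -> np.ndarray:
    """Stand-in feature extractor: flatten and L2-normalize the image.
    (A real system would use a learned model, not raw pixels.)"""
    v = photo.astype(float).ravel()
    return v / np.linalg.norm(v)

def best_match(query: np.ndarray, shelter_photos: dict) -> str:
    """Return the ID of the shelter photo with the highest cosine
    similarity to the query photo's embedding."""
    q = embed(query)
    scores = {sid: float(q @ embed(p)) for sid, p in shelter_photos.items()}
    return max(scores, key=scores.get)

# Toy data: one shelter photo is a near-duplicate of the lost dog's photo.
rng = np.random.default_rng(0)
lost_dog = rng.random((8, 8))
shelter = {
    "shelter-dog-1": rng.random((8, 8)),
    "shelter-dog-2": lost_dog + 0.01 * rng.random((8, 8)),  # near-duplicate
}
print(best_match(lost_dog, shelter))  # → shelter-dog-2
```

The same nearest-neighbor structure underlies most recognition systems; only the embedding model changes between matching dogs, faces, or products.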
-
- 7 min
- The Verge
- 2020
What a machine learning tool that turns Obama white can (and can’t) tell us about AI bias
PULSE is an algorithm that can supposedly determine what a face looks like from a pixelated image. The problem: more often than not, the algorithm returns a white face, even when the person in the pixelated photograph is a person of color. The algorithm works by creating a synthetic face that matches the pixel pattern, rather than actually sharpening the image. It is these synthetic faces that show a clear bias toward white people, demonstrating how thoroughly institutional racism makes its way into technological design. Thus, more diverse data sets will not fully help until broader solutions for combating bias are enacted.
What potential harms could you see from the misapplication of the PULSE algorithm? What sorts of bias-mitigating solutions besides more diverse data sets could you envision? Based on this case study, what sorts of real-world applications should facial recognition technology be trusted with?
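The search-by-synthesis idea described above can be sketched in a few lines. This is a toy illustration, not PULSE itself: the candidate pool below stands in for a GAN generator (the real system searches StyleGAN's latent space with gradient descent), but it shows why the output reflects whatever the candidate pool contains rather than the person actually in the photo.

```python
import numpy as np

def downscale(img: np.ndarray, factor: int = 4) -> np.ndarray:
    """Average-pool the image by `factor` along each dimension."""
    h, w = img.shape
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def pulse_like_search(lowres: np.ndarray, candidates: list) -> np.ndarray:
    """Return the candidate face whose downscaled version best matches the
    low-res input. Bias enters here: if the candidate pool over-represents
    one group, the 'best match' will too, regardless of the original subject."""
    errors = [np.sum((downscale(c) - lowres) ** 2) for c in candidates]
    return candidates[int(np.argmin(errors))]

# Toy data: the true face downscales exactly to the observed pixelated input.
rng = np.random.default_rng(1)
true_face = rng.random((16, 16))
observed = downscale(true_face)  # the pixelated photo we start from
pool = [rng.random((16, 16)) for _ in range(5)] + [true_face]
print(np.allclose(pulse_like_search(observed, pool), true_face))  # → True
```

Note that many distinct high-resolution faces downscale to the same low-res image, which is why "matching the pixels" is no guarantee of recovering the right person.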