Find narratives by ethical themes or by technologies.
Facial Recognition Is Accurate, if You’re a White Guy
- 7 min
- New York Times
- 2018
This article details Joy Buolamwini’s research on racial bias coded into algorithms, specifically facial recognition programs. When she audited facial recognition software from several large companies, including IBM and Face++, she found that the systems were far worse at correctly identifying darker-skinned faces. Overall, her findings show that facial analysis and recognition programs need external systems of accountability.
What does external accountability for facial recognition software look like, and what should it look like? How and why does racial bias get coded into technology, whether explicitly or implicitly?
What a machine learning tool that turns Obama white can (and can’t) tell us about AI bias
- 7 min
- The Verge
- 2020
PULSE is an algorithm that can supposedly determine what a face looks like from a pixelated image. The problem: more often than not, the algorithm returns a white face, even when the person in the pixelated photograph is a person of color. Rather than actually sharpening the image, the algorithm works by creating a synthetic face that matches the pixel pattern. It is these synthetic faces that demonstrate a clear bias toward white people, showing how thoroughly institutional racism makes its way into technological design. Thus, more diverse data sets will not fully help until broader solutions for combating bias are enacted.
What potential harms could you see from the misapplication of the PULSE algorithm? What sorts of bias-mitigating solutions besides more diverse data sets could you envision? Based on this case study, what sorts of real-world applications should facial recognition technology be trusted with?
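The mechanism described above — generating a plausible face whose downscaled pixels match the input, rather than recovering the original face — can be illustrated with a toy sketch. This is not the real PULSE (which searches the latent space of a pretrained StyleGAN); `generate_face` and `pulse_like_search` below are hypothetical stand-ins that only show the shape of the idea:

```python
import numpy as np

def downscale(img, factor):
    """Average-pool a square image by `factor` in each dimension."""
    h, w = img.shape
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def generate_face(z):
    """Stand-in 'generator': maps a 16-dim latent vector to an 8x8 'image'.
    The real PULSE uses a pretrained StyleGAN; this toy just tiles the latent."""
    grid = z.reshape(4, 4)
    return np.kron(grid, np.ones((2, 2)))  # upsample the 4x4 latent to 8x8

def pulse_like_search(low_res, n_candidates=500, seed=0):
    """Randomly search latent space for an output whose downscaled version
    best matches `low_res`. The result is *a* face consistent with the pixels,
    not a recovery of the original person's face."""
    rng = np.random.default_rng(seed)
    best, best_err = None, np.inf
    for _ in range(n_candidates):
        z = rng.uniform(0.0, 1.0, size=16)
        candidate = generate_face(z)
        err = np.sum((downscale(candidate, 4) - low_res) ** 2)
        if err < best_err:
            best, best_err = candidate, err
    return best, best_err
```

The bias the article describes lives in the search target: many distinct high-resolution faces downscale to the same pixelated input, and the search returns whichever one the generator most readily produces. If the generator's training data skews white, so do the "reconstructions", regardless of who was actually in the photo.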
How Self-Tracking Apps Exclude Women
- 10 min
- The Atlantic
- 2014
When the Apple Health app was first released, it lacked one crucial component: the ability to track menstrual cycles. This exclusion of women from accessible technology design is not the exception but the rule. It stems from problems inherent to the gender imbalance in technology workplaces, especially at the level of design. Communities such as the Quantified Self offer spaces that help combat this exclusionary culture.
In what ways are women being left behind by personal data tracking apps, and how can this be fixed? How can design strategies and institutions in technology development be inherently sexist? What will it take to ensure that glaring omissions such as this one do not occur in future products? How can apps that track and promote certain behaviors avoid being patronizing or patriarchal?
Cyberspace and Internet Imaginations: “Burning Chrome” by William Gibson
- 30 min
- University of Würzburg Graduate Schools
- 1982
Hardware specialist Automatic Jack is roped into a dangerous hacking scheme by his partner Bobby Quine while the two compete for the affections of Rikki. Their plan is to use deadly malware to break through the defenses of Chrome, a mysterious overlord of cyberspace who hoards massive amounts of wealth. They carry out this plan by entering cyberspace through a program and visualizing the data held within this digital network that connects people all across the globe.
How can malware be used for good, and when should it be? How do imaginations of the internet influence how people perceive this mysterious yet pervasive network? In what ways would making aspects of the internet into tangible images help people understand it better? How should the most powerful stakeholders in a given digital architecture be challenged? How might immersion in cyberspace give people more agency?
Robotic Proxies and Telepresence: “Different Seas” by Alastair Reynolds
- 20 min
- MIT Press
- 2018
Lilith, a contract laborer, ends up in a dangerous situation when the self-driving ship she is riding malfunctions. Kyleen, a human who has undergone a body-modifying networking process called “meshing,” is able to control a proxy robot via a brain-computer interface to help Lilith reach her destination safely.
How can robotic proxies help people in danger? Who, in theory, should be allowed or certified to operate them? How might they be implicated in inequitable class structures, as outlined in the story? Should humans be networked with machines, and would this really be to the ultimate benefit of humanity?
Implanted Technology and Disconnection
- 2 min
- Kinolab
- 2019
In an imagined future London, citizens all across the globe are connected to the Feed, a device and network accessed constantly through a brain-computer interface. In this narrative, Max, a citizen whose Feed was hacked, has to have the device removed from his body as his best friends watch. The procedure includes the removal of some of his memories from both his brain and the device, although they manage to upload these memories to a cloud.
What are the risks involved with brain-computer interfaces, especially when we need to remove them from our brains? How might this increase medical costs? How can memory and consciousness be “backed up” and “uploaded” back into our bodies using advanced technology?