Fairness and Non-discrimination (56)
Find narratives by ethical themes or by technologies.
- 10 min
- The Atlantic
- 2014
How Self-Tracking Apps Exclude Women
When the Apple Health app was first released, it lacked one crucial component: the ability to track menstrual cycles. This exclusion of women from the design of technology is the rule rather than the exception, and it stems from the gender imbalance in technology workplaces, especially at the level of design. Communities such as the Quantified Self offer spaces that help combat this exclusionary culture.
In what ways are women being left behind by personal data tracking apps, and how can this be fixed? How can design strategies and institutions in technology development be inherently sexist? What will it take to ensure that glaring omissions such as this one do not recur in future products? How can apps that track and promote certain behaviors avoid being patronizing or patriarchal?
-
- 5 min
- Indie Wire
- 2021
How Black Storytellers Are Using XR and Afro-Futurism to Explore Ancestral Identity
New virtual exhibits built with WebXR, or Extended Reality delivered through web browsers, allow Black artists and creators to present ancestral knowledge and stories while providing a new basis on which AI could be trained. This use of AI fosters an imagination free of the colonial or racist constructs that may otherwise be present in digital media.
How do artificial intelligence and augmented reality open doors for the expression of minority voices? How can digital art be used to make a specific statement or call for a cultural shift? What are the benefits of applying wisdom from across the globe, and from before the digital age, to the design and deployment of digital technologies?
-
- 10 min
- Gizmodo
- 2021
Developing Algorithms That Might One Day Be Used Against You
Physicist Brian Nord, who came to deep learning algorithms through his research on the cosmos, warns that algorithms developed without proper ethical sensibility can do more harm than good. An “a priori,” or proactive, approach to instilling ethical sensibility in AI, whether through review institutions or the ethical education of developers, is needed to guard against privileged populations using algorithms to maintain hegemony.
What would an ideal algorithmic accountability organization or process look like? What specific ethical areas should AI developers study before creating their algorithms? How can algorithms or other programs created for one context, such as scientific research or learning, be misused in other contexts?
-
- 7 min
- Wall Street Journal
- 2021
Google Built the Pixel 6 Camera to Better Portray People With Darker Skin Tones. Does It?
Google’s new Pixel 6 smartphone claims to have “the world’s most inclusive camera” based on its purported ability to more accurately reflect darker skin tones in photographs, a form of digital justice notably absent from previous iterations of computational photography across the phones of various tech monopolies.
How can “arms races” between different tech monopolies potentially lead to positive innovations, especially those that center equity? Why did it take so long to have a more inclusive camera? How can a camera be exclusive?
-
- 7 min
- The Verge
- 2020
What a machine learning tool that turns Obama white can (and can’t) tell us about AI bias
PULSE is an algorithm that supposedly determines what a face looks like from a pixelated image. The problem: more often than not, the algorithm returns a white face, even when the person in the pixelated photograph is a person of color. Rather than actually sharpening the image, the algorithm generates a synthetic face whose downscaled version matches the pixel pattern. It is these synthetic faces that show a clear bias toward white people, illustrating how thoroughly institutional racism makes its way into technological design. Diversifying data sets will therefore not fully help until broader solutions for combating bias are enacted.
What potential harms could you see from the misapplication of the PULSE algorithm? What sorts of bias-mitigating solutions besides more diverse data sets could you envision? Based on this case study, what sorts of real-world applications should facial recognition technology be trusted with?
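The summary above describes PULSE's strategy only at a high level. The minimal sketch below illustrates the general idea of searching a face generator's latent space for a synthetic face whose downscaled version matches the pixelated input; it is not the actual PULSE implementation, and the `generator` object (with a `latent_dim` attribute) is a hypothetical stand-in for a pretrained face model. Any demographic skew in that model's training data is exactly what the search reproduces.

```python
# Conceptual sketch (not the PULSE code): instead of "enhancing" the
# low-resolution photo, search a face generator's latent space for a
# synthetic face that, once downscaled, matches the pixelated input.
import torch
import torch.nn.functional as F

def reconstruct_face(lowres, generator, steps=500, lr=0.05):
    """Find a latent vector whose generated face downscales to `lowres`."""
    latent = torch.randn(1, generator.latent_dim, requires_grad=True)
    optimizer = torch.optim.Adam([latent], lr=lr)
    for _ in range(steps):
        face = generator(latent)                         # synthetic high-res face
        downscaled = F.interpolate(face, size=lowres.shape[-2:], mode="bilinear")
        loss = F.mse_loss(downscaled, lowres)            # match only the pixel pattern
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    # The result is whatever face the generator most readily produces that fits
    # the blurry constraint -- if its training data skewed white, so will the
    # "reconstruction," regardless of who was actually photographed.
    return generator(latent).detach()
```

Because the loss only rewards matching the downscaled pixels, many different faces satisfy it equally well; which one the search lands on is decided by the generator's learned prior, which is where the bias enters.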
-
- 7 min
- New York Times
- 2018
Facial Recognition Is Accurate, if You’re a White Guy
This article details Joy Buolamwini's research on racial bias coded into algorithms, specifically facial recognition programs. When auditing facial recognition software from several large companies, such as IBM and Face++, she found that the systems were far worse at correctly identifying darker-skinned faces. Overall, this reveals that facial analysis and recognition programs need external systems of accountability.
What does external accountability for facial recognition software look like, and what should it look like? How and why does racial bias get coded into technology, whether explicitly or implicitly?
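The kind of audit described in the summary above reports a model's accuracy separately for each demographic subgroup rather than as a single overall number. The snippet below is a minimal sketch in that spirit, not the actual Gender Shades code: the DataFrame columns (`image_path`, `skin_type`, `true_gender`) and the `predict` callable standing in for a commercial face-analysis API are hypothetical placeholders.

```python
# Minimal sketch of a disaggregated audit: compare error rates across
# subgroups instead of reporting one aggregate accuracy figure.
import pandas as pd

def audit_by_group(df: pd.DataFrame, predict) -> pd.DataFrame:
    """Report accuracy separately for each skin-type / gender subgroup."""
    df = df.copy()
    df["predicted"] = df["image_path"].map(predict)       # query the model per image
    df["correct"] = df["predicted"] == df["true_gender"]
    report = (
        df.groupby(["skin_type", "true_gender"])["correct"]
          .agg(accuracy="mean", n="size")
          .reset_index()
    )
    report["error_rate"] = 1 - report["accuracy"]
    # The gap between the best- and worst-served subgroups, not the overall
    # accuracy, is the headline number an audit like this surfaces.
    return report
```

Reporting the per-group gap is the point of the exercise: a system can score well in aggregate while failing one subgroup badly, which is precisely what the audits described in the article found.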