Fairness and Non-discrimination (56)
Find narratives by ethical themes or by technologies.
- 10 min
- MIT Technology Review
- 2020
We read the paper that forced Timnit Gebru out of Google. Here’s what it says.
This article explains Timnit Gebru’s ethical warnings against training large language models for Natural Language Processing on massive sets of text scraped from the internet. Not only does this process have a negative environmental impact, but the resulting models also still fail to capture semantic nuance, especially as it relates to burgeoning social movements or to countries with lower internet access. Dr. Gebru’s refusal to retract this paper ultimately led to her dismissal from Google.
How should the models used to train NLP algorithms be more closely scrutinized? What sorts of voices are needed at the design table to ensure that the impact of such algorithms is consistent across all populations? Can this ever be achieved?
-
- 7 min
- Slate
- 2021
Maine Now Has the Toughest Facial Recognition Restrictions in the U.S.
A law passed unanimously in Maine heavily restricts the contexts in which facial recognition technology can be deployed, putting significant guardrails around its use by law enforcement, and allows citizens to sue if they believe the technology has been misused. This step is unique at a time when most levels of government, up to and including the federal government, remain reluctant to attach strict rules to the use of facial recognition technology, despite its well-documented bias.
How can tech companies do even more to lobby for stricter facial recognition regulation? Is a moratorium on facial recognition use by all levels of government the best plan? Why or why not? Does creating “more diverse datasets” truly solve all the problems of bias with the technology?
-
- 5 min
- Kinolab
- 2019
VR Intimacy and Objectification
In an imagined future London, citizens across the globe are connected to the Feed, a device and network accessed constantly through a brain-computer interface. In this narrative, Ben, a member of the family that owns the company behind the Feed, uses its augmented reality features to create a virtual version of his ex-wife, Miyu, whom he can make indulge his fantasies, whatever those may be. Eventually, this digital version of Miyu starts to glitch, but Ben nonetheless begins to share the virtual, subservient clone with other people for use in their own fantasies.
How are women deprived of autonomy when men are able to control virtual versions of them in their own digital fantasies? How exactly would the consequences of this spill over into the real world? Is it ethical to use someone’s image and likeness for private purposes without their consent? How can we ‘copyright’ our own image?
-
- 13 min
- Kinolab
- 2016
Hidden Figures Part I: Goals of Equity and Women of Color in the Workplace
“Hidden Figures” chronicles the journeys of Katherine Johnson (Taraji P. Henson), Dorothy Vaughan (Octavia Spencer), and Mary Jackson (Janelle Monáe), three Black women who worked on the space missions at the Langley Research Center in Hampton, Virginia in 1961. All three women persist against segregation and abject racism as they climb the ladder and make important contributions to the space program. While Katherine becomes the first Black woman on Al Harrison’s Space Task Group, Mary Jackson pursues her dream of becoming an engineer at NASA by petitioning to take courses at an all-white school, and Dorothy Vaughan attempts to learn the programming language Fortran to ensure that she and her fellow human computers are not replaced by the newest IBM 7090 computer.
How is the history of the oppression of Black people in America responsible for a lack of diversity in present-day workplaces, including those involving science and technology? What do technology companies today need to consider in order to ensure that their workforce is diverse and equitable? What does the specific case of Dorothy initially being denied access to the Fortran book reveal about minority groups’ past and present access to fluency in digital technologies? What needs to happen inside and outside of the technology industry to ensure better opportunities for women of color in technology-focused workplaces?
-
- 7 min
- Vice
- 2019
This Horrifying App Undresses a Photo of Any Woman With a Single Click
A programmer created an application that uses neural networks to remove clothing from images of women. Deepfake technology is being used systematically against women, despite the continued narrative that its use in the political realm is the most pressing issue.
How does technology enhance violations of sexual privacy? Who should regulate this technology, and how?
-
- 15 min
- Hidden Switch
- 2018
Monster Match
A hands-on learning experience about the algorithms used in dating apps, explored through the perspective of a monster avatar that the player creates.
How do algorithms in dating apps work? What gaps seemed most prominent to you? What upset you most about the way this algorithm defined you and the choices it offered to you?