News Article (130)
- 7 min
- The Verge
- 2020
What a machine learning tool that turns Obama white can (and can’t) tell us about AI bias
PULSE is an algorithm that supposedly reconstructs what a face looks like from a pixelated image. The problem: more often than not, the algorithm returns a white face, even when the person in the pixelated photograph is a person of color. Rather than actually sharpening the image, the algorithm generates a synthetic face that matches the pixel pattern (see the sketch below). It is these synthetic faces that show a clear bias toward white people, demonstrating how thoroughly institutional racism makes its way into technological design. Thus, more diverse data sets will not fully help until broader solutions for combating bias are enacted.
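The article contains no code, but a minimal sketch of the idea described above might look like the following: instead of "un-pixelating" the photo, search a face generator's latent space for a synthetic high-resolution face whose downscaled version matches the low-resolution input. The `generator` callable, the `reconstruct_face` helper, and all parameters here are illustrative placeholders assumed for the sketch, not PULSE's actual implementation.

```python
# Sketch of a PULSE-style reconstruction: optimize a latent code so that the
# generated face, once downscaled, matches the pixelated input.
import torch
import torch.nn.functional as F

def reconstruct_face(lowres, generator, latent_dim=512, steps=500, lr=0.05):
    """Search for a latent code whose generated face downscales to `lowres`.

    lowres:    tensor of shape (1, 3, h, w), the pixelated input photo
    generator: callable mapping a (1, latent_dim) latent to a (1, 3, H, W) face
               (a pretrained face GAN is assumed to exist; none is provided here)
    """
    latent = torch.randn(1, latent_dim, requires_grad=True)
    optimizer = torch.optim.Adam([latent], lr=lr)

    for _ in range(steps):
        optimizer.zero_grad()
        face = generator(latent)                          # synthetic high-res face
        down = F.interpolate(face, size=lowres.shape[-2:],
                             mode="bilinear", align_corners=False)  # shrink to input size
        loss = F.mse_loss(down, lowres)                   # only the pixel pattern is compared
        loss.backward()
        optimizer.step()

    # The output is whatever face the generator finds plausible: if its training
    # data skews white, so will the reconstruction, regardless of who is
    # actually in the low-resolution photo.
    return generator(latent).detach()
```

Because the loss only compares downscaled pixels, many different high-resolution faces are consistent with the same input; which one the search lands on depends entirely on the generator and the data it was trained on, which is where the bias described in the article enters.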
What potential harms could you see from the misapplication of the PULSE algorithm? What sorts of bias-mitigating solutions besides more diverse data sets could you envision? Based on this case study, what sorts of real-world applications should facial recognition technology be trusted with?
-
- 7 min
- Wall Street Journal
- 2021
Google Built the Pixel 6 Camera to Better Portray People With Darker Skin Tones. Does It?
Google claims its new Pixel 6 smartphone has “the world’s most inclusive camera,” based on its purported ability to render darker skin tones more accurately in photographs, a form of digital justice notably absent from previous iterations of computational photography across the phones of various tech monopolies.
How can “arms races” between different tech monopolies potentially lead to positive innovations, especially those that center equity? Why did it take so long to have a more inclusive camera? How can a camera be exclusive?
-
- 10 min
- Gizmodo
- 2021
Developing Algorithms That Might One Day Be Used Against You
Physicist Brian Nord, who came to deep learning through his research on the cosmos, warns that developing algorithms without proper ethical sensibility can lead to those algorithms doing more harm than good. Essentially, an “a priori,” or proactive, approach to instilling ethical sensibility in AI, whether through review institutions or the ethical education of developers, is needed to guard against privileged populations using algorithms to maintain hegemony.
What would an ideal algorithmic accountability organization or process look like? What specific ethical areas should AI developers study before creating their algorithms? How can algorithms or other programs created for one context, such as scientific research or learning, be misused in other contexts?
-
- 5 min
- Tech Crunch
- 2020
No Google-Fitbit merger without human rights remedies, says Amnesty to EU
During Google’s attempted merger with Fitbit, the NGO Amnesty International warned the EU’s competition regulators that such a move would be detrimental to privacy. Given Google’s history of malpractice with user data, and a monopoly position that lets it mine data from several different avenues of a user’s life, adding wearable health tech to the equation puts users’ privacy and rights at risk. The article calls for scrutiny of the “surveillance capitalism” practiced by tech giants.
When considering how companies and advertisers may use them, what sorts of personal statistics related to health and well-being should and should not be collected by mobile computing devices? How can a device originally built to stand alone as a single technological artifact become more convenient, or more harmful, to its user once it becomes part of a larger technological architecture?
-
- 3 min
- Tech Crunch
- 2020
What will tomorrow’s tech look like? Ask someone who can’t see.
This narrative explains how the push for technology that improves accessibility for disabled groups, especially blind or visually impaired individuals, has spurred scientific innovation that benefits everyone.
What are the benefits of developing technologies and innovations that aim to solve a specific problem? How might this lead to unprecedented positive innovations? How can accessibility become a priority in tech development, and be adequately incentivized, rather than being crowded out by other priorities such as profit?
-
- 10 min
- New York Times
- 2019
As Cameras Track Detroit’s Residents, a Debate Ensues Over Racial Bias
Racially biased facial recognition software used for government civil surveillance in Detroit diminishes the agency of minority groups and amplifies latent human bias.
What are the consequences of employing biased technologies to surveil citizens? Who loses agency, and who gains agency?