Machine Learning (83)
Find narratives by ethical themes or by technologies.
-
- 5 min
- MIT Technology Review
- 2019
This is how AI bias really happens—and why it’s so hard to fix
Introduction to how bias is introduced into algorithms during the data preparation stage, which involves selecting which attributes you want the algorithm to consider (a toy code sketch of this step follows the discussion questions below). Underlines how difficult it is to mitigate bias in machine learning, given that algorithms are not always perfectly attuned to human social contexts.
How can the “portability trap” described in the article be avoided? Who should be involved in making decisions about framing the problems that AI systems are meant to solve?
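The following is a minimal, hypothetical sketch (synthetic data, invented feature names, and a scikit-learn logistic regression; none of it comes from the article) of the data-preparation problem the article describes: dropping a protected attribute while keeping a correlated proxy lets historical bias pass straight through to the model's predictions.

```python
# Hypothetical sketch, not code from the article: synthetic data showing how
# an attribute-selection choice during data preparation can carry bias forward.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

protected = rng.integers(0, 2, n)             # protected attribute (group 0 / group 1)
zip_code = protected ^ (rng.random(n) < 0.1)  # proxy attribute, ~90% aligned with group
skill = rng.normal(0.0, 1.0, n)               # genuinely predictive feature

# Historical labels already favor group 0.
label = (skill + 0.8 * (1 - protected) + rng.normal(0.0, 1.0, n) > 0.5).astype(int)

# Data-preparation choice: drop `protected`, keep `zip_code`.
X = np.column_stack([skill, zip_code])
pred = LogisticRegression().fit(X, label).predict(X)

for g in (0, 1):
    print(f"group {g}: positive prediction rate = {pred[protected == g].mean():.2f}")
# The gap between groups persists because the proxy encodes the dropped attribute.
```

In this toy setup the disparity survives the removal of the protected attribute because the proxy encodes nearly the same information, which is one reason bias introduced at the data-preparation stage is hard to fix.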
-
- 5 min
- Wired
- 2019
This dating app exposes the monstrous bias of algorithms
Monster Match, a game funded by Mozilla, shows how dating app algorithms reinforce bias by combining personal and mass-aggregated data to systematically hide a vast number of profiles from users' view, effectively locking users into narrow preferences (a toy sketch of such a recommender follows the discussion questions below).
What are some implicit ways in which algorithms reinforce biases? Are machine learning algorithms equipped to handle the many confounding variables at play in something like dating preferences? Does online dating unquestionably give people more agency in finding a partner?
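Below is a toy, hypothetical sketch (invented data, not Monster Match's code) of a user-based collaborative filter, the kind of recommender the article describes: profiles a user has never seen are ranked, and effectively hidden, based on how other, similar users swiped.

```python
# Hypothetical sketch, not Monster Match itself: a naive user-based
# collaborative filter of the kind the article describes dating apps using.
import numpy as np

# Rows = users, columns = profiles; 1 = liked, -1 = passed, 0 = not yet shown.
ratings = np.array([
    [ 1,  1,  0,  0,  0],   # user 0
    [ 1,  1, -1,  1,  0],   # user 1: similar taste to user 0
    [-1, -1,  1,  0,  1],   # user 2: different taste
])

def recommend(user: int, k: int = 1) -> list[int]:
    """Rank unseen profiles by the aggregated swipes of similar users."""
    sims = ratings @ ratings[user]        # dot-product similarity to each user
    sims[user] = 0                        # ignore self
    scores = sims @ ratings               # weight other users' swipes by similarity
    unseen = np.where(ratings[user] == 0)[0]
    ranked = unseen[np.argsort(scores[unseen])[::-1]]
    return ranked[:k].tolist()

# Only profile 3 reaches user 0's feed; profiles 2 and 4 are effectively
# hidden because of how other users swiped on profiles user 0 never saw.
print(recommend(0, k=1))   # [3]
```

The point of the sketch is that user 0 never rated profiles 2 and 4 at all; the aggregated behavior of other users decides what user 0 is allowed to see.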
-
- 5 min
- Wall Street Journal
- 2019
Investors Urge AI Startups to Inject Early Dose of Ethics
Incorporating ethical practices and outside perspectives to prevent bias is beneficial for AI companies and becoming more popular, a trend that stems from the need for consistent human oversight of algorithms.
How do we build ethical guardrails around AI? How should tech companies approach gathering outside perspectives on their algorithms?
-
- 5 min
- Time Magazine
- 2017
The Police Are Using Computer Algorithms to Tell If You’re a Threat
Chicago police deploy an algorithm that calculates a “risk score” for individuals based on factors such as criminal history and age, with the aim of assessing risk and pre-emptively acting against it (a toy sketch of such a score follows the discussion questions below). However, these numbers are inherently linked to human bias in both their inputs and their outcomes, and could lead to unfair targeting of citizens, even as the score supposedly introduces objectivity to the system.
Is the police risk score system biased, and does it mitigate or amplify human bias? Is it plausible to use digital technology to eliminate bias from American policing, or is this impossible? What might this look like? Does reliance on numerical data give police and tech companies more power or less power?
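The sketch below is hypothetical (the weights and feature names are invented and are not the Chicago model): a simple weighted-sum risk score built from age and criminal-history counts, showing how arrest records that reflect policing intensity feed directly into an ostensibly objective number.

```python
# Hypothetical sketch, not the Chicago model: an invented weighted-sum risk
# score built from age and criminal-history counts.
from dataclasses import dataclass

@dataclass
class Person:
    age: int
    prior_arrests: int       # shaped by where policing is concentrated, not only by behavior
    prior_convictions: int

def risk_score(p: Person) -> float:
    # Illustrative weights only: younger age and longer records raise the score.
    score = 0.0
    score += max(0, 30 - p.age) * 1.5
    score += p.prior_arrests * 4.0
    score += p.prior_convictions * 6.0
    return score

# Two people of the same age with the same behavior; one lives in a heavily
# policed neighborhood and therefore has more recorded arrests.
lightly_policed = Person(age=22, prior_arrests=1, prior_convictions=0)
heavily_policed = Person(age=22, prior_arrests=4, prior_convictions=0)

print(risk_score(lightly_policed))   # 16.0
print(risk_score(heavily_policed))   # 28.0
```

Two people with identical behavior end up with very different scores because one of the inputs, recorded arrests, is itself a product of past policing decisions.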
-
- 7 min
- Vice
- 2019
This Horrifying App Undresses a Photo of Any Woman With a Single Click
A programmer creates an application that uses neural networks to remove clothing from images of women. Deepfake technology is being used against women systematically, despite the continued narrative that its use in the political realm is the most pressing issue.
How does technology enable violations of sexual privacy? Who should regulate this technology, and how?
-
- 10 min
- Quartz
- 2019
China embraces its surveillance state. The US pretends it doesn’t have one
A comparison of surveillance systems in China and the US that target ethnic minorities and aid in their persecution. Data on targeted people is tracked extensively and compiled into intuitive databases that can be abused by government organizations.
In what ways are the surveillance systems of the US and China similar? Should big tech companies be allowed to contract with the government on the scale that a company like Palantir did?