AI (124)
Find narratives by ethical themes or by technologies.
- 7 min
- TED
- 2017
Justice in the Age of Big Data
Predictive policing software such as PredPol may claim to be objective through mathematical, “colorblind” analyses of geographical crime areas, yet this supposed objectivity is not free of human bias and is in fact used as a justification for the further targeting of oppressed groups, such as poor communities or racial and ethnic minorities. Further, the balance between fairness and efficacy in the justice system must be considered, since algorithms tend more toward the latter than the former.
Should we leave policing to algorithms? Can any “perfect” algorithm for policing be created? How can police departments and software companies be held accountable for masquerading bias as the objectivity of an algorithm?
-
- 5 min
- The New York Times
- 2019
How Biometrics Makes You Safer
In New York City, biometrics are used as one step in the investigation process, combined with human oversight, to help identify criminals and victims alike.
How does facial recognition technology facilitate challenging investigations? Do you believe police use of facial recognition is as transparent and pure as this article makes it seem? Where could bias enter this system of using facial recognition technology?
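As a rough illustration of how such a system might narrow down leads before handing them to a human investigator, here is a minimal sketch of embedding-based face matching. The embeddings, gallery names, and threshold are all invented for illustration; real systems use deep networks to produce the face descriptors, and (as in the article) a candidate match is a lead for human review, not a conviction.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(probe, gallery, threshold=0.8):
    """Return (name, score) of the best gallery match,
    or (None, score) if no candidate clears the threshold.
    Below-threshold cases are left entirely to human investigators."""
    best_name, best_score = None, 0.0
    for name, emb in gallery.items():
        score = cosine_similarity(probe, emb)
        if score > best_score:
            best_name, best_score = name, score
    if best_score < threshold:
        return None, best_score
    return best_name, best_score

# Toy 4-dimensional "embeddings" standing in for real face descriptors.
gallery = {
    "suspect_a": np.array([0.9, 0.1, 0.0, 0.4]),
    "victim_b": np.array([0.1, 0.8, 0.5, 0.1]),
}
probe = np.array([0.88, 0.12, 0.02, 0.41])  # nearly identical to suspect_a

print(match_face(probe, gallery))
```

Note that the bias questions above apply directly here: the gallery contents and the threshold are human choices, and both shape who gets flagged.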
-
- 5 min
- Wired
- 2019
How Facial Recognition Is Fighting Child Sex Trafficking
Non-profit organizations such as Thorn and the Canadian Centre for Child Protection are using existing software, particularly facial recognition algorithms, to fight child pornography and human trafficking on the dark web more proactively.
How has technology facilitated underground illegal activities, such as child trafficking? How has technology also facilitated fighting back against them? What is your opinion on the debate on whether or not law enforcement should have extensive access to facial recognition technology or machine learning algorithms?
-
- 7 min
- MIT Technology Review
- 2019
Hackers Are the Real Obstacle for Self-Driving Vehicles
Autonomous vehicles could be subject to attacks using adversarial machine learning, possibly perpetrated by out-of-work truck and ride-share drivers. The fact that vehicle perception algorithms can already be tricked fairly easily also raises concerns.
Had you considered this obstacle to self-driving vehicles? How would this risk affect the business of self-driving vehicles? What are the consequences of companies not fully understanding the machine learning algorithms they use? Should we deploy self-driving vehicles while this threat stands?
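To make "adversarial machine learning" concrete, here is a toy sketch of the fast-gradient-sign idea against a hypothetical linear classifier. The weights and inputs are invented; real attacks target deep perception networks, but the principle is the same: a small, targeted perturbation of the input flips the prediction.

```python
import numpy as np

# Toy linear "stop sign vs. not a stop sign" classifier: score = w . x + b.
# (Invented weights; a real perception stack is a deep network.)
w = np.array([1.0, -2.0, 0.5, 1.5])
b = -0.2

def predict(x):
    """1 = 'stop sign', 0 = 'not a stop sign'."""
    return int(w @ x + b > 0)

x = np.array([0.6, 0.1, 0.3, 0.2])  # correctly classified as a stop sign

# Fast-gradient-sign-style attack: for a linear model, the gradient of the
# score with respect to the input is just w, so subtracting eps * sign(w)
# lowers the score as fast as possible per unit of perturbation.
eps = 0.3
x_adv = x - eps * np.sign(w)

print(predict(x), predict(x_adv))  # prints: 1 0
```

The point the article makes is that such perturbations can be physically realized (for example, as stickers on a road sign) without the vehicle's operators ever knowing why the model misfired.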
-
- 2 min
- Wired
- 2019
Artificial Intelligence Is Coming For Our Faces
Machine learning algorithms can now generate convincing synthetic human faces.
What are some consequences to AI being able to render fake yet believable human faces?
-
- 7 min
- Vice
- 2019
Academics Confirm Major Predictive Policing Algorithm is Fundamentally Flawed
An academic perspective on an algorithm created by PredPol to “predict crime.” Unless every single crime is reported, and unless police pursue all types of crimes committed by all people equally, it is impossible to have a reinforcement learning system that predicts crime itself. Rather, police find crimes in the same places they have been told to look for them, feeding the algorithm ineffective data and allowing unjust targeting of communities of color by the police to continue based on trust in the algorithm.
Can an algorithm which claims to predict crime ever be fair? Is it ever justified for volatile actors such as police to act based on directions from a machine, where the logic is not always transparent?
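The feedback loop the academics describe can be sketched in a few lines. In this toy simulation (all numbers invented, not PredPol's actual model), two districts have the same true crime rate, but one starts with more historical records; because patrols are allocated by past recorded crime and crimes are only recorded where patrols go, the initial bias is amplified rather than corrected.

```python
import random

random.seed(0)

TRUE_RATE = 0.3    # identical underlying crime probability in both districts
N_PATROLS = 100    # patrols dispatched per round
recorded = {"district_a": 10, "district_b": 1}  # seeded historical bias

for _ in range(50):
    total = sum(recorded.values())
    # Patrols proportional to past recorded crime (the "prediction").
    patrols = {d: round(N_PATROLS * c / total) for d, c in recorded.items()}
    for d, p in patrols.items():
        # A crime is only recorded where a patrol is present to see it.
        recorded[d] += sum(random.random() < TRUE_RATE for _ in range(p))

print(recorded)  # district_a's record dwarfs district_b's despite equal true rates
```

The system's output looks like an objective map of crime, but it is really a map of where police have looked, which is exactly the flaw the article identifies.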