Fairness and Non-discrimination (56)

Find narratives by ethical themes or by technologies.

Quantifying Workers

  • Cornell Tech
  • 2019
  • 27 min

Podcast about the quantification of workers in areas such as hiring and productivity. It dives into why we should try to make algorithms fair, and warns specifically that algorithms can find “proxy variables” that approximate attributes such as race or gender even when the algorithm has supposedly been controlled for those factors.
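
To make the proxy-variable warning concrete, here is a minimal, hypothetical sketch using synthetic data and scikit-learn (the feature names and data are invented for illustration, not taken from the episode): the protected attribute is excluded from training, yet a correlated proxy feature lets the model reproduce the historical bias.

    # Illustrative sketch with invented, synthetic data: even after dropping the
    # protected attribute, a correlated "proxy" feature lets the model recover it.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 10_000

    # Protected attribute (0/1), never shown to the model.
    group = rng.integers(0, 2, n)

    # Proxy feature: strongly correlated with group membership
    # (e.g., a hobby, zip code, or school that skews by group).
    proxy = (group + rng.normal(0, 0.4, n) > 0.5).astype(float)

    # A genuinely job-relevant feature, independent of group.
    skill = rng.normal(0, 1, n)

    # Historical hiring decisions were biased toward group == 1.
    hired = (0.8 * skill + 1.5 * group + rng.normal(0, 1, n) > 1.0).astype(int)

    # Train only on the "neutral" columns -- the protected attribute is excluded.
    X = np.column_stack([skill, proxy])
    model = LogisticRegression().fit(X, hired)

    # Predicted hiring rates still differ sharply by group,
    # because the proxy feature stands in for it.
    pred = model.predict(X)
    print("predicted hire rate, group 0:", pred[group == 0].mean())
    print("predicted hire rate, group 1:", pred[group == 1].mean())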

Algorithms in the Courtroom

  • Cornell Tech
  • 2019
  • 28 min

Pre-trial risk assessment is presented as part of an answer to mass incarceration, but the data often answers a different question than the one being asked: it measures how risky defendants appeared before incarceration, not how dangerous they turn out to be afterward. Technologies and algorithms deployed amid social power differentials can be abused to compound injustice against people accused of a crime. Numbers are not neutral and can even act as a “moral anesthetic,” especially when the sampled data contains confounding variables that collectors ignore. Engineers do not always recognize the ethical stakes when making design decisions that are, in effect, political.

Teaching Ethics in Data Science

  • Cornell Tech
  • 2019
  • 27 min

Solon Barocas discusses his relatively new course on ethics in data science, part of a larger trend toward developing ethical sensibility in the field. He shares ideas such as spreading ethics lessons across courses, promoting dialogue, and ensuring students genuinely analyze problems while learning to stand up for what is right. The episode offers a case study in this kind of ethical sensibility through the questions raised by predictive policing algorithms.

Justice in the Age of Big Data

  • TED
  • 2017
  • 7 min

Predictive policing software such as PredPol may claim objectivity through mathematical, “colorblind” analysis of geographic crime data, yet this supposed objectivity is not free of human bias; in practice it serves to justify further targeting of oppressed groups such as poor communities and racial and ethnic minorities. The talk also weighs fairness against efficacy in the justice system, since algorithms tend to favor the latter over the former.

Survival of the Best Fit

  • Survival of the Best Fit
  • 2018
  • 10 min

Explores bias in AI-driven hiring through a game in which you play the hiring manager.

Police across the US are training crime-predicting AIs on falsified data

  • MIT Technology Review
  • 2019
  • 5 min

In New Orleans and other cities, the data used to train predictive crime algorithms was inconsistent and “dirty” to begin with, so the resulting predictions disproportionately target disadvantaged communities.
