Facial recognition technologies are powerful tools that have come to dominate many applications, often serving as 'gatekeepers' for access to services, social spaces, and security protocols. They are embedded in many societal surveillance programs (both private corporate and governmental) without general public knowledge. Only recently have legal and ethical guidelines begun to address how these systems can be used responsibly, how their training datasets are collected and assessed for quality, and how their biases may harm particular communities or the larger public. Computer scientists and product development teams should be aware of their responsibility to ensure fairness, accountability, and transparency when creating applications that use facial recognition technologies.
The goal is to gain an overview of facial recognition technology and to consider multiple perspectives on the potential social and ethical issues surrounding its use.
In this module we will read about different perspectives on facial recognition through text narratives and participate in an activity simulating the implementation of facial recognition on a college campus.
Please complete the following before we meet:
-
Answer this short concept check, which takes approximately 10 minutes. Then, read and watch the following narratives:
-
- 5 min
- CNN
- 2010
Why face recognition isn’t scary — yet
Algorithms and machines can struggle with facial recognition and need ideal source images to perform it consistently. However, its potential use in monitoring and identifying citizens is concerning.
Discussion Prompts: How have the worries regarding facial recognition changed since 2010? Can we teach machines to identify human faces? How can facial recognition pose a danger when used for governmental purposes?
-
- 3 min
- CNBC
- 2013
How Facial Recognition Technology Could Help Catch Criminals
Facial recognition software, which uses computer vision and biometric technology on an image of a person to identify them, has potential applications in law enforcement to help catch suspects or criminals. However, probability comes into play, especially as the captured photos or videos become blurrier and need an additional layer of software analysis to be “de-pixelized.” Identification also depends on the databases to which the FBI has access. (A toy illustration of this probabilistic aspect follows below.)
Discussion Prompts: How should law enforcement balance training these facial recognition programs on large amounts of quality data against breaching privacy by accessing more databases of citizens’ faces? Where can human bias enter the human-computer systems described in the article? Should there be any margin of error or element of probability in technologies that operate in high-stakes areas like law enforcement?
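To make the probabilistic aspect concrete, here is a small, purely illustrative simulation, not based on any real FBI or law-enforcement system: as images degrade, simulated similarity scores for true matches drift toward a decision threshold, so fewer identifications clear the bar with confidence and more candidates require human review. The threshold, score model, and blur effect are all assumptions made up for this sketch.

```python
# Purely illustrative simulation: how image degradation pushes face-match
# similarity scores toward a decision threshold, leaving more cases uncertain.
# The threshold, score model, and blur effect are assumptions, not real data.
import numpy as np

rng = np.random.default_rng(1)
THRESHOLD = 0.75  # hypothetical "declare a match" cutoff

def simulated_match_scores(n_pairs: int, blur_level: float) -> np.ndarray:
    """Pretend similarity scores for pairs known to show the same person:
    clean images score high; heavier blur drags scores down and spreads them out."""
    scores = rng.normal(loc=0.95 - 0.3 * blur_level,
                        scale=0.05 + 0.10 * blur_level,
                        size=n_pairs)
    return np.clip(scores, 0.0, 1.0)

for blur in (0.0, 0.5, 1.0):  # 0 = sharp still, 1 = heavily pixelated frame
    scores = simulated_match_scores(10_000, blur)
    confident = np.mean(scores >= THRESHOLD)
    print(f"blur={blur:.1f}: {confident:.0%} of true matches clear the threshold")
```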
-
- 8 min
- Kinolab
- 2016
Lacie Part I: Translating Online Interactions and Social Quantification
In a world in which social media is constantly visible, and in which each person’s averaged five-star rating, based on every one of their interactions with others, is displayed, Lacie tries to move into the higher echelons of society. She does this by consistently keeping up saccharine appearances in real life and on her social media feed, because everyone is constantly connected to this technology. Once she is spurred to raise her rating, Lacie gets an invite to a high-profile wedding. However, after a few unfortunate events leave her seeming less desirable to others, thus lowering her rating, she finds her world far less accessible and kind. For further reading and real-life connections, see the narrative “Inside China’s Vast New Experiment in Social Ranking.”
Discussion Prompts: How do digital platforms promote inauthenticity? Why do appearances matter more in the digital age? Can digital technologies ever truly mirror an in-person interaction? Do the shallower ways in which people communicate online translate well into the real world? How could digital social platforms do better at promoting lasting connection instead of the instant gratification of likes or ratings? Should social media platforms be so focused on quantifying interactions, in terms of likes or comments or followers? How can this quantification be de-emphasized?
-
In five groups, please read the following narratives:
-
Group 1:
-
- 2 min
- azfamily.com
- 2018
Facial recognition technology now used in Phoenix area to locate lost dogs
Facial recognition technology has found a new application: reuniting dogs with their owners. A simple machine learning algorithm takes a photo of a dog and searches a database of photos of dogs in shelters in hopes of finding a match. (A minimal sketch of this kind of matching follows below.)
Discussion Prompts: How could this beneficial use of recognition technology find even broader use?
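As a rough illustration of what “finding a match” might involve under the hood, the hypothetical sketch below represents each photo as a feature vector and returns the closest shelter photo within a distance threshold. The embed() function is a crude stand-in for a trained model, and the data, names, and threshold are assumptions for illustration, not the system described in the article.

```python
# Hypothetical sketch of photo matching via image features and nearest-neighbor
# search. embed() is a crude stand-in for a trained feature extractor (e.g., a CNN)
# so the example runs end to end; it is not the system described in the article.
import numpy as np

def embed(image: np.ndarray) -> np.ndarray:
    """Summarize an image as a small feature vector (stand-in for a learned model)."""
    flat = image.astype(np.float32).ravel()
    return np.array([flat.mean(), flat.std(), flat.min(), flat.max()])

def best_match(query: np.ndarray, shelter_photos: list, max_distance: float = 15.0):
    """Return (index, distance) of the closest shelter photo, or None if no photo
    is within max_distance of the query in feature space."""
    q = embed(query)
    dists = [float(np.linalg.norm(q - embed(p))) for p in shelter_photos]
    idx = int(np.argmin(dists))
    return (idx, dists[idx]) if dists[idx] <= max_distance else None

# Toy usage with synthetic images; a real system would load actual shelter photos.
rng = np.random.default_rng(0)
shelter = [rng.integers(40 * i, 40 * i + 60, size=(64, 64, 3)) for i in range(5)]
lost_dog = shelter[2] + rng.integers(-5, 6, size=(64, 64, 3))  # noisy copy of photo 2
print(best_match(lost_dog, shelter))  # expected: photo index 2, within the threshold
```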
-
- 5 min
- Silicon Angle
- 2019
Empathic AI mirrors human emotions to help autistic children
Artificial companions assist developmentally disabled children based on the principle that humans can form emotional connections with nonhuman objects. In fact, it is not exceedingly difficult for robots to read or mirror human emotions, which could have positive implications in workplace or educational settings.
Discussion Prompts: Is it possible to develop an emotional connection with a robotic companion? How can robotic companionship improve behavior? How does it compare to human companionship?
-
- 5 min
- BBC
- 2021
Facial recognition technology meant mum saw dying son
Facial recognition technology used by the South Wales Police can identify an individual from biometric data nearly instantly, rather than in the previous standard of 10 days, which allowed a mother to say goodbye to her son on his deathbed. It appears to have other positive impacts, such as identifying criminals earlier than they otherwise would be. However, as is usually the case, concerns abound about how this facial recognition technology can violate human rights.
Discussion Prompts: Who can be trusted with facial recognition algorithms that give someone several possibilities for the identity of a particular face? Who can be trusted to decide in which cases this technology can be deployed? How can bias become problematic when a human is selecting one of many faces recommended by the algorithm? Should the idea of constant surveillance or omnipresent cameras make us feel safe or concerned?
-
-
Group 2:
-
- 3 min
- techviral
- 2018
New Facial Recognition System Helps Trace 3000 Missing Children In Just 4 Days
In India, where the disappearance of children is a widespread social issue, facial recognition technology has been useful in identifying and locating many missing or displaced children. This breakthrough means the technology can hopefully be applied to help ameliorate this issue, as well as in other areas such as law enforcement.
Discussion Prompts: In what ways does this specific technology serve the common good in India? What are the concerns about the privacy of the children involved, and are they outweighed by the value of safety? To what degree does facial recognition technology actually help solve this problem in general?
-
- 5 min
- The New York Times
- 2019
How Biometrics Makes You Safer
In New York City, biometrics were used as one step in the investigation process, combined with human oversight, to help identify criminals and victims alike.
Discussion Prompts: How does facial recognition technology facilitate challenging investigations? Do you believe police use of facial recognition is as transparent and pure as this article makes it seem? Where could bias enter this system of using facial recognition technology?
-
- 5 min
- New York Times
- 2020
A Case for Facial Recognition
Decisions on whether law enforcement should be trusted with facial recognition are tricky, as Detroit city official James Tate argues. On one hand, the combination of bias latent in the technology itself and the human bias of those who use it sometimes leads to over-policing of certain communities. On the other hand, with the correct guardrails, it can be an effective tool for obtaining justice in cases of violent crime. This article details the ongoing debate about how much facial recognition technology use is proper in Detroit.
Discussion Prompts: Who should decide on the guardrails surrounding the use of facial recognition technology? How can citizens have more control over when their face is being recorded or captured? Can there ever be enough guardrails to truly ensure that facial recognition technology is used with no chance of bias?
-
-
Group 3:
-
- 7 min
- Slate
- 2019
Facebook’s Face-ID Database Could Be the Biggest in the World. Yes, It Should Worry Us.
A discussion of Facebook’s massive collection of human faces and its potential impact on society.
Discussion Prompts: Is Facebook’s facial recognition database benign, or a slow-bubbling volcano?
-
- 40 min
- New York Times Magazine
- 2021
Your Face Is Not Your Own
This article goes into extraordinary detail on Clearview AI, a company whose algorithm has crawled the public web to assemble over 3 billion photos of faces, each linked back to its original source. It discusses the legality and privacy concerns surrounding this technology, how it has already been used by law enforcement and in court cases, and the founding of the company. Private use of technology similar to Clearview AI’s could revolutionize society and may move us into a post-privacy era.
Discussion Prompts: Should companies like Clearview AI exist? How might facial recognition be misused by both authorities and the general public if it were to permeate all aspects of life?
-
- 40 min
- New York Times
- 2021
She’s Taking Jeff Bezos to Task
As facial recognition technology becomes more prominent in everyday life, used by law enforcement officials and private actors to identify faces by comparing them against databases, AI ethicists and experts such as Joy Buolamwini push back against the many forms of bias these technologies exhibit, specifically racial and gender bias. Governments often use such technologies callously or irresponsibly, and the lack of regulation of the private companies that sell these products could lead society into a post-privacy era.
Discussion Prompts: Do you envision an FDA-style approach to technology regulation, particularly for facial recognition, being effective? Can large tech companies be incentivized to make truly ethical decisions about how their technology is created or deployed as long as the profit motive exists? What would this look like? What changes to the technology workforce, such as who designs software products or who chooses data sets, need to be made for technology’s impact to become more equitable across populations?
-
-
Group 4:
-
- 5 min
- CNET
- 2019
Demonstrators scan public faces in DC to show lack of facial recognition laws
Fight for the Future, a digital activist group, used Amazon’s Rekognition facial recognition software to scan faces on the street in Washington, DC, to argue that there should be more guardrails on this type of technology before it is deployed for ends that violate human rights, such as identifying peaceful protestors.
Discussion Prompts: Does this kind of stunt seem effective at drawing public attention to the ways facial recognition can be misused? How? Who decides what is a “positive” use of facial recognition technology, and how can these use cases be negotiated with citizens who want their privacy protected?
-
- 7 min
- Amnesty International
- 2021
Amnesty International Calls for Ban on the Use of Facial Recognition Technology for Mass Surveillance
Amnesty International released a statement detailing its opposition to the widespread use of facial recognition technology for mass surveillance, based on its misuse, its unfair impact on Black communities, and the chilling effect it would have on peaceful protest.
Discussion Prompts: Is more accurate facial recognition technology a good thing or a bad thing? How could FRT be weaponized to justify policing policies that are already unfair toward Black communities? Why is anonymity important, both in protest scenarios and elsewhere? Can anyone be anonymous in the age of digital technology? What amount of anonymity is appropriate?
-
- 7 min
- Slate
- 2021
Maine Now Has the Toughest Facial Recognition Restrictions in the U.S.
A new law passed unanimously in Maine heavily restricts the contexts in which facial recognition technology can be deployed, putting significant guardrails around its use by law enforcement and allowing citizens to sue if they believe the technology has been misused. This is a unique step at a time when other levels of government, up to the federal government, have been reluctant to attach strict rules to the use of facial recognition technology, despite the clear bias seen in the wake of its use.
Discussion Prompts: How can tech companies do more to lobby for stricter facial recognition regulation? Is a moratorium on facial recognition use by all levels of government the best plan? Why or why not? Does creating “more diverse datasets” truly solve all the problems of bias with the technology?
-
-
Group 5:
-
- 5 min
- The Guardian
- 2019
New York tenants fight as landlords embrace biometric cameras
Biometric technology will be implemented as a means of gaining access to a residential building in Brooklyn, causing pushback among tenants who prefer to keep their data private, especially given the lack of legal regulation surrounding the technology. Specifically, there is growing fear that the facial recognition database could be sold to or abused by law enforcement.
Discussion Prompts: How have biometrics changed the landscape and ideology of the security industry? How does this story fit in with other information or narratives you have read about the use of facial recognition?
-
- 10 min
- New York Times
- 2019
As Cameras Track Detroit’s Residents, a Debate Ensues Over Racial Bias
This article examines racial bias in the facial recognition software used in Detroit’s government civil surveillance program. Racially biased technology diminishes the agency of minority groups and amplifies latent human bias.
Discussion Prompts: What are the consequences of employing biased technologies to surveil citizens? Who loses agency, and who gains agency?
-
- 5 min
- Gizmodo
- 2021
CBP Facial Recognition Scanners Failed to Find a Single Imposter At Airports in 2020
Customs and Border Protection used facial recognition technology to scan travelers entering the U.S. at several points of entry in 2020 and did not identify a single impostor. This is part of a larger program of using biometrics to screen those who enter the country, which raises concerns about data privacy, who may have access to this data, and how it may be used.
Discussion Prompts: What bad outcomes could result from the government holding extensive biometric data, including facial scans, on many people who try to enter the country? Why does the government get away with using biased technology to conduct facial scans at airports, for example? Are “facilitation improvements” worth pursuing if they mean using technologies that are not 100% effective and will disproportionately harm certain populations?
-
In two groups, please participate in the taskforce simulation:
-
You will now participate in a taskforce simulation. As a group, list all of the benefits and negative consequences you can think of if a college were to implement a facial recognition system to manage its campus life programs. Consider these potential benefits and harms from the perspectives of students, faculty, administrators, alumni, and parents.
Things to talk about in your group:
-
What kinds of values and ethical issues were reflected in the narratives that you read in your small group?
-
How did the narratives you read earlier change or shape your feelings about implementing facial recognition in a college/university setting?
-
What recommendations should we provide, considering our role as computer science students? What resources are available to us to consult as part of the expert review process?
For example:
(Hint: see Post-Activity for more of these resources.)
-
Please complete the following after we meet:
-
Choose one of the following narratives to read. Ask yourself while you are reading:
-
How do these guidelines reflect current values and concerns we have read/talked about so far?
-
Do they talk about whose responsibility it is to make sure apps using facial recognition technologies are created and used in an ethical manner?
-
What is the motivation (explicit/implicit) for this narrative and who is its target audience?
Narratives:
- Amazon’s Guidelines on Facial Recognition Technologies (2020)
- Google: Our Approach to Facial Recognition Technologies (2020)
- Association for Computing Machinery, U.S. Technology Policy Committee (USTPC): Principles and Prerequisites for the Development, Evaluation and Use of Unbiased Facial Recognition Technologies (2020)
- National Telecommunications and Information Administration: An ethical framework for facial recognition (2020)
- The Library of the Future: Facial Recognition (2019)
-
-
Please complete this short concept check. Yes, these are the same questions as the ones you answered earlier.
-
As computer science students, what is our role, and what are our guiding responsibilities, in developing ethical technologies?
Here are some resources that should be guiding any product development lifecycle (design, development, deployment, evaluation).
-
The goal of this module is to provide students with an overview of facial recognition technology and the ethical issues surrounding its use. The module is designed for introductory to intermediate CS courses (i.e., Intro to CS through Algorithms). It follows the CEN format, asking students to consider their preexisting ideas and knowledge about the technology (Pre-Activity); consider the history of who created the technology, its original/intended purpose, and the impact of its widespread applications (Activity 1); reflect on society-level impact from multiple perspectives (technology creator, campus community member, person in society) (Activity 2); and assess existing ethical guidelines from technology organizations (Post-Activity).
-
Students will be able to identify a technology, who designed it, and the purpose it was designed for.
-
Students will be able to identify who the technology was not intended for and who it potentially may harm.
-
Students will be able to articulate how the technology has changed over time and the different purposes it now serves.
-
Students will be able to articulate and discuss possible benefits (such as increased security and ease of identification) and issues (such as harvesting of personal information) with an increase in facial recognition technology.
We have provided examples of assessments you may want to copy and use on your own LMS for secure student data collection. The rationale, format, and time length for the module components are listed below.
Pre-Activity is an online pre-module assignment asking students to respond to a series of questions about what they know about this technology and to view the provided narrative links.
Activity 1 is a class activity (0.5 hours) that places students into small groups to read and discuss a set of provided narratives to inform them about certain perspectives on facial recognition technology.
Activity 2 is a class activity (1.5 hours) that provides a brief history of facial recognition technology and then divides the class into two groups (n ≈ 24, or more with a larger or online class) to evaluate the positive and negative impacts of facial recognition technology as it would be applied to their college campus community and services. Groups are given specific roles and perspectives to portray as part of the role-play activity. Homework is assigned to provide additional narratives to consider after the class activity.
Post-Activity is an online post-module assignment that asks students to respond again to a series of questions about what they now know about this technology based on the narratives they viewed or read and the Activity 1 or 2 discussions.
-
- 3 min
- Vimeo: Shalini Kantayya
- 2020
Coded Bias: How Ignorance Enters Computer Vision
A brief visual example of an application of computer vision for facial recognition, how these algorithms can be trained to recognize faces, and the dangers that come with biased data sets, such as a disproportionate number of white men.
Discussion Prompts: When thinking about computer vision in relation to projects such as the Aspire Mirror, what sorts of individual and systemic consequences arise for those whose faces biased computer vision programs do not easily recognize?
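To connect the clip’s point about biased data sets to something measurable, here is a hypothetical auditing sketch (not taken from the film): given verification results on pairs of photos known to show the same person, it compares the rate of false non-matches across demographic groups. The group labels, error rates, and data below are invented purely for illustration.

```python
# Hypothetical auditing sketch: compare false non-match rates across demographic
# groups. Groups, error rates, and data below are invented for illustration only.
import random
from collections import defaultdict

def false_non_match_rates(results):
    """results: iterable of (group, predicted_same) for photo pairs that are known
    to show the SAME person. Returns the per-group rate at which the model wrongly
    predicted 'different person' (the false non-match rate)."""
    errors, totals = defaultdict(int), defaultdict(int)
    for group, predicted_same in results:
        totals[group] += 1
        if not predicted_same:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Toy usage with synthetic outcomes that mimic the kind of disparity produced by
# training on unrepresentative data (the numbers are illustrative, not measured).
random.seed(0)
toy_results = (
    [("group A", random.random() > 0.02) for _ in range(1000)]
    + [("group B", random.random() > 0.20) for _ in range(1000)]
)
print(false_non_match_rates(toy_results))  # group B fails far more often
```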