Civil Surveillance (39)
Video and data surveillance by public and private entities.
Find narratives by ethical themes or by technologies.
- 7 min
- The Verge
- 2020
What a machine learning tool that turns Obama white can (and can’t) tell us about AI bias
PULSE is an algorithm that supposedly reconstructs what a face looks like from a pixelated image. The problem: more often than not, the algorithm returns a white face, even when the person in the pixelated photograph is a person of color. The algorithm works by generating a synthetic face that matches the pixel pattern, rather than actually sharpening the original image (a minimal code sketch of this idea follows the discussion questions). It is these synthetic faces that show a clear bias toward white people, demonstrating how institutional racism works its way deep into technological design. Diversifying data sets alone will not fully help until broader solutions for combating bias are enacted.
What potential harms could you see from the misapplication of the PULSE algorithm? What sorts of bias-mitigating solutions besides more diverse data sets could you envision? Based on this case study, what sorts of real-world applications should facial recognition technology be trusted with?
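The mechanism described above can be made concrete with a short sketch. This is a minimal illustration of the latent-space search idea behind PULSE, not the authors' implementation: the Generator class here is a hypothetical stand-in for a pretrained face GAN such as StyleGAN, and the loss only asks that the downscaled synthetic face match the pixelated input, which is why the result reflects whatever biases the generator's training data carried.

```python
# Minimal sketch of PULSE-style "upsampling by latent search" (assumes PyTorch).
# Generator is a hypothetical placeholder for a pretrained face GAN.
import torch
import torch.nn.functional as F


class Generator(torch.nn.Module):
    """Stand-in for a pretrained face generator: latent vector -> RGB image."""

    def __init__(self, latent_dim=512, out_size=64):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(latent_dim, 3 * out_size * out_size),
            torch.nn.Tanh(),
        )
        self.out_size = out_size

    def forward(self, z):
        return self.net(z).view(-1, 3, self.out_size, self.out_size)


def upsample_by_latent_search(lowres, generator, latent_dim=512, steps=200, lr=0.05):
    """Find a latent z so that downscaling generator(z) reproduces `lowres`."""
    z = torch.randn(1, latent_dim, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        candidate = generator(z)                        # synthetic high-res face
        downscaled = F.interpolate(
            candidate, size=lowres.shape[-2:], mode="bilinear", align_corners=False
        )
        loss = F.mse_loss(downscaled, lowres)           # match the pixel pattern only
        loss.backward()
        opt.step()
    return generator(z).detach()                        # a plausible face, not a faithful one


# Usage: the output is *a* face consistent with the blur, drawn from whatever
# distribution the generator learned -- which is where dataset bias enters.
lowres_face = torch.rand(1, 3, 16, 16)                  # stand-in pixelated input
restored = upsample_by_latent_search(lowres_face, Generator())
```

Because many different high-resolution faces downscale to the same pixelated input, the search simply picks one the generator finds likely; if the generator was trained mostly on white faces, that is what it will return.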
- 7 min
- New York Times
- 2018
Facial Recognition Is Accurate, if You’re a White Guy
This article details the research of Joy Buolamwini on racial bias coded into algorithms, specifically facial recognition programs. When auditing facial recognition software from several large companies, such as IBM and Face++, she found that the systems are far worse at correctly identifying darker-skinned faces (a toy version of such a disaggregated audit follows the discussion questions). Overall, this reveals that facial analysis and recognition programs need external systems of accountability.
What does external accountability for facial recognition software look like, and what should it look like? How and why does racial bias get coded into technology, whether explicitly or implicitly?
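As a rough illustration of what a disaggregated audit involves (not Buolamwini's actual Gender Shades pipeline), the sketch below computes accuracy separately for each demographic group from hypothetical labeled predictions; the point is that a single overall accuracy number can hide large gaps between groups.

```python
# Toy disaggregated audit: accuracy per demographic group (hypothetical data).
from collections import defaultdict


def accuracy_by_group(records):
    """records: iterable of (group_label, true_label, predicted_label) tuples."""
    totals, correct = defaultdict(int), defaultdict(int)
    for group, truth, pred in records:
        totals[group] += 1
        correct[group] += int(truth == pred)
    return {group: correct[group] / totals[group] for group in totals}


# Hypothetical audit records for a gender classifier, split by skin type.
sample = [
    ("lighter-skinned male", "male", "male"),
    ("lighter-skinned female", "female", "female"),
    ("darker-skinned female", "female", "male"),   # misclassification
    ("darker-skinned male", "male", "male"),
]
print(accuracy_by_group(sample))
```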
- 3 min
- MacRumors
- 2021
Facebook Weighing Up Legality of Facial Recognition in Upcoming Smart Glasses
Facebook’s collaboration with Ray-Ban on new “smart glasses” raises a host of questions about whether capabilities such as facial recognition should be built into the technology.
What are the “so clear” benefits and risks of having facial recognition algorithms built into smart glasses, in your view? What are the problems with “transparent technology” such as smart glasses, where other citizens may not even know that they are being surveilled?
- 3 min
- techviral
- 2018
New Facial Recognition System Helps Trace 3000 Missing Children In Just 4 Days
In India, where the disappearance of children is a widespread social problem, facial recognition technology has proven useful in identifying and locating many missing or displaced children. This breakthrough suggests the technology can be applied further to help ameliorate this issue, as well as in other areas such as law enforcement.
In what ways does this specific technology serve the common good in India? What are the concerns about the privacy of the children involved, and are they outweighed by the value of safety? To what degree does facial recognition technology actually help solve this problem in general?
- 7 min
- Kinolab
- 2008
Selling Digitized Memories
Under threat of eviction, Luz must find a quick way to make some money to pay rent. Thankfully, through the company TruNode, she can digitize her memories and sell them on the internet to anyone who may wish to access and stream them. While this seems convenient, the downsides become clear when the repository of her memories is used to help ruthless drone pilot Rudy Ramirez hunt down Memo, an innocent laborer who has been branded a dangerous criminal. After Luz reveals this means of making money to Memo, he is less than enthused with the system.
How can the high cost of very personal data and digital memories be both empowering in the right circumstances and disempowering in the wrong ones? What if people were able to sell all of their personal data, as is shown here? Is the complete digitization of memory a positive concept or a negative one? How can data or memory be purchased for nefarious purposes? How can people be unintentionally harmed by this system? Can the emotions of memories ever be paired well with a digital interface?
- 5 min
- Gizmodo
- 2021
CBP Facial Recognition Scanners Failed to Find a Single Imposter At Airports in 2020
Customs and Border Protection used facial recognition technology to scan travelers entering the U.S. at several ports of entry in 2020, and did not identify a single impostor or impersonator. This is part of a larger program of using biometrics to screen those who enter the country, which raises concerns about data privacy, who may have access to this data, and how it may be used.
What bad outcomes are possible from the government having extensive biometric data, including facial scans, on many people who try to enter the country? Why does the government get away with using biased technology to conduct facial scans at airports, for example? Are “facilitation improvements” worth aiming for if they mean using technologies that are not 100% effective and will disproportionately harm certain populations?