AI (124)
Find narratives by ethical themes or by technologies.
- 5 min
- Wired
- 2015
Siri and Cortana Sound Like Ladies Because of Sexism
Gender bias is often consciously or subconsciously embedded in the design of virtual voice assistants, with little consideration of the science surrounding linguistics and gender.
What are the consequences of not addressing such gender bias as virtual voice assistants become more and more “human”? How has the profit motive played a role in this type of gender bias?
- 7 min
- Vice
- 2021
How Musicians and Sex Workers Beat Facial Recognition in New Orleans
In New Orleans, a city known for its history of racist policing, grassroots activists turned to precedent from other states to ban police use of surveillance and facial recognition technology through both public and private cameras.
What responsibility do firms like Palantir have to make sure that their technology is used for undeniable good? Can cities like Oakland or New Orleans become the norm in terms of privacy from facial recognition while such firms exist?
- 5 min
- The Guardian
- 2021
Amazon’s Ring is the largest civilian surveillance network the US has ever seen
Amazon’s Ring devices are creating a private network of video surveillance that can be accessed by governments and other public entities without a warrant.
How might home security devices impact citizenship? What are the risks of a ubiquitous deployment of home surveillance systems? How does this narrative demonstrate the compounding of human and machine biases?
- 7 min
- The New York Times
- 2019
Stanford Team Aims at Alexa and Siri With a Privacy-Minded Alternative
A Stanford team develops a neutral, “Switzerland-like” alternative to systems that use human language to control computers, smartphones, and internet devices in homes and offices. Known as Almond, the software is meant to be free to use, with a focus on protecting user privacy and enabling greater understanding of natural language.
Had you heard of Almond before reading this narrative? If not, why do you think this was the case? Why might people be more willing to use the less private, corporate voice assistants than a more obscure, decentralized assistant?
- 7 min
- The New York Times
- 2019
ICE Used Facial Recognition to Mine State Driver’s License Databases
ICE, along with other law enforcement agencies, mined state driver’s license databases using facial recognition tech to track down undocumented immigrants and prosecute more cases.
What responsibility do DMVs across the country have to protect the privacy of citizens? What levels of bias (human and machine) are discussed in this story? Given that, can AI ever be unbiased in both functionality and use?
- 10 min
- The New York Times
- 2019
Facial Recognition Tech Is Growing Stronger, Thanks to Your Face
Databases of people’s faces are being compiled without their knowledge by companies and researchers (including social media companies or dating sites), with many shared around the world and fueling the advancement of facial recognition technology.
How comfortable would you feel knowing that your face is in various databases and is, in some cases, being used to fuel companies’ machine learning algorithms? As of right now, Google and Facebook, which are said to have the largest facial databases of all, do not share their information, but might they? And what would happen if they did?