All Narratives (328)

Find narratives by ethical themes or by technologies.

Filters
Themes
  • Privacy
  • Accountability
  • Transparency and Explainability
  • Human Control of Technology
  • Professional Responsibility
  • Promotion of Human Values
  • Fairness and Non-discrimination
Technologies
  • AI
  • Big Data
  • Bioinformatics
  • Blockchain
  • Immersive Technology
Additional Filters:
  • Media Type
  • Availability
  • Year
    • 1916 - 1966
    • 1968 - 2018
    • 2019 - 2069
  • Duration
  • Premium Beat
  • 2020
  • 5 min
Is Deepfake Technology the Future of the Film Industry?

This blog post explores what a combination of deepfake and computer-generated imagery (CGI) technologies might mean for filmmakers.

  • MIT Technology Review
  • 2020
  • 5 min
Inside the strange new world of being a deepfake actor

This article details the reactions to the deepfake documentary In Event of Moon Disaster.

  • Scientific American
  • 2020
  • 33 min
To Make a DeepFake

Produced in conjunction with Scientific American, this thirty-minute documentary brings the film In Event of Moon Disaster to a group of experts on AI, digital privacy, law, and human rights to gauge their reactions to the film and to provide context on this new technology: its perils, its potential, and the possibilities of a brave new digital world, where every pixel that moves past our collective eyes is potentially up for grabs.

  • MIT Media Lab
  • 2021
  • 10 min
Detect DeepFakes

This is an MIT research project; all data is collected anonymously for research purposes.
The project shows you a variety of media snippets, including transcripts, audio files, and videos. Sometimes subtitles are included; sometimes the video is silent. You can watch the videos as many times as you would like. The site will ask you to share how confident you are that the individual really said what is shown. If you have seen a video before, please select the checkbox that says “I’ve already seen this video.” And remember, half of the media snippets presented are statements that the individual actually said. Read more about this project and the dataset used to produce this research on the About page:
https://detectfakes.media.mit.edu/about

  • Danielle Citron
  • 2019
  • 13 min
How deepfakes undermine truth and threaten democracy

The use of deepfake technology to manipulate video and audio for malicious purposes — whether it’s to stoke violence or defame politicians and journalists — is becoming a real threat. As these tools become more accessible and their products more realistic, how will they shape what we believe about the world? In a portentous talk, law professor Danielle Citron reveals how deepfakes magnify our distrust — and suggests approaches to safeguarding the truth.
Discussion Questions:

What are some of the possible uses for video and audio deepfakes?
What is trust? How do we normally verify information we receive?
How does this type of technology erode trust in existing systems of accountability in society?
Can you think of any possible positive uses of this technology?
