Mass manipulation of public opinion through social media and other digital media
Digital Media Misinformation (16)
Find narratives by ethical themes or by technologies.
The Toxic Potential of YouTube’s Feedback Loop
- 6 min
- Wired
- 2019
Harmful content spreads through YouTube’s AI recommendation algorithm, which helps create filter bubbles and echo chambers and leaves users with limited agency over the content they are exposed to.
How much agency do we have over the content we are shown in our digital artifacts? Who decides this? How skeptical should we be of recommender systems?
YouTube, the Great Radicalizer
- 7 min
- New York Times
- 2018
YouTube’s algorithm suggests increasingly radical content to its users in order to maximize the amount of time they spend on the platform. This tendency toward inflammatory recommendations often leads to political misinformation.
What are the dangers of being offered increasingly radical videos on YouTube?
Choose Your Own Fake News
- 15 min
- n/a
- 2018
A choose-your-own-adventure game in which you experience fake news and data fraud firsthand by acting in the position of one of a cast of characters.
How can you be less vulnerable to fake news and fake advertising online?
In Event of Moon Disaster
- 10 min
- n/a
- 2018
Misinformation techniques, including deepfake video, are used to create a film depicting an alternative history in which the Apollo 11 mission failed and the astronauts were stranded on the Moon.
How do you foresee this type of misinformation being related to conspiracy theories? Do you believe you could have spotted the deepfake were you not specifically looking out for it? Are we approaching a future where we may have to watch all media with such scrutiny?
Politics and Digital Mouthpieces
- 11 min
- Kinolab
- 2013
Waldo, a CGI bear, is animated by technology that captures a comedian’s facial expressions and renders them on screen in real time. Waldo can insult politicians with little retribution, perhaps in part because he does not appear human. Executives harness this power by entering Waldo as a candidate in a political race, where he takes part in a debate with real people and does not seem beholden to the same standards. Eventually, Waldo’s “driver” Jamie reveals his own identity, but Waldo continues on as a figure, embodying the voice of another worker at the company.
How do digital media, specifically social media platforms, allow critical and political voices to hide behind a wall of anonymity? Can digital abstractions of real political figures be considered to fully embody the person or candidate themselves, especially when other staffers usually run their accounts? How do digital platforms change the nature of relationships between politicians and citizens in terms of direct communication? Does hiding behind digital platforms make it easier for anyone to make bold claims and statements?
Social Media Trends and Hive Mind Justice
- 8 min
- Kinolab
- 2016
In this extreme imagination of social media, detectives Karin Parke and Blue Coulson try to uncover the connection between two recent deaths. They first interrogate a teacher who posted “#DeathTo @JoPowersWriter” along with a photo of controversial journalist Jo Powers the day before Jo was found dead. The teacher describes the popularity of this message and the hashtag, sharing that an entire online community split the cost of sending Jo a hateful message on a cake. Later, the detectives discover that the deaths were determined by bots tracking the trending #DeathTo hashtag, and that whoever’s name had the most hits under the hashtag was hunted down and killed by a mysterious force.
How does this relate to the phenomenon of “cancel culture” in the real world? How can buzzwords commonly used online translate poorly into real life? How can digital social media be re-imagined so that users are less susceptible to “trends” started by bots? Is there a possibility that social media gives too much power or too large a platform to the general population?