COVID-19 isn’t the only thing that spread like wildfire in 2020 — so did the conspiracy theories about it. Misleading claims that the virus was a political hoax or that the vaccines harbor microchips to allow the government to surveil its citizens blazed across social media. By December, Facebook, Twitter and YouTube had banned COVID-19 vaccine misinformation on their platforms. But this flare-up of falsehoods wasn’t just harmless noise.
A survey by the Pew Research Center last November found that 21 percent of U.S. adults don’t plan on getting vaccinated, and remain “pretty certain” that more information won’t change their mind. It’s troubling to think that simply debunking these claims — essentially, exposing them as untrue — is not enough to shift some people’s perceptions. But what if there was a way to prime people to spot disinformation before they see it? In other words, what if there was a way to create a sort of vaccine for fake news?
That’s the hope of Sander van der Linden, a social psychologist at the University of Cambridge. While much of his research revolves around human judgment and decision-making, in recent years he’s turned his attention to the problem of fake news. It’s no secret that information has the potential to spread like a virus online, and disinformation — incorrect information that’s circulated on purpose — is particularly virulent. After learning that fake news peddlers employ many of the same persuasive tactics over and over, Van der Linden set out to “inoculate” people by showing them how these strategies work. Or as he puts it: “Once you know what goes into the sausage, you probably don’t want to eat it.”
Instead of a physical shot, his delivery method of choice is online games. In 2018, Van der Linden and his colleagues launched Bad News, in which players assume the mantle of a fake news tycoon. They’re tasked with impersonating public figures, stoking fear and discrediting opponents to amass as many social media followers as possible. Similarly, in Go Viral!, another project that came out in 2020 from the Cambridge researchers, players use fearmongering and emotionally charged language about the pandemic to make messages go viral on a simulated social media network. Van der Linden’s latest game, Harmony Square, made in partnership with the U.S. Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency, tasks the player with using disinformation to pit residents of a small, quiet neighborhood against each other.
Discover recently caught up with Van der Linden to learn more about how false information spreads, why inoculating people against misinformation works and how, exactly, that sausage gets made.
Q: How did you first become interested in misinformation and fighting the spread of fake news?
A: Initially, my interest in influence and persuasion came from the fact that, like many social psychologists, I was interested in propaganda and how it works, particularly following events like World War II and other atrocities around the world where people become persuaded of very dangerous ideas. That process of how people are influenced by information — and then act on it in a way that’s detrimental to others — was really my larger driver for studying this.
In 2015, before the U.S. election, my colleagues and I were studying disinformation about climate change, specifically. We started to figure out that there are a lot of commonalities in the techniques that are being used to deceive people on the issue of climate change. There are a lot of specific myths and hoaxes and conspiracies out there, but they all use these recurring techniques. The logical next question for us was: How could we inoculate people against that? We wanted to pre-expose people to weakened doses of these manipulation techniques that are used in misinformation to see if that strengthens their intellectual antibodies against it.
Q: Where did the idea come from that you could create a psychological vaccine against misinformation?
A: There’s this guy called Bill McGuire, who was working at Yale University in the late ’50s and early ’60s. It was just after the war, and he and his colleagues were interested in how propaganda works. They informally termed their approach a “vaccine” against brainwashing. Although they never tested it on misinformation, they did some early experiments that asked, “What if you could vaccinate people against persuasive attacks by administering a weakened dose of them?” And they had some very compelling early data on this.
It’s so relevant now — it seems like the ultimate metaphor. I found it so surprising that people had completely forgotten about this. And that’s why I reintroduced the metaphor and started expanding on it. Because what’s interesting is that at the time there was no internet; they weren’t thinking about how to actually do this in the real world.
Q: What made you think that an online game could be the best way to deliver this vaccine?
A: This really happened in my conversations with Jon Roozenbeek. He was a student at Cambridge studying Russian media propaganda; he was really interested in what we were doing. One of the things that emerged from my chats with Jon was the idea that we wanted to scale it up. We wanted to vaccinate people against the underlying techniques and not specific misinformation. And we also thought that would generate less reaction from people. It’s one thing for a scientist to tell people climate change is real, but people who don’t believe in climate change don’t really want to hear that. We needed another way.
Q: Tell me more about the misinformation tactics that you and your colleagues learned about when you studied fake news. What techniques will players arm themselves with when they play Bad News?
A: We set out with the goal of mapping the techniques that underpin most fake news and misinformation. It took us about a year of reading to really distill all of the key techniques. We landed on a few, including polarizing people, conspiracy theories, impersonation, trolling other people, and discrediting — like saying, “You’re fake news.” So we distilled those down.
The first thing you do in the game is impersonate Donald Trump and declare war on North Korea over Twitter. That’s really meant to illustrate the impersonation technique more broadly. What’s interesting is that we found most people actually miss this at the beginning — because Trump’s Twitter handle is manipulated so it’s an N instead of an M. But even though it’s all fictional in the game, it’s based on real-world events. And this one was really based on a real story involving Warren Buffett’s account. Somebody started a fake account impersonating Warren Buffett with one T instead of two Ts. It started posting all sorts of nonsense and garnered hundreds of thousands of followers in a very short amount of time, so it was very influential. That’s what we try to inoculate people against in the first level.
The other big one is polarization. That’s probably the most difficult one for people because everyone has some preference on politics. But what we try to do explicitly in the game is tell people that it doesn’t really matter what side you’re on — it’s about driving two sides apart. That’s really the strategy. So whenever you feel riled up about an issue, try to reflect on the fact that some people are just trying to stir things up.
Q: You published several studies about the effectiveness of Bad News after its release. Did any of your findings surprise you?
A: It went in different stages, as anything with this sort of new intervention does. The first study we did, we had about 10,000 people who opted into the game that is still live; we run new experiments all the time as we get responses on a daily basis from people playing the game. What we found is that people were able to spot fake news better after playing, so they thought fake news was less reliable.
One of the biggest questions we got was how long the psychological effects last. It’s unlike a biological vaccine, where in some cases, after a few shots, you have lifelong immunity. We didn’t expect that to work in the same way.
We started following up with people week after week and launching misinformation attacks on them to see how they’re doing after they played the game. After the initial treatment, they’ll get notifications for follow-up surveys in which they are only confronted with a list of fake Twitter posts and asked to rate how reliable or accurate they find them — and how likely they would be to share them. So it’s a simulated social media feed, not a real one. “Misinformation attack” sounds a bit nefarious but what we mean is just that people are confronted with misinformation. What we found in the first study was that the vaccination effect was still there after two months with these follow-ups.
But, in one of the conditions, we found that if we didn’t follow up with people, the effect was still positive, but it decayed significantly. So we started hypothesizing that maybe when we’re following up with people, we’re actually boosting their immune response and prompting them to remember what they’ve learned. That’s something we’re currently investigating further, because there are different ways of delivering booster shots.
Q: Are you worried these games might create a new crop of fake news peddlers and conspiracy theorists by showing people how misinformation works?
A: It’s probably the question we get asked most: Are you worried about teaching people how to spread fake news? We’ve approached this quite deliberately. There are two motives for why people spread fake news intentionally — one’s financial and one’s political. So one thing we don’t do in the games is show people how to make money off of fake news. And the games are very nonpolitical. We allow people to make fun of things across the political spectrum and pick sides.
Even if 1 out of a million people gets the idea, if it vaccinates 900,000 people, the benefits probably largely outweigh the risks. We don’t think it’s a substantial risk in the sense that we’re not teaching people anything new; this stuff is already out there. We’re just exposing it for people. It’s like a magic trick where we’re trying to show people how the trick works so they’re not duped by it again. Very few people want to go out and become a magician.
Q: In your 2020 study on the long-term effectiveness of inoculating against misinformation, you end by paraphrasing professor Severus Snape from the Harry Potter series: “Your defenses must be as flexible and inventive as the arts you seek to undo.” Why does that resonate with you?
A: The idea is that misinformation is evolving; if you think of it as a virus, it has new strains. Whereas impersonation was simple at the beginning, now we have deepfakes. It’s getting more complex, it’s getting new features and it’s becoming smarter. The counterstrategies — the fact-checking and the debunking — haven’t kept pace. They’re very static, and there’s been no advancement. The reason I started using that phrase is because it just hit me, and I realized that if we want to fight the Dark Arts, we need to make our own solutions as flexible and inventive as what they’re doing. Because what they’re doing is evolving, getting more specific, using big data and getting more sophisticated. And we’re not.
Specifically, in terms of the games, when Snape says “flexible and inventive,” it really resonated with me that we need to go beyond science. We need to make it fun and entertaining and scalable. And we need to be flexible and inventive in that it needs to be real-time; we need to be able to adjust our interventions. Because only then can we try to undo some of the ways in which the dark arts of persuasion and manipulation are quickly evolving.
Alex Orlando is an associate editor at Discover.