About a half-mile from the White House, a presentation on online extremism is taking place at George Washington University (GW). The precise setting, however, is unusual: The event is in the physics building, rather than one of the political science halls across the street, and the discussion is being led by a fast-talking and personable British physicist.
Neil Johnson starts things off in unorthodox fashion, placing some props in the middle of a conference table: a snack-size Ziploc bag filled with multicolored paperclips and a cylindrical container holding 100 Chinese “fortune sticks.”
After a brief introduction, Johnson pulls open the bag and begins spreading the paperclips — different colors to represent different people, he says — randomly across the tabletop. He then assembles the clips into small linked clusters, which grow or shrink, sometimes splitting off and merging with other clusters. His hands move quickly, rearranging the clips as deftly as a three-card monte dealer manipulates his cards.
For Johnson, the changing affiliations of the paperclips mimic what happens when, say, 100 people arrive separately at a social gathering and gradually mingle. “One minute you’re in a group of three; then it’s four, five or nine until some people break off and join another group,” Johnson says. But while it may seem random which and how many clips — or socialites — become linked together, he’s become adept at spotting the underlying patterns. And when it’s people clustering up, those patterns could have dangerous consequences, depending on the shared interests that hold the assemblage together.
Over the past half-dozen years, Johnson’s unconventional research has taken him online, studying groups not of paperclips or partygoers but of hatemongers and would-be terrorists. His conclusions suggest that rather than monitoring the behavior of individuals, hoping to pick out a few “bad apples” before they resort to violence, law enforcement officers would reap greater rewards by concentrating on the groups to which these people belong.
An analysis of such group dynamics, looking specifically for abrupt, exponential surges in membership growth, could provide warning signs of bad things to come. This strategy — drawing on principles from physics and the science of complexity, which involves the study of collective behavior — may offer an efficient method for spotting hotbeds of extremist activity before they erupt in violence.
People as Molecules
Johnson hasn’t given up his day job to fight terrorism; he still primarily studies particles such as electrons, rather than malcontents. But it’s his physics background that’s made him ideally suited to investigate collective behaviors of groups of people. As a Harvard University Ph.D. student in the 1980s — before he’d taken up complexity theory — he trained in what’s called many-body physics, the study of systems with multiple interacting particles.
“My research had nothing to do with smashing things [like atoms] apart, but rather in seeing what happens when you put large numbers of objects together,” he explains. “More is different. Even when you put identical objects together, different things can happen.” Under certain conditions, for example, swarms of electrons can move in a coordinated way, enabling electric current to pass through materials without resistance — something individual electrons cannot do. This phenomenon, called superconductivity, offers an analogy to hate groups in which “herd mentality” could induce reluctant participants to follow the crowd and support activities that a single individual would not.
Johnson’s foray into the study of online extremism was motivated partly by his personal background. He grew up near London when the Irish Republican Army was still setting off bombs, seemingly out of the blue. In the early 1990s, he spent a year as a professor in Colombia while guerrilla groups were carrying out ostensibly arbitrary attacks against the state. And when Johnson left his faculty position at the University of Oxford and moved to the U.S., mass shootings began making headlines with increasing frequency. It eventually dawned on him: He could use his scientific training to understand how and when such violence arises.
“Humans are not the same as particles,” admits Johnson, whose official duties at GW still focus on interactions in the quantum realm. For years, however, he’d noticed that patterns in everyday life could be explained by complexity theory — financial trends triggered by individual traders, or folks just trying to figure out the best times to avoid traffic on city streets or crowds at restaurants. Many of the principles he’d learned in physics seemed germane to these and other familiar situations. Maybe, he reasoned, the same physics that governed the boiling of water could shed light on the extremism simmering online. “As citizens, we sit and wait, having no idea when bad news will hit next,” Johnson says.
He decided he’d done enough sitting and waiting.
Random Acts of … Foreseeable Violence?
By 2014, Johnson had resolved to put his expertise to work. As a first step, he assembled a group of students to help him track pro-ISIS groups communicating on VKontakte — a Russia-based social media platform with more than half a billion users.
The results of their analysis were published in the journal Science in June 2016, just days after a man who’d pledged allegiance to ISIS shot and killed 49 people at the Pulse nightclub in Orlando, Florida. The Science paper described the terrorism environment as an “online ecology” in which ISIS-sympathizing groups constantly feed off each other as they change composition over time, like evolving clusters of paperclips.
His team analyzed the online records of roughly 200 pro-ISIS groups with more than 100,000 members overall in 2015. Some groups disappeared during this period, perhaps shut down by network administrators, only to reappear later under modified names or guises. Johnson and his team developed a mathematical model that demonstrated how the size and number of ISIS-leaning groups shift over time, while providing clues to where a single group’s trajectory may be leading its members.
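The class of model the team used can be sketched in a few lines of code: groups either coalesce (two merge into one) or fragment (one shatters back into lone individuals), with bigger groups more likely to be involved in either event. The sketch below is illustrative only — the function name, parameters, and probabilities are assumptions, not the paper's actual implementation.

```python
import random

def simulate_groups(n_people=500, steps=20_000, frag_prob=0.05, seed=1):
    """Toy coalescence-fragmentation dynamics: a list of group sizes
    that merge or shatter over time. People are conserved throughout."""
    clusters = [1] * n_people          # everyone starts out alone
    rng = random.Random(seed)
    for _ in range(steps):
        # Pick a cluster with probability proportional to its size —
        # equivalent to picking a random person and taking their group.
        i = rng.choices(range(len(clusters)), weights=clusters)[0]
        if rng.random() < frag_prob:
            # Fragmentation: the group dissolves into single members.
            size = clusters.pop(i)
            clusters.extend([1] * size)
        elif len(clusters) > 1:
            # Coalescence: merge with a second size-weighted cluster.
            j = rng.choices(range(len(clusters)), weights=clusters)[0]
            if i != j:
                merged = clusters[i] + clusters[j]
                # Pop the larger index first so positions stay valid.
                for k in sorted((i, j), reverse=True):
                    clusters.pop(k)
                clusters.append(merged)
    return clusters

groups = simulate_groups()
assert sum(groups) == 500   # merging and shattering conserve people
```

Run long enough, dynamics like these settle into a characteristic pattern: many small groups and a few large ones, with individual groups constantly appearing, swelling, and dissolving — the behavior Johnson's team tracked in the real data.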
The total number of groups, their data showed, stayed roughly the same until cresting in September 2014 — just before an ISIS-led siege of Kobani, Syria. Johnson had already witnessed a similar pattern a year earlier in Brazil, when a proliferation of online political groups preceded violent outbreaks.
Sudden, precipitous online growth is a worrisome sign, Johnson says. “[It] can act as an indicator of conditions being right for a burst of attacks in the real world.” And it could also help law enforcement agencies, with limited resources, train their attention on a specific group, a single geographic region or a hot-button issue that’s ready to ignite.
The problem of terrorism is a growing one, especially as the internet becomes ever more pervasive in its extent and influence. Hundreds of thousands of people in the U.S. and Europe regularly visit the sites of online hate groups, including those linked to terrorist and white supremacist causes. It’s not easy to keep tabs on all those people. But Johnson’s research suggests we don’t have to. By organizing themselves into 100 or so groups, each with thousands of members, Johnson explains, the haters have “made the work much easier for us. I only have to worry about 100 [entities] instead of 100,000.”
When Good Milk Goes Bad
Back in the GW physics building, Johnson leaves the paperclips spread across the conference table and starts a second demonstration, grabbing the cup with the bamboo fortune sticks. He shakes it vigorously until one stick invariably drops to the table. You can’t predict at the outset which stick will fall out, Johnson explains in his typically rushed cadence. “What matters is that interactions among sticks within this system will inevitably push one of them out.”
The sticks are confined to a cramped container, which creates conditions of close jostling. An extremist group is like a container, Johnson notes, “a tight-knit cluster in which members are jiggling each other all the time.” This jiggling — in the form of encouragement, exhortation and indoctrination — could eventually push an individual to make the shift from thinking about malicious measures to actually carrying them out.
Johnson believes these transformational shifts within collections of people, which bring individuals closer to a reckless act, are analogous to the phase transitions materials undergo upon changing their physical form. Familiar examples include the curdling of milk as it switches from liquid to solid, or the boiling of water as it changes from liquid to gas. If you place a pot of water on a lighted stove, figuring out which molecule will be the first to vaporize would be impossible (and pointless). That first molecule is no different from any other, except for being in the right place for making bubbles. What matters is knowing that the burner’s turned on — and realizing that if it’s kept on, more bubbles will rise to the surface. Matters can come to a boil in the human sphere as well, Johnson’s studies have indicated, if online hate groups are allowed to expand and fester without restraint.
Johnson recognized this pattern: He’d seen it many times during his regular academic studies in physics. As he and his colleagues reported in Physical Review Letters in 2018, the sudden appearance and expansion of pro-ISIS groups followed a progression perfectly described by gelation theory — an area of physics and chemistry that explains how liquids congeal, first into disparate clumps and then into a large cluster, or gel. The curdling of milk is, again, a familiar example. “I get a gallon from CVS, and it seems fine until one day big clumps appear out of nowhere,” Johnson notes. At that point, the process is irreversible, he adds. “You can’t uncurdle milk.”
Gelation theory is old hat in physics, but Johnson made the crucial observation that the process of milk coagulation can be characterized by the same kinds of equations that charted the upsurge of ISIS groups. Apparently, online extremism and curdling milk obey the same mathematical rules and exhibit the same exponential — and, hence, unruly — growth. Humans may not be particles, but in this case they can be just as predictable.
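The hallmark of gelation is easy to reproduce in a simulation. In the sketch below — a schematic of gelation with a multiplicative (product) kernel, not the equations actually fitted in the paper — clusters only merge, with both partners picked in proportion to their size. The fraction of all members sitting in the single largest cluster stays negligible for a long stretch, then shoots up to one: the "curdling" transition.

```python
import random

def gel_trace(n=2000, seed=2):
    """Pure coalescence, no fragmentation. Both merge partners are
    picked size-proportionally (the product kernel, which is known
    to gel). Returns the fraction of members in the largest cluster
    after each merge."""
    clusters = [1] * n
    rng = random.Random(seed)
    trace = []
    while len(clusters) > 1:
        i, j = rng.choices(range(len(clusters)), weights=clusters, k=2)
        if i == j:
            continue                   # must merge two distinct clusters
        merged = clusters[i] + clusters[j]
        for k in sorted((i, j), reverse=True):
            clusters.pop(k)            # larger index first: keeps i valid
        clusters.append(merged)
        trace.append(max(clusters) / n)
    return trace

trace = gel_trace()
assert trace[0] < 0.01   # early on, no single group matters...
assert trace[-1] == 1.0  # ...until one cluster swallows everything
```

Because large clusters are disproportionately likely to merge with other large clusters, growth feeds on itself — the same runaway, and effectively irreversible, dynamic as milk curdling.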
Rules of the Pack
Law enforcement agencies would do well to hunt for extremist groups with this telltale growth curve, Johnson says, as it could spell trouble down the road. “The membership of other [non-extremist] groups — parent swim clubs, kids’ karate teams and the like — tends to stay pretty constant,” he says. Online ISIS groups are “a different beast altogether. They don’t just sit around; there’s something really aggressive about their growth.” Groups that expand in this distinctive fashion are likely to have a greater chance of breeding menacing activity, and might therefore be the best ones to watch.
But how should authorities handle this knowledge? Johnson’s research offers insights on this front, too: The easiest way of keeping an online group from becoming too powerful, he says, “is to focus on the little groups, and keep smashing them down because they feed the big ones.” But it’s a delicate balancing act. Network administrators shouldn’t shut down the small groups too promptly either, he notes, “because we first need to see what’s going on.”
New social media algorithms, designed to reduce societal divisions by increasing connectivity, could have the opposite effect, Johnson has found. Although a “larger middle ground” will indeed emerge, some people will invariably be left out. And these alienated, left-out folks will now have an easier time finding each other, forming a cadre of disenchanted outliers that could be far more potent than individuals left to their own devices. Network administrators could, Johnson says, suggest more benign areas of interest for these disaffected sorts to rally around instead. He encourages these administrators to promote the organization of clusters of anti-hate users, which could counteract hate groups in a manner similar to the human immune system. He further recommends the introduction of an artificial group of users designed to enhance infighting between different hate groups.
Rather than confining himself to academia, Johnson has shared his insights with the FBI, the U.S. State Department and a major social media company. And he’s recently forged a partnership with The Lab @ DC — a Washington, D.C., city government group charged with using scientific insights to improve policies. “We’re trying to bring our research into the real world and match it up with actual hate activities in the greater D.C. area,” says Johnson. “What we do will ultimately complement law enforcement work, rather than replace it, making us stronger together.”
Only the Beginning
Johnson admits that his research team is in just the first stages of applying its results toward the goal of thwarting hate and terror groups. While their picture of online extremist activity is still preliminary, he’s confident in his systematic approach. The physicist’s strategy also makes sense to security experts like Seamus Hughes, deputy director of GW’s Program on Extremism and a former U.S. Senate counterterrorism adviser.
“Neil comes at this from a completely different lens than the rest of the counterterrorism community,” Hughes says. “Given how rapidly the online space is growing and changing, we need to approach the spread of extremism from as many angles as possible. Neil’s techniques can clearly contribute an important piece to this puzzle.”
Johnson, meanwhile, is gathering more pieces. Hate and violence come in many forms, and he’s broadening his focus from terror organizations like ISIS to include white supremacy groups, whose influence is clearly on the rise. In a paper published in Nature in September 2019, Johnson and his co-authors mapped out what they call the online hate universe. They revealed the social media interconnections between white supremacist groups across different countries, languages and internet platforms (including Facebook and VKontakte), calling them “resilient global hate highways.”
While the researchers were drafting the introduction to their paper in late 2018 — citing a list of recent “abhorrent real-world events” — more hate-fueled incidents were taking place in quick succession, including the killing of two black people in a Kentucky grocery store, the slaughter of 11 Jewish people at a Pittsburgh synagogue and the murder of two women in a Tallahassee yoga studio. Johnson is hoping the techniques he and his colleagues are developing could, as they become more advanced, reduce the number of such incidents in the future.
Maybe that’s why he talks so fast and moves at such a brisk pace, whether he’s typing at his computer, reconfiguring paperclips or dislodging another fortune stick from its bamboo holder: He’s trying to keep one step ahead of a swift-moving, ever-changing foe. Armed with his preliminary map of miscreant territory and the powers of Big Data, Johnson wants to drive a wedge through the hate groups of this century, just as physics split apart the atom in the last one.
If these efforts pan out, he might get to spend more time on his actual day job, studying the complex, entangled state of affairs that particles — just like humans — sometimes get themselves into.
[This story originally appeared in print as "Can Physics Help Predict When Violence Will Erupt?"]