Blame human cognitive limits, according to one computer scientist.
By Nathan Collins
(Photo: Peter Hellberg/Flickr)
Most people will agree that viral videos have massive impacts on just about everything, from culture to economics to policy. But lost in this discussion is the fact that very, very few of the memes, weird music videos, and perfectly worthwhile short features actually reach anything like viral status. The reason? Psychology, of course.
“The many decisions people make about what to pay attention to online shape the spread of information in online social networks,” writes Kristina Lerman, a researcher at the University of Southern California’s Information Sciences Institute, in a new paper. Those decisions, she argues, make most of what pops up on the Internet rather less than “viral.” More to the point, they make the whole process a lot less like an exponentially exploding outbreak than researchers had previously thought.
To be clear, researchers have known for a while that the label “viral” is a misnomer. For one thing, the average “outbreak” is incredibly small. In the vast majority of cases, somebody shares something, maybe a few of his or her contacts follow suit, and then everything just stops. As a result, even the most tweeted and shared stories reach only a tiny fraction of users.
What gives? The key observation lies in how biological and online contagions spread differently. “In distinction to viral infection, social media users must actively seek out information and decide to share it before becoming ‘infected,’” Lerman writes. But more to the point, she continues, there’s so much information that they have to find a way to filter it before they can re-share it.
A key heuristic for filtering information is position, in part through what’s called the primacy effect: People pay more attention to things they see at the top of a list, which could translate into more retweets and such. In a 2014 experiment, for example, Lerman and collaborator Tad Hogg discovered that people were three to five times more likely to recommend stories at the top of a list (of 100 items) than those in the middle or near the bottom.
Now, imagine how the primacy effect plays out on Twitter. Since Twitter feeds always start with the most recent tweets at the top, its users are more likely to retweet the most recent stories. In fact, the probability a person retweets something drops precipitously just minutes after it first shows up. That effect is amplified for more-connected users. Because they follow more people, many more stories flow through their feeds, and they’re much less likely to even be exposed to any one story—so much so that they have at most about a 20 percent chance of retweeting anything, even when 100 people they follow have also tweeted it.
What this adds up to, Lerman writes, is that, even if tweets infected people in the same way as biological viruses, the chance of being exposed to any particular item is very small—and without exposure, there can be no epidemic.
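To see how that argument plays out, here is a minimal toy simulation—an illustrative sketch of the general idea, not a model taken from Lerman’s paper. It assumes, purely for illustration, that a story’s chance of being seen shrinks as a user’s feed gets busier (the `ATTENTION` and `P_RESHARE` parameters, the network size, and the follower counts are all invented), and it compares those cascades with a world where everyone sees everything their contacts share.

```python
# Toy sketch (illustrative only, not the model from Lerman's paper):
# users who follow many accounts have busier feeds, so any single shared
# story has a smaller chance of being seen at all. Every number is made up.
import random
from collections import deque

random.seed(42)

N = 2_000            # users in the toy network
MEAN_FOLLOWS = 50    # average number of accounts each user follows
ATTENTION = 10       # roughly how many recent feed items a user actually reads
P_RESHARE = 0.05     # chance of re-sharing a story, given that it was seen

# Random "who follows whom" graph: follows[u] = accounts u follows,
# followers[v] = users who will get v's shares in their feeds.
follows = {u: random.sample(range(N), random.randint(1, 2 * MEAN_FOLLOWS))
           for u in range(N)}
followers = {v: [] for v in range(N)}
for u, outs in follows.items():
    for v in outs:
        followers[v].append(u)

def cascade_size(seed, limited_attention=True):
    """Breadth-first cascade from one initial sharer; returns # of sharers."""
    shared, queue = {seed}, deque([seed])
    while queue:
        v = queue.popleft()
        for u in followers[v]:
            if u in shared:
                continue
            # The busier u's feed (the more accounts u follows), the smaller
            # the chance this particular story is among the items u reads.
            p_see = min(1.0, ATTENTION / len(follows[u])) if limited_attention else 1.0
            if random.random() < p_see and random.random() < P_RESHARE:
                shared.add(u)
                queue.append(u)
    return len(shared)

seeds = random.sample(range(N), 100)
limited = sum(cascade_size(s, True) for s in seeds) / len(seeds)
full = sum(cascade_size(s, False) for s in seeds) / len(seeds)
print(f"average cascade, limited attention:        {limited:.1f} users")
print(f"average cascade, everyone sees everything: {full:.1f} users")
```

With these invented parameters, the “everyone sees everything” runs tend to behave like a genuine epidemic, while the limited-attention runs typically fizzle after a handful of re-shares, even though the story is exactly as “contagious” in both cases.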
“The few success stories … keep marketers searching for formulas for creating viral campaigns. Success, however, is rare,” Lerman writes, in large part because our brains make it so hard to get infected.