Put Down the iPad, Lace Up the Hiking Boots

That sneaking suspicion that you’re a more focused, creative person out in the woods? It’s true.

Have you been staring cow-eyed at a computer all morning? Fiddling with your iPhone in line at Starbucks? Checking Twitter and ESPN every four minutes on your tablet?

Good. Here’s a little quiz. What one word ties these three ideas together: water + tobacco + stove? How about widow + bite + monkey? Or, envy + golf + beans?

Psychologists call such wordplay the “remote associates test,” or RAT, and use it to study creativity and intuition. The idea is that it requires a nimble, open mind to find the connection between seemingly unrelated ideas—in this case pipe, spider, and green.

But not all minds think alike, or even like to think. New research suggests that stepping away from the shiny Apple product and into the woods can have a big impact on creativity and problem-solving. Little is known about the human brain on technology—even less than about the brain on drugs—but many social psychologists fear that so much “screen time” is rewiring our neural circuitry, and not for the better.

David Strayer, a professor of cognition and neural science at the University of Utah, noticed that his brain felt more limber, his thoughts more fluid, on backcountry trips in the Southwest than they did in the lab. His undergraduates reported a similar mental boost, as did his colleagues. The peripatetic life seemed ideal for thinking about thinking.

Strayer began to organize yearly camping trips for his fellow neuroscientists. In 2010, Ruth Ann and Paul Atchley, a wife-and-husband team of psychologists from the University of Kansas, joined him on a weeklong trek through Utah’s Grand Gulch. Ruth Ann asked the group to complete the RAT before hitting the trail, and again a few days into the 32-mile hike. “It worked really, really well,” Strayer says. “We had about a 45 percent improvement. So we said, ‘This seems to be perfect. It’s cheap, and it produces a nice big effect.’ ”

Earlier attempts to study creativity in nature had proved less fruitful. “We tried bringing laptops out into the field, but people didn’t want to be anywhere near a computer after they’d been out hiking for two or three days,” Strayer says. And in the lab, slides and videos of pristine wilderness were a poor substitute for the real thing.

The RAT was easy to administer—no laptops involved—so Strayer and the Atchleys contracted with Outward Bound to run the experiment. Fifty-six students were given the test; half took it before their course began, and half took it midway through. Because technology is strictly verboten on OB trips—students aren’t allowed to bring even books—the psychologists were able to measure the effect on creativity of being isolated in the wilderness, untethered from the digital world.

The results, which appear this month in PLoS One, were striking. Students who took the test after a four-day immersion in the backcountry scored 50 percent higher than their coursemates. “The current research indicates that there is a real, measurable cognitive advantage to be realized if we spend time truly immersed in a natural setting,” the authors write.

The study’s sample size was small; ideally, the experiment would be repeated with several hundred subjects, thoroughly randomized. More importantly, the design doesn’t allow Strayer and his colleagues to pinpoint what’s causing the burst in creativity: is it the interaction with nature, the disconnection from technology, or both? And is physical exercise somehow involved? (Or could it be a flash of green?)

Psychologists will want to know the answer, of course, but Strayer points out that, for everyone else, tech deprivation and nature immersion are simply two sides of the same coin. “Occasionally you’ll see the poor soul who has their smartphone out in the wilderness and is still trying to send texts and updates,” he says. Most wanderers, though, have the good sense to leave the gadgetry at home.

The authors note that the average American child spends just 15 to 25 minutes playing outside each day, but some seven and a half hours in front of a screen. Eighty percent of 5-year-olds are computer users. It’s impossible to know just what this digital noise does to the adolescent brain, but there’s a reason that neuroscientists use the word “plasticity” when talking about neural development.

Indeed, a few researchers have begun to study the impact of technology on children’s “prosocial skills”—how to be a normal, empathetic, look-you-in-the-eye type of kid, basically—and it doesn’t exactly make you want to get your niece an iPad mini for Christmas. Angry Birds is a cheap babysitter, but it’s lousy at teaching a 4-year-old to understand others’ emotions or tame her own id.

“There’s some really compelling evidence out of the social labs at Stanford that paints a dark picture about what happens if we’re connected 24/7,” Strayer says. “But you can undo some of that negativity by just disconnecting, getting off the grid, and going into a natural environment.”

Emphasis on some. For adults, the question of reversibility will only grow more urgent. Just how permanent are the neural ravages of Twitter, Gchat, and Gawker? Is a week in the Canyonlands every summer enough to restore our atrophied attention spans—or are we, the meme generation, totally hosed when it comes to consuming art more complex than a GIF or longer than 140 characters?

Strayer now wants to look at the interplay of stress, screen time, and wilderness by taking blood and saliva samples from tech-deprived hikers and comparing them to those of tech-saturated office drones. He’s even contemplating bringing a portable EEG machine into the backcountry to study differences in brainwaves between the two groups.

The irony of inviting subjects into the mountains only to wire them up with electrodes and ask them to play iPad games is not lost on Strayer.

“I feel bad about that because I know I’ve effectively robbed somebody of an important experience,” he says, laughing. “But I guess it’s for the good of science.”