Three years ago it began to dawn on Arturo Bejar, the engineering director at Facebook, that the company’s tool for users to report inappropriate content wasn’t very good. He suspected that the menu of complaints users could choose from when they wanted a post or a photo taken down wasn’t capturing what was actually going on. When his team reviewed the reports that came in, they discovered that in most cases people weren’t objecting to posts under the official categories that warranted removal by Facebook—hate speech, violence, nudity, and harassment. Most of the time, people just didn’t like the picture of themselves someone had posted.
The whole spectrum of life plays out on Facebook, from wounded vanity to intense bullying, one of the thorniest issues in social media. Bejar wondered if recruiting researchers in human behavior, emotional intelligence, and compassion could help him better understand the social interaction that is at the heart of his company’s product. So his team created Facebook’s Compassion Research Day, an intersection between Silicon Valley and academia. Last December, I sat in the audience at Facebook’s vast Menlo Park headquarters for the fourth annual meeting, which is open to the public. Throughout the day, social scientists from Yale, Stanford, Berkeley, and the like presented research on how people respond to and treat each other using social technology.
People who research empathy and emotion are often perceived as studying something soft. But Facebook is built by engineers, and engineers like hard data—and Facebook is nothing if not a huge data set. After the first Compassion Research Day, Bejar put together an in-house “compassion team” to help psychologists, neuroscientists, sociologists, and other researchers conduct experiments—funded in part via stipends from Facebook—and then implement findings on the network. That team now includes dozens of engineers, data scientists, designers, and market researchers. On any given day, 10 to 12 engineers are likely to be actively working on compassion-related projects, with input from 30 to 40 others.
When Bejar took the mike for the kickoff, he was effusive, earnest, bouncing on his toes. He smiled frequently behind his dense, dark-brown beard. Signs on the wall offered feel-good statements underscoring Bejar’s mission: PEOPLE OVER PIXELS! WE ARE CULTURE BUILDERS. LEAN IN. CONNECT. EMPATHY. “Most people don’t intend to upset or provoke with the content that they post,” he told the crowd. “It’s hard to tell someone that they’ve upset you, but when given the tools, people will use them.” And the response, he added, has been positive.
Facebook engineers took the stage to explain that, in the past year, the company had instituted subtle but significant changes in what it calls “social resolution flows”: prompts and text boxes that pop up after specific actions, like clicking on “I don’t want to see this” when you come across an upsetting post. Now, if you’re tagged in a photo that you’d rather not be out there, new options pop up—“It’s a bad photo of me,” “It’s embarrassing,” “It makes me sad”—as does a text box with a brief, polite request asking the person who posted a photo to remove it. The compassion team agonized over the deceptively simple wording; in test runs, for example, “Hey, I don’t like this photo. Please remove it,” evolved into a more personal, emotion-rich message that used “I” to take ownership of feelings rather than blaming the other person: “Hey <name>, this photo is a little <embarrassing> to me and I would prefer that people don’t see it on Facebook. Would you please take it down?” The team also introduced fancier emoticons, or “stickers” in Facebook lingo, to help express emotion that doesn’t come across easily with text. (At Compassion Research Day, Dacher Keltner and Paul Piff of Berkeley explained that the ideas behind these changes are rooted in Darwin’s work on the universality of facial expressions of emotion.)
The changes have more than tripled the rate at which people send a message directly to another user asking for a photo to be removed. And in 85 percent of those cases, the person who posted the photo takes it down or sends a reply. When asked about the interactions, 65 percent of the people contacted say they feel positive about the person sending the message, while 25 percent feel neutral. So Facebook’s focus has shifted toward encouraging users to enter into a conversation. Rather than acting as a mere arbitrator, the company says it now follows established principles of conflict resolution.
Bejar’s compassion team has also been investigating bullying among teens, enlisting the expertise of Marc Brackett and Robin Stern of the Yale Center for Emotional Intelligence, who spoke at the conference. More than one in four American teens have witnessed cruel behavior on social networks; of them, 89 percent say that behavior takes place on Facebook, according to a recent survey by McAfee, the software giant. That is a lot of kids: 93 percent of all American teens have Facebook accounts, Brackett says.
He told the audience, “Teens take a lot of risks. They’re concerned with their reputation, and they’re hijacked by emotions. Peer relationships are a critical focus for them.” In a 30-day experiment on the site, Brackett and Stern found that anger and embarrassment dominated the range of emotions teens reported, and that teens responded much better to Facebook prompts that sounded like the way they actually talk (“saying mean things about me” vs. “harassing me”).
The Yale researchers also confirmed that on Facebook, teens’ emotional lives look a lot like their lives off Facebook: Boys were more likely to report being threatened than girls, but less willing to disclose their feelings; girls were more likely to report feeling sad or embarrassed. So Facebook engineers designed specific flows to pop up when teens clicked on “remove post.” They were asked why they wanted the post removed (“annoying,” “threatened me”), what they felt (“angry,” “afraid”), and how they wanted to respond. Suggestions for responses included sending a message to the person who posted the upsetting content, sending a message to someone they trusted, and talking to someone they trusted. When teens engaged with other teens over a troubling post, 75 percent of the teens who posted the original item responded or removed the content.
When an incident is reported as bullying, users are now sent to a new bullying-prevention “hub,” a set of resources designed with input from the Yale research team. The goal, Brackett emphasized, is for Facebook to mediate between adolescents, helping them pause and reconsider before knee-jerk emotional responses get them into trouble.
Investigating human behavior can be challenging, especially getting enough people into the lab for a study, said Emma Seppala, a psychologist and the associate director of Stanford’s Center for Compassion and Altruism Research and Education, after the meeting. “When you can do it with a massive database across cultures, with thousands of data points, that’s huge. You can see how one small change can have such a ripple effect on behavior. And if you can change people online and get them to be more pro-social offline, that’s very exciting.” Of course, she added, the benefits run both ways: if people enjoy their experience more, Facebook will do better as a business. Which helps explain the existence of Compassion Research Day.
This post originally appeared in the March/April 2014 issue of Pacific Standard as “Friends With Benefits.”