The Problem With Google’s Perceived Omniscience

It’s called the “filter bubble,” and it’s helping Google play to our prejudices.

Imagine a beer summit between two men—call them Frank and Hank. From a young age, Frank has inhabited a conservative landscape deep in the heart of a profoundly red state—Rush Limbaugh the only sound coming from the radio, Fox News the only channel on TV. Hank, meanwhile, grew up in a liberal landscape, the heart of a blue city—Randi Rhodes on the radio, MSNBC on the TV. Hank and Frank both travel for work, and, one happy hour, end up sitting next to each other at the most neutral bar in the swingest city of the swingest state. They chat and quickly the conversation turns to what’s wrong with America.

Who leaves with his mind changed?

The answer, of course, is no one. Hank leaves still convinced gun laws should be stricter; Frank still believes the only good government is a small government. Political beliefs—any beliefs, really—are as much a product of environment as anything. No one exits the womb an NRA die-hard, even if their bumper sticker says otherwise. So, sure, Hank and Frank will hear each other out—they’re not monsters—but they’re not really going to entertain the other’s opinion. Because, you know, the other person’s just so wrong.


Now: Take Hank and Frank and extend that scenario to every person on Earth. Because that’s the world Google is quite possibly creating.

The concept at work is the “filter bubble,” the Internet’s version of the echo chamber. Like conservatives who watch only Fox News, or liberals who watch only MSNBC, Internet users are increasingly shielded from points of view different from their own. Liberals rarely head to the Drudge Report, conservatives steer clear of Mother Jones—that we already know. But the same exclusionary concept extends to social networks as well. If an old high-school friend starts getting racist on Facebook, the block button’s right there; and you don’t follow people you think are idiots on Twitter, unless you’re engaged in a “hate-follow,” which inherently negates the persuasive force of that person’s account.

“It’s not something unique or technology related,” writes Matt MacPherson, who, in 2006, started the Church of Google, a mostly tongue-in-cheek celebration of the “closest thing” man’s created to a real, provable god. “We tend to hang out with people who are like us, elect into office people we like on a personal level and not because they’re necessarily the best candidate. We consume what we’re already interested in. So the problem is with our human nature; not the algorithms designed to appease that nature.”

If we’ve managed to navigate the biased world MacPherson describes, what’s different on the Internet? First and most important: The biases mentioned above play out in full view, easily recognizable as bias. Even if you watch only Fox News, you can’t completely ignore the idea that some people think it’s biased. Same goes for MSNBC, CNN, NPR, any kind of media, really. At this point, a certain skepticism is inherent in our relationship with news media.

That’s not the way it is when using Google, which seems as innocuous as a pencil. And that innocence, the sense that bias isn’t even part of the equation, gives the company tremendous power.

“There’s this crazy idea that all the billions of webpages have been thoroughly vetted and reviewed, and this omniscient source found the best,” says Dr. Robert Epstein, a psychologist from Harvard and former editor of Psychology Today who studies how search engines affect behavior. “That whoever or whatever is doing the searching for us is infallible and omniscient.”


It’s important to keep in mind just how Google works. It is not, despite how it feels, looking through all the webpages in existence and finding those that offer the most accurate information. Rather, Google’s algorithm blends its own commercial biases (paid placement; advertising accounts for roughly 90 percent of the company’s revenue), outside attempts to game the system (search-engine optimization), and the user’s own biases (cookies) to deliver results. This doesn’t ruffle most users; who doesn’t love getting exactly what they want?
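To make that blend concrete, here’s a minimal sketch of how a ranking score might combine those three signal types. Everything in it is hypothetical: the weights, field names, and affinity function are invented for illustration, and Google’s actual algorithm is proprietary and vastly more complex.

```python
# Hypothetical sketch of a personalized ranking score.
# None of this is Google's actual algorithm; the signals
# and weights are illustrative only.

def affinity(page, user_profile):
    """Crude personalization: how much a page's topics overlap
    with what this user has clicked before (cookie-derived)."""
    overlap = set(page["topics"]) & set(user_profile["clicked_topics"])
    return len(overlap) / max(len(page["topics"]), 1)

def rank_results(pages, user_profile):
    """Order pages by a blend of the three signal types described
    above: paid placement, link popularity (the thing SEO tries
    to game), and the user's own cookie-derived bias."""
    def score(page):
        return (
            0.3 * page["ad_spend_score"]           # paid placement
            + 0.4 * page["link_popularity"]        # what SEO targets
            + 0.3 * affinity(page, user_profile)   # personalization
        )
    return sorted(pages, key=score, reverse=True)
```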

And that’s kind of the problem.

“Google isn’t necessarily giving you the best information,” Epstein says, “but what it thinks you want.”

The distinction is important. If you’re pro-choice, cookies pick up on that tendency and your search results will skew pro-choice. If you’re pro-life, same thing. This goes for topics like climate change, GMO dangers, what Hillary Clinton’s real politics are, the theory of evolution, and whether or not George W. Bush sanctioned the 9/11 attacks; if your computer’s cookies describe a person who believes that jet fuel can’t burn hot enough to melt the World Trade Center’s steel beams, well, Google won’t be pushing the Scientific American article debunking that claim.

“You won’t see material outside of this bubble as the algorithm gets to know you better,” Epstein says. “That’s part of the problem, that the algorithm more and more is sending people customized rankings based not just on their IP address and location, but based on cookies which identify you as an individual.”
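A crude, self-contained sketch of that feedback loop, with invented data structures, shows how the bubble tightens: each click nudges the profile, and the profile narrows the next set of results.

```python
# Illustrative feedback loop, not any real engine's code: every
# click reinforces the profile that biased the results in the
# first place.

def biased_rank(pages, clicked_topics):
    """Rank pages by overlap with the user's click history."""
    def overlap(page):
        return len(set(page["topics"]) & set(clicked_topics))
    return sorted(pages, key=overlap, reverse=True)

def search_session(pages, clicked_topics, n_queries=100):
    for _ in range(n_queries):
        ranked = biased_rank(pages, clicked_topics)
        top = ranked[0]  # users overwhelmingly click the top result
        # The click feeds back into the profile...
        clicked_topics.extend(top["topics"])
        # ...so the next ranking is tailored even more tightly.
    return clicked_topics
```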

So, rather than being exposed to alternative opinions or points of view, or even the actual science that disputes our own beliefs, we end up living like Hank and Frank, trapped in our own respective bubbles of “knowledge,” certain beyond doubt that the person sitting across the table with the other opinion is wrong, wrong, wrong.

“I think this is probably not a good thing in the long run,” Epstein says.

How did we get here? Branding has certainly helped.

It has taken years for Google to achieve the appearance of impartial omniscience. Part of this positive branding is the fact that they’re doing really cool things, be it mapping the world or trying to perfect self-driving cars. But it’s also because their name sounds so cool. It’s Google. It’s enormous. It implies all knowledge, all of the world’s websites in one place. And that implication, despite not being accurate, is a big part of the brand’s power.

In 2013, Epstein put together a study looking at how search engines can affect the results of an election. The study showed that simply putting one candidate’s name above the other can swing a close election toward the top candidate. This shouldn’t be surprising, as the order of search results has already been shown to be tremendously important; another study shows the top search result gets 32.5 percent of the clicks, second place gets 17.5 percent, third place gets 11.5 percent, and if you wind up on the second page, you might as well just shut down and start over.
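Some back-of-the-envelope arithmetic, using the click-through figures cited above and two hypothetical candidates, shows why the ordering matters so much:

```python
# Illustrative arithmetic using the click-through rates cited above.
# The candidates, and the assumption that clicks translate directly
# into exposure, are hypothetical.

ctr_by_position = {1: 0.325, 2: 0.175, 3: 0.115}

# If Candidate A holds position 1 and Candidate B holds position 2,
# A gets nearly twice B's share of the clicks from those two slots:
a_share = ctr_by_position[1] / (ctr_by_position[1] + ctr_by_position[2])
print(f"Candidate A's share of top-two clicks: {a_share:.0%}")  # 65%

# In a race polling 50/50, even a small exposure edge among
# undecided searchers could exceed the margin of victory.
```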

That kind of dominance over a culture isn’t what most people have in mind when they use Google.


If you want further proof of the company’s power to manipulate rankings, look at two recent announcements from Google: First, it’s demoting websites that aren’t mobile-friendly, a muscle-flexing move that’s causing all sorts of panic. Second, rather than using its current method of ranking links based on other links—that is, based on which sites are the most popular—Google will start coming up with its own method for determining the quality and accuracy of a website. “That to me is scarier than what they’re doing now,” Epstein says. “That implies that a company that is accountable to nobody but their shareholders is going to make decisions on what is true and false.”

It’d be one thing if Google were simply a repository of knowledge accessible at your fingertips, a digital Dewey Decimal System. But Google isn’t a public service. It’s a company with a market capitalization of nearly $395 billion, the second most valuable company in the world after Apple.

What’s the solution? One is to simply trust them. As Google executive Eric Schmidt said somewhat ominously back in 2010: “There are many, many things that Google could do, that we chose not to do.” And so far they’ve been pretty ethical with what they’ve done. But as the history of private industry shows, you can’t always trust companies—or CEOs—to follow the “don’t be evil” motto forever.

Beyond that, there are ways Google can work to suppress the Search Engine Manipulation Effect. One is to put a warning at the top of the results, alerting users that the results are tailored to each user’s bias. “That causes people to be a little more objective,” Epstein says. Individual warnings next to the results based on the bias of each link also help. And creating an algorithm that alternates between biased and unbiased results has been shown to limit the effect dramatically.
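Here’s a minimal sketch of that alternating approach, with invented page structures and two pre-ranked orderings standing in for real results; the research Epstein describes established the idea, not this code:

```python
# Sketch of interleaving personalized and neutral rankings, one way
# to dilute the bubble effect. The inputs are hypothetical: two
# orderings of the same result set, one cookie-biased, one not.

def interleave(personalized, neutral):
    """Alternate between a cookie-biased ordering and an
    unpersonalized one, skipping duplicates, so tailored and
    untailored results share the page."""
    merged, seen = [], set()
    for pair in zip(personalized, neutral):
        for page in pair:
            if page["url"] not in seen:
                seen.add(page["url"])
                merged.append(page)
    return merged
```

Because the two lists are permutations of the same results, every page lands in some pair, and the top of the merged list always mixes both orderings.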

A final option may be taking the search engine public, opening up the code so it can be analyzed and looked over by unbiased interests. Google’s unchecked and hidden algorithm is, after all, unprecedented gatekeeping power for a private company.

“In the past, phone books—with a monopoly on the flow of certain information to the public—were prevented from not listing businesses even when paid to do so,” Evan Leatherwood wrote in a 2013 issue of the Nation. “In the 1990s, similar reasoning led to the ‘must carry’ rule, which required cable companies to carry certain channels to communities where they were the only providers of those channels. A public search engine is the natural sequel to PBS and NPR.”

Whatever the solution, it should come sooner rather than later. Because as our world continues to become more bubble-ized and personalized and curated, one of the things we’re giving up in return is the idea that someone else might be right.

The Sociological Imagination is a regular Pacific Standard column exploring the bizarre side of the everyday encounters and behaviors that society rarely questions.
