You’ll be excused for not following an impassioned-but-obscure linguistics debate in 2009 following President Obama’s first inauguration speech, in which the newly christened leader of the free world deployed soaring rhetoric to lay out his ambitions. “We will build the roads and bridges, the electric grids and digital lines that feed our commerce and bind us together,” the president said. “We will restore science to its rightful place and wield technology’s wonders to raise health care’s quality and lower its costs.”
This, wrote Stanley Fish in The New York Times, was naked proof of Obama’s confidence in declaring his intent to act freely and powerfully. Fish declared the usage to be the “royal we,” making Obama the latest in a long line of leaders to imply the infrangible bond between their will and the state’s. Professor Mark Liberman, writing over at Language Log, countered that Fish was “full of it”: a careful reading of the president’s language clearly showed the “we” to be an all-inclusive one, unifying his direction with the will of the people who’d elected him. It was the mark of a community organizer experienced in motivating people toward a conjoined cause.
The real problem, then, was that the president’s words—the president’s “we’s”—could be interpreted so differently. And yet, no one else really seemed to care.
WHILE IT’S SINCE BECOME normalized, the vague-if-not-presumptuous usage of “we” dates back to the age of kings, when a monarch would seek to solidify his divine right by referring to himself in the plural, suggesting that God was on his side. Hence, the “royal we.” Rulers could bundle themselves with the state, suggesting an all-encompassing, domineering perspective: I am the state, and I speak for it. Most famously, Queen Victoria uttered “We are not amused” in reference to a ribald joke that had been told at court, though the evidence suggests she was actually speaking for a specific group of women. More relevantly, she also remarked, “We are not interested in the possibilities of defeat” in reference to the Second Boer War at the turn of the century.
In turn, newspaper editorial boards eagerly adopted the phrasing when looking to satirize the ruler confident enough to speak for the whole. (“Hellay,” began an editorial in a 1991 issue of The Guardian, “Since we last spoke last Christmas a great deal has happened … we were bitten by our dog, and there has been talk of taxing us upon our enormous income.”) One famous example involves Margaret Thatcher, who was widely mocked when, upon the birth of her granddaughter, she remarked “We have become a grandmother.” Just whose baby was it, hers or the state’s?
Thatcher’s invocation, which required hasty clarification from her press officers, was undoubtedly hoity-toity and imprecise. Indeed, one of the problems with the “royal we” is that its acceptability is contingent on the speaker’s authority, and on how ready the listener is to take the speaker at face value.
Over time, the “royal we” has made its way from the mouths of Queen Victoria and Margaret Thatcher into our writing. At best, it seems a crutch, while at worst it’s an assumed arrogance. Here’s but one example from The New Yorker’s Sasha Frere-Jones, writing a jeremiad against Jay-Z:
However thick the darkness, we drag ourselves into arguments, up to lecterns, because we have not let go of each other yet. We still think we can fix a thing that shows no sign of ever being fixed.
Just as with Obama’s usage, it’s clear this isn’t a literal case of the royal “we.” (It’s hard to imagine any music writer being that arrogant.) Instead, it’s a rhetorical trick to make the reader say “I guess I do drag myself into the argument despite the thickness of the darkness!” After all, for whom is Frere-Jones speaking with his “we”? Himself, trying to avoid the English-class no-no of using the first person? The New Yorker, with the “we” a formal endorsement of what’s being discussed? Is it even more far-reaching than that, leaping off the screen to presume how the reader is supposed to feel? Without some kind of clarification, there’s really no way to know.
Writing in Personal Pronouns in English Language, English professor Katie Wales notes the irony: “‘We’ itself is often used, out of modesty, for example, to resist the egocentricity of a potential ‘I’; yet an egocentric ‘meaning’ will often be re-asserted.” In hiding the individual author, a consensus opinion is born. No one person thinks this thing; we do. And because the entire reason you’re reading is that you think the writer has something to say, you’re subconsciously agreeing before you’ve even thought otherwise.
Of course, another reasonable take is that there’s nuance in every “we.” Take cultural critic Chuck Klosterman and his newest book, I Wear The Black Hat. Klosterman has made his reputation as an author by positing esoteric pop culture theories that articulate what a lot of people are supposedly thinking; when he uses “we,” he’s both enabling the reader to feel good for agreeing and establishing his intellectual accuracy. By contrast, the reader who doesn’t buy into his theories is more likely to feel aggravated by such presumption, as in this passage regarding The Wire and Breaking Bad, in which Klosterman all but removes the capacity for disagreement:
If we accept that criminal activity is an extension of social forces beyond any person’s control, criminals are judged for their ethics within that sphere; in a way, we stop judging them entirely. We feel for them when they kill, and we understand why it had to happen. We actively want them to get away with murder, because we are on their side.
While Klosterman’s justification (“And I concede that when I write ‘we’ I’m really writing ‘I’—but I don’t think my sentiments fall outside the writers’ intent”) seems like instinctive common sense, it’s not. Or at least, it shouldn’t be. The same goes for Frere-Jones’ invocation, since maybe not everyone feels that beaten down by modern society. This is where the authorial “we” fails: by calcifying a narrative that’s obvious to the author and those inclined to agree with her, but all too aggravating to anyone who isn’t.
TRACING WHEN THE AUTHORIAL “we” gained more usage isn’t easy—none of the linguists I reached out to could provide an answer, and some weren’t even aware it was becoming a trend—but it hasn’t escaped attention. Deadspin’s Tim Marchman and Reuben Fischer-Baum wrote an article gauging the “most pompous sports pundits” by tracking how often each writer used pronouns in their columns, separating the offending usages into “I/Me/Mine” and “We/Our/Us.” While their findings weren’t necessarily surprising to anyone familiar with the world of sportswriting, there was one wrinkle: Older writers like Los Angeles Times columnist Bill Plaschke and the New York Daily News’ Mike Lupica largely avoided personal pronouns. “We have a theory about this: For years, journalistic norms discouraged the use of the first person, even by columnists; immodesty of any kind was frowned upon,” they wrote. “It stands to reason that longtime newspaper columnists who came up in a more self-effacing media age and who have not yet been totally assimilated by the TV Borg would sink to the bottom of the list.”
That said, it’s not like the absence of pronouns saves Plaschke’s opinions from sounding any less pompous than they already do. But in a perverse way, what Plaschke does is more transparent and perhaps more admirable: he presents his opinions as truisms without assuming the rest of us agree, allowing plenty of room for dissent. There’s no invisible arm around the reader’s shoulder, no generalization of every perspective into a monolithic entity. And that’s the most galling thing: If commonplace usage of “we” is a sign of modernity, it’s also a sign of laziness, a rhetorical trick that’s supposed to breed trust in the person using it but that also distracts from an absence of ironclad logic. Queen Victoria didn’t have to convince anyone of her right to speak for everyone (after all, that was kind of her job) but the rest of us (ahem) should be a little more careful.