I wish a happy 50th birthday to the National Endowment for the Humanities. For fifty years, NEH has helped scholars open the human record to new understanding, made our shared past accessible to a broad democratic public, and, in partnership with state councils, kept the humanities alive at the local level in all 50 states, reminding us that history has happened everywhere and bears the imprint of us all.
Fifty years back, this familiar agency was a profoundly new thing, and the meaning of that novelty is my subject today. Let’s remember how it came to be. The agency was officially authorized when President Lyndon Johnson signed legislation creating national endowments for the humanities and the arts in a Rose Garden ceremony on September 29, 1965. This legislation implemented recommendations from a National Commission on the Humanities that had been set in motion in 1963 by three scholarly organizations: the American Council of Learned Societies, the Council of Graduate Schools in the United States, and the United Chapters of Phi Beta Kappa. The commission’s report, published in 1964, speaks to a nation in confident possession of superpower status, a status that brings new risks and new choices.
Global dominance can take two forms, the report explains. One is a dominance of economic might, military power, and technological superiority. Alternatively, together with economic and military supremacy, a nation could exert a different order of power—in the ideals and image of civilization it projects—in which case, other nations would eagerly follow its high lead. Get enough economic and military superiority, and you are at best a muscled-up hegemon; add the humanities, and you could become an aspirational civilization. That’s the opportunity before the United States, and seizing it is not guaranteed. How to ensure this better outcome? What’s needed is help advancing “things of the spirit”—help on the artistic and humanistic side.
As the co-chair of a recent humanities commission, I marvel that when a comparable group sat down in 1963–64, they arrived at a single recommendation: the creation of a federal agency to support humanistic activity, federally funded but with independent decision-making power. President Johnson embraced this proposal in a speech at Brown University in September 1964. Within a year, the legislation had passed with bipartisan support, the bill was signed, and NEH was born.
So the question I ask myself is, How did it happen that a proposal in this exact form was so persuasive to both academic and political leaders in the 1964–65 season? There is a relatively easy answer. The 1964 report is jealously mindful of the creation of the National Science Foundation in the early years of the Cold War, and science envy is the proximate cause for wanting a foundation of our own. But the idea draws on forces and developments reaching well beyond NSF, and I’d like to share the story of how I came to understand them.
Please be patient as I take you on a humanist’s excursion. Last summer—this is how I spent my summer vacation!—I read Sven Beckert’s book Empire of Cotton. Beckert traces the long history of the industrialization of cotton around the globe across five centuries, underlining the dependence of free markets on coerced labor. Once, the labor of slaves on plantations in the American South supplied the cotton first to English and then also to New England mills. Three or four decades after the Civil War, cotton manufacture moved to the Southern states that had once been sites of cotton growing, with women and children cast in laboring roles.
History, the recovery and re-interpretation of the past, is one primary humanistic form. The study of visual culture is another, and images brought me the next stage of my journey. When I got to this point in Beckert’s book, it put me in mind of Lewis Hine’s classic photographs documenting child labor in the North Carolina cotton mills in the early 20th century. These images bring history vividly to life, such that we can see the faces of individual children, see the relation of small bodies and large machines that Beckert had evoked.
These photographs brought something else to mind, this time retrieved from the world of law. Law is nothing if not the humanities applied: an affair of texts and interpretation and arguments over meaning, an attempt to align values and precedents with contemporary situations. Four or five years back, David Levi, the dean of the Duke Law School and a former federal judge, and Jed Purdy, a Duke law professor and political philosopher, asked me to join them in teaching a course. The course was a consideration of how the law had negotiated key moments of national crisis in American history. This was great fun, since it gave me a chance to be back in the classroom, my primary passion, but it was also a lot of work, since I had to master tons of material from the Supreme Court.
Now, one of the cases we worked through was called back to mind by the idea of cotton mills in North Carolina. The case was Hammer v. Dagenhart, decided in 1918. In 1916, Congress had passed a law prohibiting interstate commerce in goods that had been produced with child labor. (The age in question was 14.) A father of two minor sons employed in a mill at Charlotte was upset that his family should lose the children’s earnings, so he sued, arguing that the law exceeded federal authority. The case went all the way to the Supreme Court.
In judging between a federal law and the father of child mill workers, perhaps playmates of the ones photographed by Hine, the Court reasoned as follows. It recognized that by 1918 public opinion had undergone a powerful shift toward viewing child labor as dehumanizing and unethical. It acknowledged that many states had passed laws limiting the practice—North Carolina itself forbade child labor under the age of 12. But the judges could not be persuaded that child labor was a proper subject for federal regulation.
Here’s why. By the Tenth Amendment of the Bill of Rights, the Constitution reserved for the states all powers not delegated to the federal government. Regulation of interstate commerce was one of those delegated powers. But by the analysis of the Court, “the making of goods … [is] not commerce, nor does the fact that these things are to be afterwards shipped or used in interstate commerce make their production a part thereof.” This conclusion arises from a segmenting cast of mind that sees parts as more real than the wholes they form: In this case, it sees the differences of phases of production as far more significant than their continuities. Manufacture is over and done with before cloth leaves the mill to enter trade. Ergo, manufacture is not commerce. Ergo, its practices are not subject to federal legislation. (In his tart dissent in Dagenhart, Oliver Wendell Holmes rolled his eyes at a nation that could figure out how to federally prohibit the sale of alcohol but not the exploitation of children in factories.)
These habits of mind derived from an older world where local things were not experienced as part of larger integrations, and where the federal government’s sphere was correspondingly small. (Remember that only in 1913 was Congress’ right to impose a federal income tax established.) These were once common, mainstream understandings, and they continued their mental hold as the world around them changed. During the Great Depression, the scale and duration of national economic dysfunction made it desperately obvious that local remedies were of no avail. With the ascendancy of Franklin D. Roosevelt and the New Deal, problem after problem was visualized as having a federal solution and so received federal regulation of its own. Hence the birth of federal agencies: the Farm Security Administration, the Civil Works Administration, the Federal Deposit Insurance Corp., the Federal Housing Administration, and many more.
But through Roosevelt’s first presidential term, the Supreme Court continued to balk at these innovations, for reasons wholly familiar to a student of Dagenhart. In the case that ruled the National Industrial Recovery Act unconstitutional, Chief Justice Hughes re-traces the familiar logic. The kosher slaughter of chickens at the Schechter plant in Brooklyn was a localized, self-contained activity. Even if they came from elsewhere, when the chickens reached the slaughterhouse, “the interstate transactions in relation to that poultry then ended.” Since the work involved was not in interstate commerce, it was not subject to federal regulation. Don’t you get it? Hughes all but says to New Deal adherents: I know you are dealing with a damnably difficult problem, but our national system does not permit your solution. “Extraordinary conditions may call for extraordinary remedies…. [But] extraordinary conditions do not create or enlarge constitutional power.”
The 1935–36 court decisions mark one of the great impasses that have defined the history of this country—a moment when a set of rules and understandings perfectly well-established in one cultural world are unable to cope with powerful new facts. An earlier impasse—over the federal government and slavery—was not resolved short of civil war. This one was resolved by a change of understanding that took place between 1936 and 1937.
Roosevelt won an electoral landslide in November 1936 on a scale the country had never seen. The next spring, probably not in response to Roosevelt’s court-packing proposal, the Supreme Court began finding by 5–4 margins that the truth had changed sides. In a March 1937 decision in a minimum-wage case involving chambermaids in Washington, the Court found that the federal government had a far larger power of regulation, with the same Hughes commenting that, although the Depression emergency had not created new constitutional powers, “recent economic experience” has brought heretofore overlooked chains of cause and consequence “into a strong light,” justifying a federal role where none had seemed warranted before. Within a year, the bare majority thinking became a new consensus. (The Dagenhart decision was unanimously overturned in 1941.) From that point on, the U.S. had entered a new normal world where virtually no challenge was outside the federal government’s sphere.
I may seem to have made a long detour, but, to my mind, we cannot understand where the NEH came from without understanding how the sphere of federal authority expanded in the first half of the 20th century. There was no federal role in support of the humanities before this transformation. Without the conceptual transformation forged in the crucible of the Great Depression, it is inconceivable that a national commission could have envisioned a federal solution to the challenge of the humanities in 1964.
When the NEH legislation was adopted, of course, this was not an act by itself. It formed an integral part of the Great Society program, Johnson’s revival and extension of the New Deal amid 1960s American preeminence and prosperity. NEH proposed to do for the nation’s humanistic yearnings what the Economic Opportunity Act (signed in August 1964) did for employment, what the Voting Rights Act (August 1965) did for participation in the political process, what Medicare and Medicaid (July 1965) did for the health needs of the elderly and the poor.
The public roared approval. Johnson was elected in his own right in November 1964 by a popular-vote percentage that exceeded even Roosevelt’s largest. Nor did this support go away when the optimism began to dim. Just as both the Republican and Democratic parties contained progressive wings in the early 20th century, both parties shared a big-government vision at midcentury. Johnson launched NEH with a $2.5 million appropriation. It was under Richard Nixon, then Gerald Ford, then Jimmy Carter, that NEH received major new increments of funding, just as it was under Nixon and his successors that federal environmental programs—another new aspect of the general welfare now enforced through a federal role—saw their greatest growth. There is no understanding where NEH came from without understanding how the American dream of government got big.
That change having been accomplished, the world, of course, did not stand still. Coming forward from the NEH launch to the world of today, three changes are especially salient.
First, that day’s confidence in America’s mission to lead the world has certainly not vanished, but it has grown more complex. I find it striking that the Gulf of Tonkin resolution authorizing military action in Vietnam passed Congress in August 1964, right before Johnson’s embrace of the national endowment idea in September. I entered college at that exact time, and I can testify that few saw the toll the war would take on belief in the government’s integrity and in America’s benevolence abroad. Even after many reversals, American civilization is still unparalleled in its global attractiveness, but this has not functioned as the report envisioned.
A fascinating section of the report spies a new danger Americans are facing in the 1960s, a novel access to superabundant leisure. If the humanities do not fill this leisure with rich materials for “man’s questioning and his need for self-expression,” the 1964 report sternly cautions, “men and women [will] find nothing within themselves but emptiness” and “turn to trivial and narcotic amusements.” Materialism, trivial amusements, and gadgetry define this report’s nightmare of a deficient civilization. But as our culture has continued to hold sway over the global imagination, this has worked through, not in spite of, consumerism, personal technology, and popular entertainment: Think of the stream of popular music, film, video, and iconic brands that have held the world enthralled via iPhones and other American-born devices.
As this suggests, a second great change between NEH’s debut moment and today involves the public standing of the humanities idea. The 1964 report is confident that there is a body of knowledge that connects individuals to a rich heritage and rich domains of existential meaning. This body of knowledge is spread through a model of education with roots in classical antiquity that aims to develop the whole person, what we call the liberal arts. As you know, this confident, unitary vision of the humanities has suffered self-questioning and fragmentation among its adherents in intervening decades, and no longer commands wide public assent. When the Commission on the Humanities spoke, Johnson answered in the idiom of humanistic high-mindedness. When claims for the humanities and the liberal arts are made today, they’re often met with skepticism, if not scorn.
The core conviction of NEH’s launch was that the federal government was the right player to advance the humanities. In a third change, this tenet too has grown uncertain. In New Deal and Great Society historiography, the age of big government is here at last, so happy days are here again. But there was always an alternative that, if eclipsed, never lost its power. In the election of November 1964, the scale of Barry Goldwater’s defeat seemed to confirm that his version of conservatism was a freak development that would never be seen again. Of course, this conservatism had older roots than many at the time cared to note. In the late 1950s and ’60s, Henry David Thoreau, a writer for the first time enjoying vast popularity, gave validation to the civil rights movement through his essay “Civil Disobedience.” But, as is often forgotten, the same works by Thoreau set forth an antistatist gospel. In the opening lines of “Civil Disobedience,” Thoreau repeats the adage “that government is best which governs least” and then offers this improvement: “that government is best which governs not at all.”
Such conservatism came roaring back with Ronald Reagan’s election in 1980, returning to power a philosophy of small government that produced the first large cut in NEH funding in constant dollar terms. (The second, fundamentally unreversed to this day, came with the Republican-led 104th Congress, elected in 1994.) In our time, big-government and small-government forces have reached a long-running political deadlock, with one byproduct, sequestration, inhibiting virtually any increased investment in the discretionary portions of the federal budget, humanities and research funding prominently included.
Put together the waning power of the dream of the humanities and a waning belief in the positive role of government, and you arrive at the most salient novelty in the past 50 years: the decline in the belief in public goods. The Great Society’s heyday was charged with confidence that the quality of life for every American could be improved along every axis: civic, economic, medical, educational, aesthetic, and spiritual; that these improvements would be enacted through federal government activity; and that if there was a cost to the public in taxes, that was a small price to pay for the value derived. The idiom of that day was one of sacrifices for the larger good. “Ask not what your country can do for you. Ask what you can do for your country.”
Fifty years down the road, that way of speaking sounds quite dated. Today’s public has a seriously thinned-down concept of what social goods are worth having beyond personal happiness, and an even weaker notion of why it should incur any sacrifice to fund such abstractions. The revealing figure here is not the decline of NEH funding but the decline of state funding for public higher education. The Center on Budget and Policy Priorities has documented that state funding per student for public university systems, adjusted for inflation, has been slashed since the 2008 recession, with 15 states reporting cuts of 25 percent or more and 16 additional states with cuts north of 20 percent. Tuition costs to students and their families have risen to fill the gap. In the days when higher education was thought of as a public good worth investment from all taxpayers because an educated population increased the quality of life for all, students and their families paid around one-quarter of public higher education expenditure. Today, the personal share is nearly 50 percent and shows no sign of falling.
So, how is a believer in the humanities to proceed at this time? This cause is more, not less, urgent because of the changes I have described. But there is no use ignoring the new facts we have to contend with. Our need is to find out how to advance this cause in the world that we actually inhabit.
Let me say what this means in tactical terms—and if I do not sugarcoat things, it is because, like Emerson, “I have set my heart on honesty in this chapter.” First, the federal government is not the prime target for this moment’s appeals. The American Academy of Arts and Sciences commission that I co-chaired was authorized by two congressional leaders from each party, Lamar Alexander and Mark Warner in the Senate and David Price and Tom Petri in the House. They have been thoughtful advocates, and they spoke compellingly when our report was launched. It is important to have such voices on our side, and to get as much support as we can possibly win for NEH. But major new federal initiatives are unlikely now, and though we might hope for restoration (for instance) of Title VI funds for foreign languages and cultures, the things that most need doing are not federal affairs.
In truth, the humanities are sustained by a continuum of institutions, very few of which are federally funded: elementary schools, high schools, colleges, universities, and community colleges; libraries, museums, performance halls, and other places of artistic presentation; local history centers and book clubs, and a thousand others. Since World War II, science has inevitably looked to Washington, because government alone can supply funding on the scale and with the long-term horizon that scientific research requires. But the humanities were never principally a federally supported activity, and their advocacy needs to be decentralized.
If there is one single thing humanities lovers need to press for at this time, it’s a stronger, more equitably distributed foundation of elemental literacy, the root of democracy and every humanistic power. President Johnson said in the Great Society speech: “Our society will not be great until every young mind is set free to scan the farthest reaches of thought and imagination.” “We are still far from that goal,” he added—and we are equally far from that goal today. But funds to attract and retain great K–12 teachers are still overwhelmingly state and local appropriations. In North Carolina, 66 percent of the public instruction budget comes from the state, 26 percent from localities, and a mere 8 percent from federal sources. So advocates need to work nearer to home.
Further, if championing the humanities is a multifront battle now, we also need to think a great deal harder about how persuasion works. In our long time of troubles, too many humanists have been guilty of making our case the way some travelers repeat an English phrase louder and louder to an uncomprehending foreigner, as if with a little more volume, the imbecile is bound to understand. But as rhetoric (one of the oldest humanistic disciplines) might have taught us, persuasion does not involve re-stating what I already believe. It involves engaging with others, entering into their different frame of reference, then thinking how I could make my point in terms that make sense to another. This challenge should not be beyond our collective wits.
From my experience, it helps to recognize that there are different bands of audiences, each of which might respond to a different approach. First, there are the believers: not just professional humanists, but lovers of art and visitors to art galleries, lovers of music and attendees of musical performance, lovers of reading, history buffs. This is not a small group. If every one of these people spoke up for their passion, we might gain massive public traction. If every museumgoer actively advocated for art exposure for children in public schools, if every book club member spoke to a school board member or state legislator about the human needs satisfied by this means, we would have a chorus for the humanities the U.S. has not heard in years.
In a second ring are what we might term lapsed believers. One of the most frustrating features of recent years has been the number of highly educated people, some paying small fortunes to send their children to liberal arts colleges, who say that such things were great back in the day, but we can’t let people waste time with liberal learning now: The only education of value is the one that lands you a job. I used to find this perverse, but I’ve come to understand that it’s just a repetition of sayings heard so many times in “serious” media outlets that they have entered many minds as received ideas. If I ask such a person, “Did you find that it was the directly instrumental things you studied that led to your later success?” the answer is invariably no. If I then ask, “Can you name a single successful person who had just one job-related skill, and did not instead start with a broad education that opened the mind in varied and unexpected ways?” the answer is usually no again. (Even Steve Jobs, the poster child for dropouts, studied calligraphy in college with decisive effect on Apple design aesthetics.) A lot of people beyond active converts know the value of the humanities but have forgotten. We will do ourselves a favor if, with patience and good humor, we remind them of what they already know and love.
A third ring, made up of those who don’t have a latent belief to be re-activated, needs a different approach. One thing humanists should be eager to do nowadays is to connect with people working on other problems who can be made to feel what the humanities have to contribute. In a recent project on pandemics at Duke, it was not surprising that ethicists from philosophy departments were of crucial value, since a key question in pandemics is how to allocate scarce medical resources. Less predictable was the help a literature professor could supply by explicating the role of narrative in pandemics: the storylines that get established, then giddily revised, as the emergency takes shape. It is hard to see how we will make new friends and allies if we fail to reach out.
Another thing I have seen work with the unbaptized is what I have learned to call “third-party validators.” My humanities commission, like the one in 1964, contained people of visibility and accomplishment who were not in the academy or professional arts. Their voices carried special weight in the rollout because they were not known to be pre-sold. Jim McNerney, chairman and former CEO of Boeing, has said that while his company principally looks for engineers, no one will rise beyond a certain level if they don’t also have other skills: skills in communication and cross-cultural sensitivity, the products of humanistic training. Karl Eikenberry, a military commander in Afghanistan and then U.S. ambassador there from 2009 to 2011, has said that military weapons will never be strong enough to solve global conflicts. At least as critical to national security are the understanding of foreign histories, foreign languages, foreign religions, foreign cultures—humanities subjects par excellence. We need to keep recruiting friends like this.
Then let’s imagine a really hard challenge: people for whom terms like “the humanities” or “the liberal arts” carry not the slightest residual meaning, are intimidating, or off-putting, or even just a bore. What to do with the hard nuts to crack? I have two thoughts. First, the humanities can take highly evolved forms, but they are rooted in our most fundamental human powers and needs. When other lines of appeal aren’t available, we need to re-connect with the forms that are familiar to people and start from there. A Duke student I know taught debate last summer to ninth graders in the Mississippi Delta. When he asked why debate might matter, one student was quick with a reply: Debate was the key to the antislavery and civil rights struggles; plus if you can argue well, you can persuade your parents. That’s a great base to build on.
And if you couldn’t get even that much of a purchase? As a last resort, we could just subject someone to the power of the experience. For instance, take them to the hip-hop musical Hamilton. The composer-performer Lin-Manuel Miranda read Ron Chernow’s biography of Alexander Hamilton on vacation and saw how he could translate Hamilton’s transcultural itinerary into a modern, transcultural music-and-dance idiom. Miranda has brought history to life and brought added richness to the performing arts by fusing them with the historical past. But there’s no need to know or care about any of that in advance. Anyone who has heard “Hey, yo, I’m just like my country / I’m young, scrappy and hungry” will have lived the energy of the humanities. Later, there might be a chance to find that this creation has complex cultural sources that can be analyzed and might even deserve support. Start with that point and you’ll get nowhere. Start with the experience and you’ll have better luck.
The start of this discussion might have sounded like a familiar story of humanistic decline, but let me say bluntly: It is not helping us to cling to the myth of the lost cause. We live at a particular moment of social history. Other times may have been more auspicious; if so, lucky them. But there’s no point pining for what we do not have. The only thing that will move us forward is to understand where we are, to assess the challenges clearly, to spot opportunities with imagination, and to use all our intelligence, passion, and ingenuity to figure out how to restore the perception of a value that has grown dim, to our collective cost.
At the end of The Prelude, Wordsworth said to Coleridge: “What we have loved / Others will love, and we may teach them how.” We care about what we value because the ability to feel that value was nurtured in us by teachers of many sorts. Let’s have the confidence to teach the humanities in that sense. People crave it more than we imagine.
This story originally appeared in Humanities as “On the Fate and Fortunes of Public Goods” and is re-published here under a Creative Commons license.