The Future of Work: The Rise and Fall of the Job

The latest entry in a special project in which business and labor leaders, social scientists, technology visionaries, activists, and journalists weigh in on the most consequential changes in the workplace.

Since 2000, rising American productivity has become de-coupled from job growth: Despite sizzling profits and the ever-receding horizon of a brighter future for all—just on the other side of endless “disruption”—the celebrity industries of Silicon Valley and Wall Street are hollowing out middle-class jobs. When anything at all is filling the void, too often it is the cruelly misnamed sharing economy or hourly work for minimum wage, both greased with record levels of household debt.

Bethany Moreton is a professor of history at Dartmouth College and a founder of Freedom University for undocumented students banned from Georgia’s public campuses.


After a century of insisting that the secure, benefits-laden job was the frictionless meritocratic means of rewarding society’s truly valuable work and workers, today we find that half the remaining jobs are in danger of being automated out of existence. Of the 10 fastest-growing job categories, eight require less than a college degree. Over 40 percent of college graduates are working in low-wage jobs, and it isn’t in order to launch their start-up from the garage after the swing shift at Starbucks: The rate of small-business ownership among the under-30 crowd is at its lowest in a generation.

In short, the same tide that swept millions of Americans out to precarity over the 20th century is lapping at suburban doorsteps in the 21st; like that other inconvenient truth, this one can no longer be outsourced to somebody else’s kids. How few “real jobs” have to remain before we can admit that most of the world’s work has always been done under other titles, by different rules—and so take this opportunity to re-consider how we organize and reward it?

Indeed, if there is anything to be celebrated in the current jobless recovery, it is this opportunity at last to assess the job as a social contrivance, not a timeless feature of the physical universe. A dose of historical perspective helps: the job, it turns out, has only recently been considered fit for polite company, let alone transformed into one of the chief desiderata of public life. In contrast to its more venerable cousins “work” and “labor,” “job” is the red-headed stepchild in the family of human action: Prior to the 20th century, in English the term connoted fragmented, poorly executed work—odd jobs, piece-work, chance employment.

By the 17th century’s financial revolution, it also carried the moral taint of chicanery: a “jobber” dealt in wholesale securities on the nascent London Stock Exchange, the classic middleman—implicitly unscrupulous and parasitic—who connected brokers beyond the view of the public. The job knew its place: Samuel Johnson defined it in 1755 as “a low mean lucrative busy affair; petty piddling work.” And yet today the job is mourned in elegiac tones, as it flounders off into obsolescence like the exhausted polar bear swimming after a retreating ice floe.

That such a disreputable character could over the last century transform itself into America’s sweetheart, among the most sought-after rewards of citizenship and badges of social worth, indicates a joint process of rehabilitation and degradation: On the one hand, the paradigmatic job got a make-over, emerging from the national trauma of the Great Depression tricked out with such status-enhancing accoutrements as unemployment insurance, pensions, wage and hour protections, and legally enforceable health and safety standards, as well as cultural prestige as the arsenal of democracy and the wellspring of national prosperity.

On the other hand, the process of re-branding the job from sow’s ear to silk purse demanded that other forms of work suffer deliberate exclusion. The Polish operative on the Detroit assembly line made concrete gains at the expense of the Filipino fruit-packer in Fresno and the African-descended domestic worker in Atlanta. The same legislative genealogy split public work from the intimate labors of care, ensuring that the work of re-producing the workforce—the labor of feeding, healing, cleaning, rearing, and tending—would fall into a twilight zone between moral obligation and biological impulse.

Outside the charmed circle of the white- or blue-collar job, many labor markets were free in name only, structured by white supremacist terrorism, the threat of deportation, and entirely legal forms of segmentation, harassment, and discrimination. As one white Southern senator argued against federal job training in 1967: “You know somebody has to do just the ordinary everyday work. Now if they don’t do it, we have to do it.”

In short, the rehabilitation of the job during the 20th century tacitly acknowledged that waged work could no longer pass itself off as a temporary way-station en route to noble proprietorship for the virtuous yeoman: Most men in the Packard plant weren’t going to wind up as Thomas Jefferson’s “cultivators of the earth … the most valuable citizens,” or even running their own Red Lobster franchise. So the very definition of citizen value would change to embrace them, in part by pretending that other kinds of work and workers were less valuable to the republic. In return for the indignities of mass industrial, commercial, and managerial labor, then, we sweetened the bitter pill of jobbing with the privileges of exclusivity, at a terrible cost for the working majority.

But by elevating certain forms of work to the status of “real jobs” and rewarding the minority who performed them as job-holders rather than as contributing members of communities and families, we also set in motion the process of incentivizing job destruction: If the tasks formerly contained within a job could be performed more cheaply by a Third World export processing zone, or a sweatshop full of undocumented migrants, or three teenaged part-timers or two welfare-to-work moms or one prison inmate, graduate student, or unpaid intern—or the customer herself, armed with a smartphone—why on Earth should the expensive, cumbersome container endure? Why not break every job into its component parts and sell off the work to the lowest bidder, the employee who will make the fewest demands on the social fabric?

By now this logic has chewed its way, first, through the work that the job pushed into the shadows decades ago—the intimate labors of care and maintenance—and then through so many of the jobs that had their day in the sun in the last century—those manly arts of assembling and erecting. And now here it comes for the remaining jobs: the robot that took over the shop floor is today the algorithm that threatens to disrupt the tasks still grouped into jobs. For the shrinking portion of full-time, benefits-eligible employees who can still claim a modicum of scope for their skill, discretion, and creativity at work, this is very bad news; for most of their fellow Americans, it isn’t news at all.

Is there even a grimly ironic silver lining to this bleak employment picture? If so, it lies here: This time, the architects of the economy have engineered their own children out of a job. The well-credentialed chickens have come home to roost in the basement rec room for the foreseeable future, and we have ourselves to blame. It should not have taken this end-game to awaken those in charge of the economy to its irrational unsustainability and its ruthless ranking of humanity, but at least now there is nowhere left to hide. Surely we can get down to the business of valuing all necessary work, no matter who does it.

For the Future of Work, a special project from the Center for Advanced Study in the Behavioral Sciences at Stanford University, business and labor leaders, social scientists, technology visionaries, activists, and journalists weigh in on the most consequential changes in the workplace, and what anxieties and possibilities they might produce.
