Back in 1970, when the United States belatedly passed its first comprehensive federal laws to protect workers—the Occupational Safety and Health Act and the Mine Safety and Health Act—work in America was often downright dangerous. More than 10,000 workers died from work injuries every year, while over five million recordable injuries occurred, many resulting in lost work time and disability. Worse, poor control of myriad toxic chemicals such as lead, asbestos, and benzene resulted in an estimated 100,000 excess deaths annually among workers, heavily concentrated in manufacturing, mining, construction, and agriculture.
The new laws, bolstered by rampant litigation, contributed to dramatic declines in serious injuries, as well as the marked reduction or elimination of the worst offending toxins. Of course, this process was aided immeasurably by the transformation of the economy, as technology and globalization resulted in the export of many of the worst jobs and their attendant hazards. Where once over a third of the workforce was engaged in manual work, such activity has dwindled to a mere 10 to 15 percent.
But this rosy picture obscures a very different reality about modern work. If recordable injuries and occupational diseases are declining, why is disability rising, not just among those reaching retirement age but far earlier in life? Workers’ compensation schemes, once largely focused on the treatment and compensation of workplace injuries, have been besieged by claims for stress and chronic musculoskeletal disorders, arising not just from the sectors of previous concern but increasingly from the rapidly expanding service sector—finance and health care, education and entertainment—previously imagined as “safe work.” Perhaps most worrisome, few in the workforce appear fit enough to continue working much beyond the traditional retirement age, despite lengthening life expectancy and active discussions of raising the Social Security age. Admittedly, many of these phenomena reflect not just the health consequences of work but also changes in social expectations and preferences, as was evident when disability claims rose rapidly during the Great Recession in response to job scarcity. But underlying it all is a very large burden of sub-optimal health.
So what has changed in 40 years, besides the obvious transition from largely physical to mental work, from manufacturing to service work? Back in 1970, most Americans who worked for pay did so outside the home, by daylight, at a fixed location, with regular, largely weekday hours; work at night or on weekends was viewed as overtime and compensated as such. These benefits had been conferred on much of the workforce by union battles earlier in the century, many of the “victories” enshrined in labor law. And although a minority of American workers were actually covered by union contracts, almost all workers enjoyed the “virtual” contract implicit in the idea that if you did your job acceptably and played by the rules, you could expect to continue working for the same employer unless there was an economic shock or you chose to move on. Most large employers and many smaller ones offered a defined-benefit pension plan, such that a typical worker could expect to retire at or before 65 with sufficient resources, factoring in Social Security, to maintain his earlier lifestyle. A substantial fraction of the workforce spent an entire career at essentially one job.
No longer. Beyond university professors and a few remaining trade unionists, virtually no worker enjoys either job security or long-term financial security beyond what they can put away themselves, albeit with matching contributions at many workplaces. Many jobs instead are basically “contract work,” a perverse term that in practice means “no-contract work.” Work hours, locations, and shifts are far more diverse and subject to constant change; in many sectors, the availability of smartphones and other devices has made responsiveness to work intrusions a 24/7 expectation. This new reality appears to have tilted the place of work in one’s life in an unanticipated direction.
Where work, however demanding and dangerous, once served as ballast and support, the context for long-term relationships and the appraisal of self-worth, it has morphed for many into an exchange system in which service is traded for a defined economic return, with no expectation by either employer or employee of anything more. It is not hard to imagine how such work would wreak havoc on an employee’s diet, exercise, and sleep, or on efforts to adhere to any form of medical regimen. Perhaps less immediately obvious is the impact on social relationships, family, personal security, and self-esteem. And of course, absent the incentives generated by long-term interdependence, employers have scant reason to make the work experience better beyond whatever it takes to attract workers if and when they are needed. With globalization and technological transformation showing no sign of abating, and the union movement on life support, it is hard to be sanguine that things will soon get better. My fear is that employment will become increasingly precarious, in every sense of the word.
For the Future of Work, a special project from the Center for Advanced Study in the Behavioral Sciences at Stanford University, business and labor leaders, social scientists, technology visionaries, activists, and journalists weigh in on the most consequential changes in the workplace, and what anxieties and possibilities they might produce.