Hazards of the Job: From Industrial Disease to Environmental Health Science
Ebook · 579 pages · 8 hours


About this ebook

Hazards of the Job explores the roots of modern environmentalism in the early-twentieth-century United States. It was in the workplace of this era, argues Christopher Sellers, that our contemporary understanding of environmental health dangers first took shape. At the crossroads where medicine and science met business, labor, and the state, industrial hygiene became a crucible for molding midcentury notions of corporate interest and professional disinterest as well as environmental concepts of the 'normal' and the 'natural.' The evolution of industrial hygiene illuminates how powerfully battles over knowledge and objectivity could reverberate in American society: new ways of establishing cause and effect begat new predicaments in medicine, law, economics, politics, and ethics, even as they enhanced the potential for environmental control. From the 1910s through the 1930s, as Sellers shows, industrial hygiene investigators fashioned a professional culture that gained the confidence of corporations, unions, and a broader public. As the hygienists moved beyond the workplace, this microenvironment prefigured their understanding of the environment at large. Transforming themselves into linchpins of science-based production and modern consumerism, they also laid the groundwork for many controversies to come.

Language: English
Release date: Nov 9, 2000
ISBN: 9780807864456
Author

Christopher C. Sellers

CHRISTOPHER C. SELLERS is professor of history at Stony Brook University. He is the author or coauthor of Hazards of the Job and Crabgrass Crucible and coeditor of Dangerous Trade and Landscapes of Exposure, among other publications. He is the recipient of numerous awards, grants, and fellowships, including those from the National Science Foundation, the National Humanities Center, and the National Library of Medicine. He lives in Stony Brook, New York.



    Prologue: A Source for Silent Spring

    Few more famous or influential openings have graced the pages of modern American letters than that penned by Rachel Carson. "There once was a town in the heart of America," began her Silent Spring, "where all life seemed to live in harmony with its surroundings." What gripped the millions of people who read this book was not this agrarian idyll itself, with its conventional farm and small-town imagery, so much as the strange blight she then invoked. Fish vanished, pigs and hens failed to breed, and birds perished into silence. The people themselves suffered and died. After conjuring up this mysterious plague as a seeming fact, Carson disarmingly revealed her hand. No community, she reassured her readers, had endured all of these calamities—at least not yet. But neither had she merely invented the story. Each event, each experience of suffering, disease, or death had actually happened, though in different times and places. A few locales had even met with several of these misfortunes. "A grim specter has crept among us almost unnoticed," she warned, and "this tragedy may easily become a stark reality we shall all know."¹

    What was this grim specter that Carson devoted the ensuing pages to elaborating? She had her own explanatory tale of origin: it had entered into history through the massive production of organic pesticides in the years after World War II. These substances, tailored by scientists for enhanced biological potency, had been put to such wide and indiscriminate use that they now threatened "every human being … from the moment of conception until death."² But Carson’s own historical account broached additional questions for which she provided no answer. Among them, how could the toxic tinkerings of a few scientists isolated in their laboratories have culminated in so seemingly vast a threat? The mystery deepens once we realize that the warning signs had long been there, in the damage that laboratory-derived chemical processes had wrought on workers once transferred from the test tube to the assembly line. Carson’s grim specter had first materialized in the turn-of-the-century American workplace.

    There, the first Americans had sickened and died from the bitter fruits of industry that Carson later invoked on a broader stage. They succumbed to ailments they knew to be caused by the stuff of production and to others whose industrial origin they and their doctors could only guess. It was in this time, too, that questions of corporate responsibility for industrial maladies first intruded with undeniable urgency. The battle lines that then formed, the conflicts that ensued between employers, workers, and experts, prefigured those that would explode in the wake of Carson’s book.

    Not least among her debts to this earlier time and place, a cultural resource born in the early-twentieth-century factory provided much of the fabric out of which Carson wove her grim specter. She had neither seen nor heard nor smelled her tide of chemicals, nor had she personally witnessed any of the disasters she wove together in her cautionary opening—except the unexplained cancer from which she was dying. Rather, she relied for the most part on the writings and memories of contemporary scientists: members of communities whose special methods and language had rendered these substances and their effects perceptible.³ Even as her own ecological habits of mind proved crucial to her synthesis, not ecology itself but the sciences of human health supplied the core of her argument: "I am impressed [she wrote her editor] by the fact that the evidence on this particular point outweighs by far, in sheer bulk and also significance, any other aspect of the problem."⁴ One discipline in particular played an originative role. Some of her most important informants in the health sciences, such as Wilhelm Hueper, head of the Environmental Cancer Section of the National Cancer Institute, had begun their careers by studying afflictions of workers. Collectively, these researchers drew heavily on the terms and techniques of an enterprise that coalesced between the 1910s and the 1930s known as industrial hygiene.

    Though some scientists and physicians did consider the impact of industrial chemicals beyond the workplace in this period, it was within and through industrial hygiene that the study of environmental health acquired its modern cast.⁵ Industrial hygienists became the first group of health professionals in the United States to concentrate on industrial chemicals and to embrace quantitative, experimental methods for studying and controlling them. They were the first to make regular use of environmental and chemical measurements, the first to tabulate lists of threshold concentration levels, and the first to devise the kinds of precise delineations between the normal and abnormal that underlie today’s environmental law and policy, as well as its science. In their confrontation with the microenvironment of the factory, they concocted what are arguably our most important means for regulating the environment as a whole.

    Along with other more explored roots, the origins of modern environmental health science thus lead in an opposite direction from that usually taken by environmental historians: not toward the farm, the wilderness, the frontier, or even the urban park, but into a setting at the heart of industrializing America.⁶ As analysts of capitalism as diverse as Karl Marx and Joseph Schumpeter have recognized, the workplace and its denizens often bore the earliest brunt of the new rounds of creative destruction by which capitalists transformed production to take advantage of new or expanding markets.⁷ The same was true of the large-scale American organic chemical industry that arose during World War I. Around this time, in this and other industries undergoing similar changes, along with others plagued by more long-standing hazards, the pioneers of an American industrial hygiene discovered their earliest opportunities for scientific enterprise.

    The very absence of nonhuman life in the early-twentieth-century workplace, along with its intensified potential for human toxicity, made it a fitting template for the hygienists’ innovative scrutiny of the physicochemical interplay between humans and their environment. Indeed, in its historicity, its human contrivance, and its resulting susceptibility to radical change, this environment mirrored the nature devoid of equilibrium or balance that today’s ecologists have elaborated.⁸ At the same time, our historical understanding of the workplace itself remains immersed in political, social, cultural, economic, and technological terms that seem to leave little room for the biological.

    Yet in the workplace, as in all human places, nature, too, resided. It became manifest not only in the machinery and raw materials that made up the means of production, but also in the bodies of the workers who applied their labor power. Many physicians, workers, and employers by the late nineteenth century recognized a material interaction between workers’ bodies and their surroundings but understood it in highly diverse, localized, and contradictory terms. By separating out one aspect of this historicized nature of the workplace in particular—the causal links between chemical and physical working conditions and worker physiology—the hygienists aimed to forge a new potential for certainty, generality, and agreement about the biological impact of the industrial habitat on its denizens. In a setting where all that is solid seemed to be melt[ing] into air, they moved to provide that sober sense about the real conditions of life that Marx had prophesied would arise, though in a guise he did not anticipate.

    My own project of recovering this biological dimension to the workplace’s past has been fortified and sharpened by recent work in environmental history. A growing contingent in this field, including Robert Gottlieb, Andrew Hurley, Arthur McEvoy, Martin Melosi, Christine Rosen, Ted Steinberg, and Joel Tarr, has trained its sights on those historical contexts most thoroughly transformed by human activity, such as the city or the corporation.⁹ William Cronon’s recent work suggests that the persistent neglect of nature in our history is itself a historical artifact: in a modern capitalist economy, devastating exploitation of the natural world could take place on a frontier at an ever greater remove from most humans’ experience.¹⁰

    No subject begs more loudly for recovery of its ecological dimensions than the history of the modern workplace. At the same time, the would-be environmental historian of this locale confronts barriers at least as confounding as the geographic ones elucidated by Cronon. Among them, long-accepted narratives of the workplace take many of its technological, social, political, and legal constituents as historical inevitabilities—a point that Arthur McEvoy has recently explored.¹¹ Even if we penetrate beyond these ideological blinders, a core epistemological dilemma persists: then as now, connections between workplace causes and their bodily effects often remained frustratingly obscure, remote, and difficult to establish. Though much of the havoc that production wrought on workers’ bodies was hard to miss, its less obvious manifestations often remained as invisible as the destruction of distant forests or prairies.

    Earlier as well as more recent waves of historical writing have opened one way of restoring this biological level to our understanding of workplace history: through attention to its discovery by the industrial hygienists. Writing at a time when this field had attained new heights of influence, Henry Sigerist, George Rosen, and Ludwig Teleky saw the hygienists in unproblematically positive terms, as heirs of a centuries-long English and European tradition who, along with their contemporaries abroad, had finally placed occupational health on a scientific basis.¹² More recent historians have delved into the complex influences on twentieth-century industrial hygiene’s shape on this side of the Atlantic. Though some, like Jacqueline Corn, have extended this sympathetic perspective, most of these latter-day historians, especially David Rosner, Gerald Markowitz, Alan Derickson, and William Graebner, have turned a more skeptical eye to the industrial hygienists’ claims to expertise.¹³ Whether viewing the hygienists through a progressivist or a populist lens, however, these historians have collectively illuminated the formative importance of the period between 1900 and 1940 for American industrial hygiene. Moreover, their historiographic dissension echoes the difficulty that industrial hygiene’s pioneers faced in detecting hazards as well as securing the confidence of both workers and employers.

    White middle-class professionals, most of them male but some female, occupy center stage in this story; corporate managers and owners, workers, and government officials round out the human actors. But such a tale also requires a role for the dusts and chemical fumes, the bodily processes and pathologies, on which the industrial hygienists forged their new methods and claims to expertise.

    Reckoning with this material environment as a historical actor returns us to the often subtle, less-than-obvious character of these ailments and their causes. The oldest and most widespread of these diseases, such as lead poisoning, could as easily deceive physicians as lay people, whereas newer and rarer maladies remained as invisible to the casual observer as the pesticide effects that Rachel Carson hunted down through scattered scientific citations. Especially prior to the advent of an American industrial hygiene but even far into its maturation, historical evidence remains dispersed through numerous types of sources and varies widely in nomenclature, standards, and quality. Complicating this task are the diverse interpretations that arose among contemporaries about the cause and extent of these ailments, often fueled by the bitter collisions within the workplace itself. The deep imprint of class conflict on these debates may tempt the historian to set aside questions of material environment altogether and to interpret industrial hygiene strictly in the social and economic terms through which workplace clashes have more traditionally been understood.¹⁴

    But this approach renders hollow any historical judgment on the primary means by which the hygienists staked their claims to expertise. The pioneer industrial hygienists developed new methods that, they claimed, could better sort out those ailments actually caused by the workplace than could the means available to their nineteenth-century forerunners and to lay managers and workers.¹⁵ Were I to suspend my own judgments about the material conditions the hygienists studied, I would deprive myself of valuable historical grounds for appraising their innovative ways of distinguishing occupational from nonwork-related maladies.¹⁶

    I have thus aimed at my own assessments of the hazards the hygienists investigated. For all their difficulties, primary sources have furnished the strongest clues: hospital case records, reports by state factory inspectors and public health officers, and the occasional testimony of workers, company owners or managers, and others provide ample basis for historical reconstruction from the late nineteenth century onward. In reaching my conclusions, I have tried to avoid rigid epistemological dogma, including the hygienists’ own. On the one hand, I have accepted their contentions that more kinds of information improved their ability to sort out workplace causes of disease. Combining shopfloor surveys with detailed clinical information about workers, for instance, usually produced a more persuasive account of which maladies were occupationally related than did either form of information alone. On the other hand, the quantitative precision of clinical or environmental measurements did not necessarily outweigh less formal claims on the basis of undocumented experience.

    I have also turned in a limited way to what today’s scientists and practitioners believe. To develop a clearer sense of when pre-World War II industrial hygiene researchers successfully grappled with occupational causes of disease—and when they did not—I have compared theirs with more recent knowledge and expertise. My reliance entails a certain presentism, and let me be the first to acknowledge that future changes in our scientific understanding may render my conclusions suspect. Still, I have found present-day science to illuminate more than it obscures about the history of occupational disease research, because of the surer historical focus it allows on more elusive environmental pathologies and their causes.

    If shaped in part by this material stratum encompassing human physiology and its physical surroundings, the hygienists’ efforts were also driven and molded by the human actors better known to workplace history. The owners and managers of the corporations that arose in the late nineteenth and early twentieth centuries comprised a most crucial audience. They were the ones who established, oversaw, and maintained the workplaces where these hazards gave rise to their associated ailments. The hygienists had to force or persuade these employers to give up on an accumulated informal knowledge about workplace disease, culled through decades of industrial experience, and to embrace the purportedly superior wisdom of their own more scientific approach. Owners’ and managers’ enthusiasm for doing so was conditioned by the overall role that they accorded these diseases in the calculus of cost, profit, and cooperation that constituted their firm’s economic rationality.

    Workers, too, occupy a critical place in this story, as the ones who experienced these ailments and their immediate consequences. The hygienists had to inspire worker cooperation to gain direct information about the clinical impact of a workplace—or else they had to find a substitute way of testing health effects. Workers influenced this budding science not just through individual choices of complicity or resistance, but through their rising organization and militancy. Time and again, an energized labor movement unsettled employers’ assumptions about whether they were treating their workers fairly and catalyzed new legislative and judicial foundations for tending to worker health. Professional groups, the industrial hygiene researchers not least among them, stepped into the breach that labor activism had pried open by promising employers new means for stemming tides of labor unrest.

    Finally, there were the professionals themselves who turned to crafting this new expertise. Social scientists and nonprofessional reformers initiated this endeavor, but other professionals trained in the natural sciences—physicians, chemists, and engineers—soon took over. Though industrial hygiene evolved into an interdisciplinary and collaborative enterprise, and though engineers became increasingly central to the field, I focus here on the physicians. From their profession and its related sciences, industrial hygiene acquired most of its early technical and intellectual repertoire, beginning with the pivotal notion of occupational disease itself.

    The pioneers in this field confronted numerous challenges. From the methods and information available to them, they had to piece together an approach that would give them greater understanding and control than other professionals and lay people over the effects of the workplace on workers’ bodies. Here, they faced the choice, among others, of which diseases to study. Long-standing industrial diseases like lead poisoning had received the most attention from British and European researchers and from the hygienists’ American predecessors. Yet these were precisely the same maladies that lay company owners and managers believed they already knew how to identify and handle. Hazards associated with new chemicals and processes, on the other hand, posed uncertainties that corporate officials were more likely to acknowledge; these unknown dangers also surfaced most often in science-based industries that had already become accustomed to relying on professional experts. Industrial hygienists’ choices of methods thus remained inseparable from their choices not only of disease but also of audience.

    Questions of audience posed further alternatives. Throughout this period researchers had little or no legal power to enforce changes in the workplace. For their studies to have any effect, they had to rely on less coercive modes of authority. At every stage of industrial hygiene’s development, each investigator had to decide whom he or she should best try to influence. Would theirs be a public knowledge, broadly accessible to journalists, a lay public, and legislators? Or should it remain a private knowledge, available only to those corporate officials who invited into their factories the investigators’ critical gaze? Choice of either extreme brought its own set of risks. Too public a knowledge threatened future cooperation with company officials and could generate new laws and juridical interventions that many industrial hygienists took to be ineffective or unwise. Too private a knowledge, on the other hand, deprived investigators of further leverage if corporate motives clashed with industrial hygiene’s imperatives. It could even subtly tilt the thrust of their science in favor of employers over employees.

    By attending to all these actors, I mean to trace how occupational disease was manufactured twice over in this country: once by the manufacturers in workers’ bodies and again by the hygienists’ professional culture. Industrial hygiene thereby emerged at once as a new branch of medicine and public health and, at least potentially, as a new extension of the managerial hand. In this doubleness lies the crux of my story. For like the workers they studied, scientific practitioners of industrial hygiene never quite allowed themselves to become mere instruments of corporate profit. Instead, they found ways of maintaining an autonomy from their corporate clientele, even as the fates of the two groups became increasingly intertwined. This autonomy, like the shopfloor control that was slipping out of the hands of wageworkers in many industries, hinged on their claims to a special kind of knowledge.

    To tell this tale is thus to foreground the centrality and importance to twentieth-century workplace history of knowledge claims themselves—in this case, the conflicting representations of environmental biology. Following up on a problem posed by analysts of scientific professionalism from the philosopher Michel Foucault to the sociologist Andrew Abbott, I aim here not so much at a comprehensive history of industrial hygiene as at an account of how scientific innovations transformed this arena into what Abbott terms a jurisdiction for professional practice.¹⁷ I have centered my narrative around the efforts of a few individuals to introduce new, more or less successful ways of distinguishing occupational causes of disease onto the American scene. At every step of the way, theirs were dramas of knowledge: who should produce it, what shape it should take, how it should be used, and who should use it.

    As it turns out, far more was at stake in these dramas than the careers of industrial hygienists themselves or industrial hygiene as a profession. Questions about knowledge ineluctably engaged questions about responsibility—that of the industrial hygienist as well as of workers, private or company practitioners, engineers, government officials, a lay public, and corporate owners and managers. By broaching new dilemmas over who was at fault, the hygienists’ innovations in the realm of knowledge helped reconfigure notions about economic and political interest in the society at large.

    This story thereby opens up a new perspective on the transformations of the professions and the economy from the late nineteenth into the twentieth century that some historians have dubbed the emergence of an organizational society.¹⁸ Work experiences, I mean to show, molded and shaped this emerging social order in ways that extended far beyond the labor movement and collective bargaining. Industrial hygienists, as they addressed largely upper- and middle-class anxieties about wageworkers, emerged as exemplars and unheralded bulwarks for what late-twentieth-century commentators have variously designated as a Third Class of workers or a new middle class: those white-collar professionals whose roles revolved ever more tightly around their capacities for producing, reproducing, and interpreting knowledge.¹⁹ As we consider why and how industrial hygiene took shape through interactions between medical, corporate, and governmental elites, several cautions are in order.

    First, as imperative as the turns to science and to a new ordering of professional labor may appear in retrospect, they hardly seemed so at first. When industrial hygiene’s pioneers began their studies and when industrialists allowed the hygienists onto factory premises, both acted in ways consistent with earlier thought and practice regarding occupational disease, even while auguring new ways to come. Rather than dismissing these earlier ways as outmoded and unenlightened, as the hygienists were wont to do, we need to attend carefully to how the late-nineteenth-century approach to occupational ailments also made sense—to corporate and government officials, to doctors, and even to workers.

    Second, even after late-nineteenth-century ways began to appear problematic, the terms and techniques of the industrial hygienists’ expertise crystallized only through a lengthy period of experiments, quarrels, and false starts. Hygienic researchers and corporate officials had to accommodate both to one another and to the limitations of available methods, as well as to worker concerns. It was a choppy, conflict-ridden process: while recognizing their dependence on one another, medical and public health professionals and company owners and managers sought different goals. For medical and public health academics especially, industrial hygiene took shape as an important site where a more disinterested form of professional practice could be forged, even as academic medicine and public health were becoming more expensive than ever. Corporate managers and owners, on the other hand, saw an industrial hygiene expertise as aiding them toward a firmer sense of what was in their economic self-interest.²⁰ By introducing more considered, informed, and widely persuasive accounts of occupational disease and illuminating the possibility of its prevention, industrial hygienists would provide corporations with new grounds for economic calculations about these ailments.

    Third, even as industrial hygiene acquired a lasting structure that allowed it to provide the fabric for new medicoeconomic and medicolegal rationalities, it failed to enforce as uniform or compelling a discipline as its professional and corporate founders had hoped. As often as not, industrial hygiene’s leaders found reasons for disagreeing among themselves, even as they encountered difficulties in disseminating their terms, tools, and practices. Appropriations took a thousand different shapes that often ran counter to the researchers’ intentions. The new order of industrial hygiene thereby generated its own varieties of disorder and dissension.

    The arrival of industrial hygiene also did not entail a uniform replacement of moral with instrumental or monetary values. However cynical corporate decisions could then become about the worker ailments they condoned, the very possibility of these calculations signaled a contrary change since the late nineteenth century. Viviana Zelizer has shown how, during this period, the growing economic worth of children—even as they left the workplace—disclosed the new social value that they were accruing.²¹ Similarly, the new economic thinking about workplace hazards that stimulated and enmeshed industrial hygiene reflected how valuable the intact worker body was coming to seem, to a point where corporations were increasingly held responsible for maintaining it. Just as employers began to pay industrial hygienists to preserve the able bodies of their employees, so workers whose occupational maladies had expelled them from the cash nexus of the labor market now came to expect recompense.

    Finally, although industrial hygienists did cast hazards and ailments in ostensibly neutral and objective terms, disentangling them from political and economic conflicts in the service of profit, this very process eventually gave flesh to some of the most sacred values of postwar environmentalism. Their evolving science brought literal and figurative embodiment to what would become a fundamental tenet of postwar environmentalism: the biological continuities not only between individual humans but also between humans and other species. Moreover, in pursuing more concrete versions of what was normal as well as multiplying knowledge about more remote and uncertain toxic threats, they opened up a new world of chemical causes and effects beneath the level of the usual clinical gaze, a borderland of shadowy abnormalities and possible pathologies. Industrial hygiene thereby provided some of the most important cultural resources for the birth of what the German sociologist Ulrich Beck has dubbed our Risk Society: where clashes over the social distribution of income and goods become overwhelmed by those arising from the production, definition and distribution of techno-scientifically produced risks.²²

    Without industrial hygiene, postwar environmentalism, Beck’s Risk Society, and Silent Spring itself would have remained unimaginable. Samuel Hays has attributed the rise of the postwar environmental movement to a transformation of values connected to a more consumer-oriented and affluent economy.²³ Especially on those questions about environmental health which so galvanized this movement, values would have remained empty without the new validity and concreteness that science brought to the threat of industrial chemicals. By providing firmer, more broadly legitimate shape to a few chemically induced maladies, through toxicological and epidemiological methods easily extrapolated to a host of other hazards, industrial hygiene investigators set this process in motion. As production diversified manyfold, they and others plied these methods and their imaginations to multiply the varieties of industrial ills—not just menaces to workers but to consumers as a whole. For decades, however, the hygienists’ etiquette of professionalism joined with their assumptions about normality to rein in the more subversive implications of their science.

    By the time Rachel Carson turned to researching her book, these restraining customs and presuppositions had begun to erode. Even a few whom corporate monies had enticed into the study of occupational diseases had begun to warn about the dangers of industrial chemicals to humanity at large. Carson picked up on these warnings and retraced the researchers’ thinking. Examining the mounting evidence that the hygienists’ methods had made possible, opening new questions about subclinical and undetected harms, and combining these with results from ecology proper, she found grave cause for concern. When Carson and others then forged the grim specter of these threats into a widely accessible and compelling shape, they triggered an avalanche of outrage. As industrial chemicals became profanations of human health and ecological well-being, the movement proceeded apace.²⁴

    New battles ensued against the corporations held to blame, and our familiar array of environmental laws and agencies soon came to pass. Largely forgotten in the accompanying uproar was how earlier in the century, corporate America had helped found and validate the very discipline whose agents, tools, and discourse now inveighed against them. Forgotten, too, were the profound historical debts that environmental health science owed, not just to the early occupational disease researchers, but to the workers through whose suffering and death industrial hygiene had been born. Now, at three decades’ remove from Carson’s book, as the benefits of the environmental movement remain largely confined to the middle and upper classes, the workplace roots of environmentalism urgently bear remembering.

    Chapter 1: White City’s Ghosts

    Opened with great fanfare on the four-hundredth anniversary of Columbus’s arrival in the New World, the Chicago exhibition of 1893 embodied the highest hopes and most willful self-deceptions of late-nineteenth-century America. Its designers and builders strove to capture the creative and accumulative frenzy of their industrializing civilization in a single spectacular place and time. Amid a landscape sculpted by Frederick Law Olmsted, the goods and technologies on display announced that the variety and advancement of American industry now rivaled those of Europe’s industrial powerhouses. The fair’s centerpiece, the Hall of Manufactures, showed off American products ranging from unadorned cable and bridge wires to the artistic finery of silverware and pottery, inside the largest building ever erected for exhibition purposes. Some four hundred additional buildings flaunted a gamut of productive ingenuity that ranged from mining to agriculture and forestry to transportation. In gilded molding and glass cages, the material power, plentitude, and diversity created in the American workplace stood assembled for all the world to see, a symbol and cipher for the age.¹

    Among the many reflections that the fair inspired, from then to now, one aspect of its origins has gone virtually unremarked. The achievement for which it stood had been purchased not just in dollars, social oppression, and devastated land, but in human flesh. Behind the very quality that earned it the famous appellation of White City lay untold tales of human pain and wreckage.

    The designers planned a white, marblelike finish for their creation; originally, they intended to cover all the buildings with a substance known as staff—an inexpensive combination of plaster of Paris and jute fiber. But staff’s whiteness faded when it was molded and exposed to the rain, wind, and smoke of Chicago’s outdoors. Other means became necessary.² They called in painters, who coated all the buildings of the fair with white lead paint—a colorant long recognized as a poison.³ The fair’s emblematic whiteness thus bore an extensive history of human damage, as this metal had been wrested from the earth, melted and separated out from other elements, transformed into a pigment, and spread over the fair buildings themselves. A vast amount of paint, some 60 tons on the Hall of Manufactures alone, coated this and the fair’s four hundred other buildings through the work of hands whose nerves and muscles had atrophied and of brains that had then gone awry.⁴ A spark of transcendent spiritual experiences for many, the consummate touch to what Henry Adams rhapsodized as a sharp and conscious twist toward the ideal, White City’s ghostly hue had left other impressions—debilitating and often permanent physical ones—on workers’ bodies.⁵

    These ailments comprised only a small part of the physiological price that workers were paying for the rapid industrialization that the fair celebrated. The coalescence of a national market, fostered by the railroads and the telegraph, had created new opportunities for profit among those who contrived ways of extracting raw materials from the land in ever-increasing volume and processing them for an ever-more far-flung variety of uses. A burst of institutional and technical innovations known by economic historians as the Second Industrial Revolution supported this accelerating flow or throughput of materials from nature to market.⁶ Throughout late-century America, from the frontiers where natural resources came to be extracted, to the factories where these substances became transformed into a widening spectrum of commodities, those in closest contact with this material flow became prone to suffering that could extend to loss of a job, permanent disablement, and death itself.

    Company by company, industry by industry, the Chicago exhibits proudly displayed commodities that gave little hint of the physical costs exacted in their making. Mining companies displayed the ores that miners had culled and milled while succumbing to the dust disease silicosis; paint companies showed off white lead products fabricated at the price of lead poisoning; and chemical companies, the end-products of processes that had brought more exotic forms of intoxication.⁷ Fin-de-siècle American capitalism, in pouring forth the varied bounty of White City, pushed innumerable workers’ bodies past their limits, with near impunity and little regret.

    Progressive Era innovators like Alice Hamilton chided their predecessors in medicine and public health for neglecting these ailments: for reporting practically nothing and contenting themselves with the assurance that all was well.⁸ Yet the historical record demonstrates that in many locales, these morbid dynamics of production proved easily recognizable. Experiences such as those of the physicians at Newark’s German Hospital demonstrate what a dominant role patient occupation could play in the diagnoses of late-nineteenth-century American doctors.

    The German Hospital, a small, 50–60 bed institution founded in 1870, served that city’s community of German immigrants, including many employees at a local smelter run by the German-born Edward Balbach.⁹ Balbach’s scientific background as a chemist and his innovative, large-scale approach to production typified the entrepreneurial impulses that were transforming the American economy in this period. Starting in the 1870s, he turned to refining lead at his smelter, in addition to gold and silver, and at the time claimed to have the second largest smelter operation in the country.¹⁰ By the late 1880s and early 1890s, Balbach’s plant engaged sixty employees at a time.¹¹ During this period, the German Hospital treated a steady stream of Balbach employees for lead poisoning: in 1888, of some 212 male admissions to the hospital, 11—or over 5 percent—were listed as working at Balbach’s; all 11 were diagnosed with Blei-colic or lead colic. Including the several cases that German Hospital physicians identified among other patients, a full 8 percent of the male patients that they admitted to the hospital in 1888 received diagnoses of lead poisoning, all of which doctors attributed to workplace causes. Lead poisoning was a more common diagnosis among males than typhoid fever or pneumonia (see Figures 1–2).

    The Newark hospital’s experience with smelter workers was by no means unique. On the far side of the country, the physicians at St. Joseph’s Hospital in Tacoma, Washington, also encountered high levels of this disease among the employees of a lead smelter that opened in 1890.¹² In the twelve months of 1900, among the over sixty men employed daily at the smelter, fourteen workers were admitted for lead poisoning or saturnism (a synonym)—about 3 percent of the male admissions for that year.¹³ Lead smelters constituted a front-end link in a nationwide chain of poisoned labor that extended to white lead factories and to painters who purchased and used the toxin-bearing product. The rise in white lead production between 1870 and 1900 indicates just how much this chain swelled and lengthened with late-century economic growth: national output multiplied fourfold.¹⁴

    Figure 1. Diagnoses among males, German Hospital, Newark, New Jersey, 1888

    Source: Archives, Clara Maass Hospital, Newark, New Jersey.

    Neither was the contemporary awareness of such ailments limited to lead poisoning. What later became known as silicosis, a lung disease caused by small particles of the earth’s most common mineral, became a common diagnosis among physicians in places where many worked in mines. Already in the 1850s and 1860s physicians in coal-mining areas like Schuylkill County, Pennsylvania, had recognized a characteristic miner’s asthma or miner’s consumption among their patients of this occupation.¹⁵ When electric and gas-powered drills and chemical explosives like dynamite rapidly replaced picks, hand drills, and black powder from the 1870s onward, the volume of silica-laden dust unleashed by miners soared dramatically. The resulting surge in lung diseases among workers in and around mines came to be widely recognized in regions where this industry predominated.¹⁶

    Silicosis did not just plague workers on the extractive frontiers but many others who hewed stone or clay. In communities where large nail-making operations existed, such as Wheeling, West Virginia, some physicians came to recognize a characteristic nail-maker’s consumption among their patients. Later identified as a silicosis variant, this ailment was presumed to be caused by the dust from nail grinding. Of forty-seven deaths recorded in the local vital statistics during the 1870s and 1880s among those whose occupations were recorded as nailer or feeder (of the nailer’s machines), Wheeling doctors diagnosed forty as dying from nailer’s consumption. Around the same time, one physician who examined 136 nailers found only one who did not have the bronchial respiration and/or lung consolidation that signaled this disease.¹⁷

    Figure 2. Lead poisoning among males in selected late-nineteenth-century hospitals

    Sources: Archives, Clara Maass Hospital, Newark, New Jersey; St. Joseph’s Hospital, Tacoma, Washington; Pennsylvania Hospital, Philadelphia; and Massachusetts General Hospital, Boston.

    Overall, the toll of workplace-related diseases was probably rising during these decades before the turn of the century. No reliable statistics are available on disease incidence and prevalence in the America of this time, so more indirect evidence about these ailments must suffice. While a hazardous occupation such as nail making was disappearing through technological change, and some innovations in white lead and other industries improved health conditions, the deleterious new technologies in mining and smelting suggest that countervailing trends predominated.¹⁸ In any event, whatever the changing impact of individual technologies, the growing number of hands that contributed to this mushrooming material flow meant that more and more Americans were becoming sick from industrial work.¹⁹ Between 1870 and 1900 the total number of manufacturing, mining, and quarry workers quadrupled.²⁰ Less sudden and more varied in its manifestations than an epidemic, the national toll of industry-related disease still grew fast enough in the decades before White City to become alarmingly noticeable.

    Making lead colors. In this scene from a lead paint factory just after the century’s turn, open windows offer the sole protection from the lead dust in the foregrounded worker’s shovel. (Courtesy of the Johns Hopkins University Libraries, Baltimore, Md.)

    Across the Atlantic, many decried this very trend in their own lands. Germany’s economy expanded just as explosively as did that of the United States during the late nineteenth century, and Britain, though its earlier start allowed for less of a percentage gain, steadily extended its industrial capacity.²¹ Industries producing silicosis as well as other dust diseases and lead as well as other forms of poisoning all contributed to the expanding manufactures of these countries. During the same period British and German physicians, through clinical exams of workers, close scrutiny of factory environments, and occasional experimentation, developed national and international literatures about occupational disease. By the turn of the century, Ludwig Hirt in Germany and John Arlidge in Britain had already composed textbooks that summarized the health effects of an unprecedentedly comprehensive and modern range of occupations.²² As reflected in the cumulative listings in the surgeon general’s catalog under Diseases of Occupations, the publications on occupational diseases in these countries show how perceptible the tide of disease brought on by the Second Industrial Revolution had become (see Figure 3).

    Heading up barrels of dry red lead. Blacks and other minorities were often employed at such dangerous tasks, which inevitably stirred up lead dust. (Courtesy of the Johns Hopkins University Libraries, Baltimore, Md.)

    Despite intensive local encounters like those in Newark and Wheeling, however, written reports on these diseases appeared less frequently in the United States, and public summations of the national experience such as those at White City gave work-related hazards and ailments a low and insubstantial profile. In the 1893 gathering at Chicago, two opportunities for assessing the extent of occupational diseases in America remained virtually unexploited. Taking the idea from an earlier London exhibition, the organizers made room for a display on Hygiene of the Workshop and Factory. Only one American firm volunteered any wares. At the fair’s Auxiliary Congress, the actuary William Standen spoke on The Effect of ‘Occupation’ and ‘Habits’ on Life Insurance Risks. Aside from the risks of alcoholism or traumatic accident, he wistfully concluded, most hazards of occupations were unknown and almost incalculable.²³

    Figure 3. Hygiene of Occupations, Index Medicus listings, 1891–1898

    Standen’s surmise exposed the irrelevancy of occupational disease literature from other times and places to the vast majority of late-nineteenth-century Americans. Writings about work-related ailments already had a long and august history. Centuries prior to the birth of the large American corporations, in the early 1700s, the Italian physician Ramazzini had composed the first-known volume devoted exclusively to the hygiene of occupations.²⁴ Even in the pre–Civil War United States, one American author, Benjamin McCready, had written an entire treatise on the subject.²⁵ After the war a few American texts such as an 1885 piece by Roger Tracy, tucked in a series of translated German volumes, had presented findings from the extensive British and European literature to fellow countrymen and countrywomen.²⁶ Yet these texts meant little to the actuary William Standen.

    The despairing ignorance of those like Standen requires a fuller explanation if we are to fathom why the American study of these ailments blossomed not at this time, but later on. In cursory retrospect, the facts and methods that proliferated in the United States after the turn of the century may seem to have smoothly and inevitably disseminated from those forged earlier on the opposite side of the Atlantic. But the British and European writings as well as the scattered experiences of Americans themselves already provided impetus to investigations of occupational diseases in the 1880s and 1890s, to limited avail. Formidable obstacles hampered a more ambitious,
