AT THE END OF 1921, New York City hosted an extraordinary event. Mayor John F. Hylan declared November 13–19 “Health Week.” Two floors of New York’s premier exhibition hall, the Grand Central Palace (right next to Grand Central Terminal), were given over to what was billed as “the largest health exposition ever attempted.” Thousands attended the exposition. To help draw in crowds, organizers offered a “Health Clown” and “Health Characters.” There was a Harbor of Child Health, where the Child Health Family resided. “Happy,” a member of that family, appeared as a sailor. Baseball great Babe Ruth and heavyweight boxing champion Jack Dempsey were on hand to sign autographs (though it’s hard to see how either could have served as a model for healthy living, even in those days). Visitors had a chance to gawk at “fat men and women” who for the past three weeks had participated in a weight-loss competition. The winner of the “perfect baby contest” was announced (William Yarnias, eighteen months old).1 And of course no health exposition would be complete without a finalist in the quest for the perfect woman’s foot—to demonstrate, as the reporter for the New York Times put it, “that the world is as romantic as in the days of Cinderella and the Prince.”2 The honor went to Miss Elizabeth Doyle, nurse.3
The United States Public Health Bureau and several New York City departments participated, as did social service, charity, and business organizations.4 There was a plan to set up educational displays and activities throughout the city: “diet squads,” blood pressure machines, and nutritional demonstrations, some involving live animals.5 Proceeds from ticket sales were to be used to fund public health organizations and activities at the local, national, and international levels.
But none of this was the big news. “Health Week” was actually just the middle segment of an entire “Health Fortnight.” It had been preceded by an international conference, called the “Health Institute,” and it was followed by the Fiftieth Annual Meeting of the American Public Health Association. The big news came at that meeting.
The APHA had been founded in New York City in 1872. It was the first public health organization in America, at a time when the very concept of “public health,” with its emphasis on sanitation, preventive medicine, and collective responsibility, was a novelty. In November of 1921, at the Hotel Astor, the association was celebrating its fiftieth anniversary. The Jubilee meeting brought together some of the most distinguished figures in the field. On the program were dozens of speeches, including some with such deadening titles as “Sanitation of Bath-Houses at Public Bathing Beaches,” “Proper Size of Sand for Rapid Sand Filters,” and “The Prevention and Cure of Rickets by Sunlight.” But two caught the attention of the press. Dr. Mazÿck P. Ravenel gave the Presidential Address, titled “The American Public Health Association, Past, Present, Future.” Like other top officials of the association, the president was partly interested in boosting the achievements of his organization—and why not? This was a celebration, after all. But he also used the occasion to give a capsule history of medicine over the previous half-century, speaking above all of Louis Pasteur and Robert Koch. Each achievement he listed—the discovery of a pathogenic microorganism (staphylococcus, streptococcus, pneumococcus, the Asiatic cholera spirillum, the tuberculosis and diphtheria bacilli)—represented a victory, imminent or current, over an illness that had threatened populations for centuries.
One passage in his speech was particularly striking. Telling his audience about the journals the APHA had published during its fifty-year history, Dr. Ravenel was moved to say this:
In comparing the earlier volumes with those of to-day, one is struck by the fact that the most important topics discussed in the early years are scarcely ever mentioned now. The first volume, published in 1873, is given up largely to yellow fever and cholera. One finds it hard to believe that cholera was at that time widespread in the United States, and that it existed in more than two hundred towns and cities of the Mississippi Valley.6
The other presentation that found its way into the newspapers was the keynote speech at the opening banquet. Dr. Stephen Smith was a founder of the association, and on the evening of November 16, 1921, his topic was “A Half Century of Public Health.” While Ravenel and others celebrated the concrete achievements of medicine and public health over the previous, extraordinary half-century, Smith looked at the larger ramifications. Let’s say you believed that European and American medicine had practically obliterated the ravages of infectious disease. What did that mean for you? It’s a simple question, and Dr. Smith had a simple answer: a longer life—a much longer life.
Smith spoke specifically of the drop in the death rate, making an astonishing observation. “The steady fall of the death-rate,” he said, “until it has nearly reached the vanishing point, suggests the possibility of its passing that point. What a tremendous result!” “Vanishing point”—meaning immortality? Well, almost. “We have too long been content with the false code of the Mosaic law,” he said a little later, “that limits life to three-score years and ten, with a possibility of reaching four-score years.” Thanks to modern science, however, we know what the limit should be. “Biology teaches that the normal and potential life of man is one hundred years; that every child born is adapted in physical construction and function to live a century.”7
In honor of the Jubilee celebration, the APHA published a volume of essays, titled A Half Century of Public Health. Included in that volume was an expanded version of Smith’s Jubilee speech, in which the association’s elder statesman explained the scientific basis for his extravagant claim. It was this: the normal life span of any vertebrate, he believed (drawing on the work of famed British paleontologist Richard Owen), is five times the number of years it takes for that vertebrate’s bones to develop fully. Human bones take twenty years to develop, so the normal human life span is a century.8 What the past fifty years have shown, he maintained in the speech, is that “all deaths occurring at an earlier age are due to conditions existing which are not compatible with the construction and functions of the human organisms.” And so he proposed that, to mark the anniversary, the APHA should dedicate itself to raising “the standard for the length of life” by thirty years. Maybe he wasn’t talking literally about immortality; he was talking about “life that suggests immortality.”9
No one could have been a better spokesman for this cause. Smith was almost 99 years old when he gave this speech. An editorial in the New York Times, titled “All in a Lifetime,” picked up the veteran doctor’s fondness for Biblical reference. “What preventive medicine and sanitation have done in the span of that one life to push back the shadow upon the dial of time—as the shadow receded on the dial of King Hezekiah of old—so that there is prospect for a longer enjoyment of that birthright, is one of the brightest chapters of science,” and all thanks to the scientific advances that Smith and Ravenel had mentioned in their speeches. Dr. Smith “happily illustrates” his own theory of longevity, the editor said.10
The editor jumped the gun on this point, as it turned out, for Stephen Smith died the following August, about a half-year short of his biologically allotted century. Still, his achievement was impressive, and the press celebrated him, both before and after his ill-timed demise. How had he made it to such an advanced age? He had plenty of advice for whoever would listen: lots of milk, not too much meat, plenty of sleep, no spirits or stimulants. He even recommended short skirts for women—not, heaven forbid, because he found them eye-catching (which in 1921, with hemlines almost up to the knee, they certainly were), but because he thought they were less likely than long skirts to carry dirt and bacteria from the street into the home.11
At the first annual meeting after Smith’s death, the APHA adopted a resolution to honor its founder. It noted that, in many parts of the world, over the previous seventy-five years, fifteen years had been added to the average life. It noted, too (though without statistics to support the claim), that the gains in the most recent twenty years had outstripped those of the previous fifty, adding that these gains were continuing “at an accelerated rate.” It reiterated the claim that modern science had overturned “the scriptural ideal of three score years and ten.” And finally it issued the prediction “that within the next fifty years as much as twenty years may be added to the expectancy of life which now prevails throughout the United States.” The resolution pledged the efforts of the association to the attainment of this goal, slightly more modest though it was than the one Dr. Smith had issued at the Jubilee meeting.
Unhappily, both Dr. Smith and the association were wide of the mark, and by a fair amount. According to much more recent figures, expectation of life at birth in 1922 was indeed approaching 60, but fifty years later, in 1972, it had risen only to slightly over 70 (71.1, to be precise, averaged for both genders): a single decade instead of two or three. How disconsolate Dr. Smith and the APHA members would have been, with their fondness for scriptural allusion, to know that “three score years and ten” was pretty much what the average American would get (at birth) a half-century in the future.
Still, as more recent figures also show, the assessment of the fifty years leading up to the Jubilee was essentially correct. Beginning around 1880, life for the average American—physical life—changed dramatically for the better. People in the United States began to live longer, and fewer died in childhood. The pace of change was especially pronounced between 1880 and 1930, though it was not precisely moving “at an accelerated rate,” as the APHA resolution in 1922 had held; it began to slow about ten years after the resolution was passed.
Nowhere was the transformation in human life more eloquently and forcefully expressed than in an essay written for the same Jubilee volume by Dr. Charles V. Chapin (1856–1941), one of the most influential figures in the history of public health. Chapin had spent virtually his whole career in Providence, Rhode Island, where he served as superintendent of health from 1884 until his retirement in 1932. He looked back at the movement he had led, describing its various phases and his own participation in those phases. The essay is considered a classic, succinct history of the Public Health Movement in American cities. In the concluding section, Chapin announced the movement’s achievements. The section is worth quoting at length, because it’s difficult to imagine a more eloquent and stirring testimony to the transformation that American society had undergone in a mere half-century. Chapin restricts his discussion to the public health work of cities and states.
Thus have cities and states sought to control disease. The yellow fever nightmare will terrify no more. There has been practically no cholera since 1873. Smallpox, which in former epidemics sometimes attacked half the population, is a negligible cause of death. Typhus fever is a very rare disease. Plague has not been able to gain a foothold. . . .
Typhoid fever is a vanishing disease. The diarrheal diseases caused four times as many deaths fifty years ago as now. Scarlet fever mortality has fallen ninety per cent. Diphtheria has decreased nearly as much, and the mortality from pulmonary tuberculosis has been cut in two. Infant mortality in our better cities has dropped fifty per cent. Not all this, it is true, is due to conscious community effort, but is in part, dependent upon economic and other unknown causes. Nevertheless, if only one half of this life-saving is to be credited to health work the dividend on the money and energy employed indicates good business.
Figures do not measure the terror of epidemics, nor the tears of the mother at her baby’s grave, nor the sorrow of the widow whose helpmate has been snatched away in the prime of life. To have prevented these not once, but a million times, justifies our half century of public health work.12
This was all good news. Dazzling news, if you had been around for over fifty years and could remember an era when the conditions that Dr. Chapin and others were now describing were not present—when life on average was significantly shorter, when many children died in infancy, when the next outbreak of infectious disease could occur without apparent cause at any moment. Dr. Smith had emphasized personal responsibility as the key to a long life. Chapin emphasized the network: collective, public action directed by (in this case) municipal authorities. Both were right: collective action was essential, but so was individual responsibility, as we’ll see.
Were the celebrants at the Jubilee right?
Let’s consider just two rudimentary measures of a population’s health: expectation of life at birth and infant mortality, using only averages and leaving out variations based on gender, ethnic group, geography, income, and educational level. Expectation of life at birth is the average number of years people who are born in a particular year can expect to live. Infant mortality is the number of children born in a particular year (usually reckoned per thousand) who die before their first birthday. According to figures from the 1990s, the trend in life expectancy and infant mortality from the late eighteenth century to the early twentieth century looks something like this: modest improvement (greater life expectancy, lower infant mortality) through the early nineteenth century, a setback from about 1840 until about 1870, “sporadic” improvement until about 1880, and then truly dramatic improvement.13
Expectation of life at birth rose from 38.3 in 1850, to 39.4 in 1880, to about 54.1 in 1920. The largest jump over a single decade (for years ending in zero) between 1850 and the end of the twentieth century occurred from 1880 to 1890. That jump was 5.8 years. The largest jump over a twenty-year period (for years ending in zero) from 1850 till well into the twentieth century occurred between 1880 and 1900. It was 8.4 years.
Infant mortality fell from 229 per thousand in 1850, to 225 per thousand in 1880, to 86 per thousand in 1920. By the end of the twentieth century, it stood at 7 per thousand. The largest decline over a single decade (for years ending in zero) between 1850 and the end of the twentieth century occurred from 1880 to 1890. It was 64.7 per thousand. The largest decline over a twenty-year period (years ending in zero) during the same century-and-a-half occurred between 1880 and 1900. It was 96.1 per thousand.14
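The two measures just defined reduce to simple arithmetic. Here is a minimal sketch, using only the figures cited in this section; the variable and function names are illustrative, not drawn from any historical source.

```python
# A sketch of the two measures defined above, applied to the figures
# cited in the text. All numbers come from the section itself.

def infant_mortality_rate(deaths_under_one, live_births):
    """Deaths before the first birthday, expressed per 1,000 live births."""
    return deaths_under_one / live_births * 1000

# Expectation of life at birth (years), as reported above.
life_expectancy = {1850: 38.3, 1880: 39.4, 1920: 54.1}

# Infant mortality (per 1,000 live births), as reported above.
infant_mortality = {1850: 229, 1880: 225, 1920: 86}

# Gains in 1850-1880 versus 1880-1920: the later period dwarfs the earlier.
gain_1850_1880 = round(life_expectancy[1880] - life_expectancy[1850], 1)
gain_1880_1920 = round(life_expectancy[1920] - life_expectancy[1880], 1)

drop_1850_1880 = infant_mortality[1850] - infant_mortality[1880]
drop_1880_1920 = infant_mortality[1880] - infant_mortality[1920]

print(gain_1850_1880, gain_1880_1920)  # 1.1 14.7
print(drop_1850_1880, drop_1880_1920)  # 4 139
```

The contrast is the point: thirty years of change before 1880 added barely a year of life expectancy and shaved 4 deaths per thousand from infant mortality, while the forty years after 1880 added nearly fifteen years and eliminated 139 deaths per thousand.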
In sum, the averages tell us this: starting sometime around 1880, whether you lived in the city or the country, whether you were rich or poor, well educated or illiterate, male or female, white or nonwhite, you were likely to live longer—even significantly longer—than your parents; you were more likely—even significantly more likely—to survive infancy than your parents had been; and, finally, your children were more likely—even significantly more likely—to survive infancy than you yourself had been. Those are essentially the transformations in American life that Stephen Smith and others were vaunting in late 1921 at the meeting of the American Public Health Association.
So the question is, how did the public come to know the good news? We today know that ordinary people in 1921 had a solid foundation for the belief that they themselves stood a good chance of living to a substantially more advanced age than their parents and grandparents, and for the belief that their children were much more likely than children of previous generations to survive to adulthood. But what were the sources of these beliefs in that era?
There appear to be three principal sources. One is vital statistics produced in the era we’re discussing. The second is the public health campaign, which I’ll describe in the next chapter. The third is path-breaking scientific research in the early twentieth century that promised either to bring more people to the conventionally recognized limits of human life or to push those limits beyond anything previously thought possible. I’ll describe this, too, in the next chapter.
Shortly before the conclusion of his essay on the history of the municipal Public Health Movement, Dr. Chapin devoted a short paragraph to vital statistics. “The proper registration of births, marriages and deaths may not seem to the public to have any very close connection with stamping out smallpox, or the prevention of typhoid fever,” he wrote, “yet it is the first and most important health work that any community can do. Sanitary science will never progress,” he continued, “and even the knowledge we have cannot be intelligently applied, without a fairly accurate system of vital statistics.”15 First and most important? More important than even the disease prevention measures Chapin had championed during his entire professional career? To take him at his word, yes. In 1908, the APHA had recognized the importance of this field by forming a Vital Statistics section.16 While vital statistics was hardly a new field, having been around in roughly its modern form since the late seventeenth century, little effort seems to have been made to bring it to the attention of the public in the United States until the early twentieth century.
Chapin, in fact, was pointing to not one but two fundamental flaws in the condition in which vital statistics found itself when he wrote these words. The first has to do with the creation and reliability of statistics. In order for figures purporting to show trends in mortality and life expectancy to give an accurate picture of a population’s health, those figures need to reflect as much of the population as possible. That means that births and deaths need to be registered as widely as possible. A major part of the public health campaign at the time that Chapin made his plea for improved vital statistics was aimed at increasing the size of the “registration areas,” that is, the areas in which births and deaths were recorded. The objective, of course, was to make registration mandatory nationwide, but since public health measures of this nature were generally left up to the states, champions of vital statistics faced the daunting task of persuading state legislatures to pass measures that would create the appropriate mechanisms for registering births and deaths. In the year 1900, only ten states, plus the District of Columbia, registered deaths, and there was no birth registration to speak of. Birth registration began, in a limited way, in 1915, but it was not until 1933 that there was mandatory nationwide registration of births and deaths.
The other flaw, implicit in Chapin’s complaint, was the limited extent to which existing information was made available to public health officials and to the general public. In order for vital statistics to be effective, people need to know about them: public health officials, in order that they might take proper measures to confront the threat of infectious disease, and members of the general public, so that they might take the appropriate personal measures to protect themselves from the same threat.
But one shouldn’t infer from Chapin’s comments that nothing had been done in recent years to improve the gathering of vital statistics and to make those statistics available to professionals and the general public. In the late nineteenth century, the primary source for vital statistics, which meant little more than general mortality rates, was the life insurance industry. Starting in 1868, the industry published the American Experience Table, which presented statistics gathered from the “experience” of some thirty life insurance companies. But because the Experience Table was based only on data drawn from insurance policyholders, it could not pretend to give a comprehensive picture of the larger population.17 In the 1880s, there were some attempts to establish statistics from census data. These figures were possibly more useful than the life insurance figures, but they were still tainted by the small number of registration areas in the country.
It was not till the end of the first decade of the twentieth century that the field of vital statistics really began to come into its own. The APHA’s formation of a vital statistics section in 1908 is part of this story. The following year, in February, the National Conservation Commission brought out a report titled National Vitality, Its Wastes and Conservation. The author was Irving Fisher, a Yale professor of economics (in fact, one of the towering American figures in the field) who, after a bout with tuberculosis in the 1890s, turned his attention to public health. Fisher was primarily a mathematician, and statistics had a particular appeal for him. The first two chapters of the report cover length of life and mortality, so they are rich with tables and figures. The statistical material, by today’s standards, is strikingly unhelpful. Because of the limited number of registration areas, Fisher’s figures cover only small and selected parts of the country (in addition to a number of European cities). When it comes to infant mortality, all he can say is that it “is probably falling.”18 That statement is bound to surprise us today, since we now know that infant mortality had in fact dropped significantly, to less than half what it had been sixty years earlier. But the figures available to Fisher in 1909 were not sufficiently solid to support a judgment about infant mortality.
Naturally Americans were not exactly thronging bookstores to purchase copies of National Vitality. Still, as dry as the subject was and as imperfect as the book’s findings were, Fisher and his work captured the attention of the press and, as a result, the public. The New York Times ran an article on him in March of 1909, under the title “Working to Lengthen the Span of Man’s Life.” The central visual image was a chart showing the percentage of American men still alive, by decades, starting at age 50. To each decade there corresponded a picture of a man whose size was proportional to the percentage figure provided for him. If you looked closely, you saw, in addition to the caption that the Times editors placed under the chart, one that formed part of the chart itself. This caption read, “American Experience Table of Mortality.” Oddly, despite the emergence of more sophisticated statistics based on census figures, the author of this article chose to fall back on the standard life insurance table.19
Later that same year, journalist Allan L. Benson (who would enjoy a brief period of notoriety as the Socialist Party’s candidate for president in 1916) wrote an article for the Sunday magazine section of the New York Times titled “Learning the Length of Life.” Benson described the current scientific work that held out the prospect of longer life for the human species and, unlike many journalists writing in the popular press, included a number of Census Bureau charts and tables on causes of death at various ages. In fact, it would hardly be an exaggeration to say that the entire article serves as a tribute to the value and power of vital statistics. Virtually every claim Benson made about health and longevity was based directly on statistics. In addition, he made an observation significant to the social historian, namely that one source of statistical information available to members of the general public (who did not spend their time studying census charts) was the material that life insurance companies assembled in an effort to educate prospective clients. The idea presumably was that by this time the life insurance companies themselves were using data beyond just those gathered from their own clients. Insurance companies “pretty nearly know from a man’s height, weight, and occupation what kind of a disease is destined to carry him off,” Benson wrote. “This is a point worth considering, for if a man know wherein lies his weakness he can, if he choose, take steps to postpone the inevitable.” The visual impact of the article must have been quite dramatic to contemporary readers. Benson included, in the midst of his charts, cartoon-like images of Death (with Sickle), Old Age (with flowing beard and crooked walking stick), and a terrified Youth (with curls and cherubic, plump cheeks).20
One of the era’s most striking notices to the public that things had changed quite dramatically in recent decades was a tiny article in the New York Times that most readers probably did not even see, because it was oddly tucked away on page 13 of the first section. But if you read your newspaper thoroughly and turned to that page on May 23, 1913, you would have seen this arresting headline at the top of the fourth column: “Young Live Longer Now,” with this underneath: “But Those Above 40 Have a Decreased life Expectancy.” And here’s what you would have read:
A life table measuring the health conditions in this city, based on the mortality in three years, 1909 to 1911, inclusive, has been prepared by the Department of Health. In 1882 a similar life table was prepared under the direction of the late Dr. John S. Billings, who supervised the construction of the table for New York City on behalf of the Federal Census authorities, and was based upon the triennium 1879 to 1881.
The compilation shows that thirty years ago a child under 5 years could expect to live 41 years, while a child at that age at present may look forward to a future lifetime of almost 52 years, an increase of almost 11 years. The life of a child between 5 and 10 years has been prolonged from 46 to 51. A person of 25 to 30 years had an expectancy of life 30 years ago of almost 32.6 years. At present the expectancy is 34.3, an increase of 1.7 years.21
There were figures to show that life expectancy for people over 40 years of age had dropped in the previous thirty years, the decline being attributed to an increase in the incidence of “cancer, heart diseases, and kidney diseases.” The author finished by reporting the Health Department’s findings that the rise in incidence of non-infectious illnesses had been accompanied by “an increase in the consumption of spirituous liquors and nitrogenous articles of food,” in other words, “too much drink and too much meat.”
Clearly this was mixed news. Although the figures applied only to New York City, the author of the article appeared to generalize the findings to all Americans. It would have been disappointing to learn that, although the odds of reaching 40 were better than ever, the odds of reaching a truly advanced age after 40 were worse than they had been thirty years earlier. Still, one fact would have stood out. To believe what was reported in the article, in the preceding thirty years the expectation of life for children under five had jumped by over 25 percent, thanks to a reduction in the effects of a group of infectious illnesses. If infectious illnesses had been brought under control, then there had certainly been a decline in deaths among young children.
The document referred to in the article (but not cited there) was a table published by Dr. William H. Guilfoy, a great champion of vital statistics in this era and the Registrar of Vital Statistics of the New York City Department of Health from 1901 to 1927. It contained life expectancy figures for New York City in the two triennia, 1879 to 1881 and 1909 to 1911. The earlier figures were compiled from census data in the 1880s. The source of the figures for 1909 to 1911 was information contained in a vital statistics report for the Department of Health in 1912.22
The remainder of the second decade of the twentieth century saw the publication of additional work in vital statistics. In 1916, the Census Bureau issued a publication with the title United States Life Tables 1910, because 1910 had been a year that brought particularly good statistical news about the health of the American people.23 In 1919, George Chandler Whipple, professor of sanitary engineering at Harvard, published a book titled Vital Statistics: An Introduction to the Science of Demography. Whipple was widely known in the Public Health Movement for his role in promoting water purification, and in 1921 he would contribute to the APHA’s Jubilee volume a lengthy, authoritative essay on the topic. He was not a statistician or a demographer, and it served his purposes to say so in the preface to his book.24 The idea was not only that vital statistics were necessary for the promotion of public health but that even a nonspecialist such as Professor Whipple himself could learn the field well enough to write a textbook about it.
It would be a great exaggeration to say that all ordinary people in the early decades of the twentieth century frequently came across vital statistics and that these vital statistics alone served as a foundation for widespread optimism about longevity. As we’ve seen, on several occasions writers in the popular press presented their readers with statistical evidence for the rise in life expectancy and the drop in infant mortality (mortality in general, for that matter) that had been occurring since the final decades of the nineteenth century. And of course if you were in the market for life insurance, the salesman who sold you your policy was likely to produce a table that showed you your own expectation of life.
Still, it’s hard to imagine that access to such statistics would have served members of the American public as the primary basis for optimism about how old they’d be when they died.25 But the optimism was there, and the question is, where did it come from? Of course there’s no reliable method for uncovering the source of the feeling that life will last longer for me than for my parents and grandparents—even on the assumption that such a feeling was widespread. If we do accept the claim that attitudes about life expectancy took a turn for the more favorable around 1900, we have to recognize that there are countless truly immeasurable factors. If casual empirical observation led ordinary people to judge that fewer children were dying in, say, 1920 than had died forty years earlier (1880) or that fewer children and adults were dying of infectious disease today than forty years earlier, then perhaps those ordinary people extrapolated the view (however vaguely formulated) that everyone had a better chance of living longer in 1920 than in 1880. But no amount of casual empirical observation could offer a 30-year-old in 1920 solid, unassailable evidence that a large proportion of people just like him or her would live to a ripe old age.
What is clear, however, is that optimism about life expectancy was widely expressed in this era. Let’s return to that Jubilee meeting of the APHA. Think for a moment about the year in which this big celebration took place: 1921. Two major mortality crises were, respectively, a mere three years and a mere two years in the past: the Great War and, far more significant for the United States, the great influenza epidemic of 1918–19. Admittedly the Great War took the lives of far fewer Americans than it did Europeans, and admittedly it would not qualify as a health crisis. The flu, however, carried away over a half-million American souls. And yet, to listen to the speeches given at the celebration and to read the articles printed for the occasion, you’d never guess that an infectious illness had recently created one of the worst health crises in recorded history. In the Jubilee volume that the APHA published, there are but three passing references to the recent pandemic, though the association’s journal had printed numerous articles on the topic during the crisis. When Charles V. Chapin exultantly told his audience in 1921 of the drop in overall mortality figures in the United States, the recent year he used for purposes of comparison was 1919. He somehow forgot to mention that the flu had just taken out its final cohort of Americans in the winter and early spring of that year: close to a quarter-million, according to figures from the era (figures, incidentally, that more recent historians consider to be quite low).26 During the Paris Peace Conference, in the early months of that same year, President Woodrow Wilson himself was taken gravely ill and thought to have contracted influenza. Many participants in the conference were stricken, and some died.27 And yet somehow people believed that infectious disease had been conquered—and had been for some time (never mind the influenza pandemic). 
That’s why members of the medical profession and the general public could legitimately turn their attention to humankind’s next big goal: extending the span of life, either so that more people could reach an already recognized natural limit or so that the limit itself could be extended far beyond what had ever been considered possible.
Chapin was focusing on results. But results were only one part of a much bigger story that observers might have told in 1921. The bigger story included a transformation in the entire conception of the physical person that had taken place during roughly the same period as the transformation in health that the APHA Jubilee celebrated—a transformation in how Americans thought of themselves as physical beings. That’s the subject of the next chapter. The story included, as well, a transformation in how individual people saw themselves in relation to their fellows. As members of the public came to know beyond a shadow of a doubt, reductions in mortality and rises in life expectancy came about as a result of cooperation, and cooperation took place once individual citizens understood that looking after their own personal health was not only an end in itself but also a means to promoting the health of the larger public. It had to start with choice: you could choose to cooperate—or not. And yet individuals were networked whether they liked it or not, for the obvious reason that the state of the community’s health was always potentially linked to the state of their own health. It’s just that in this era, individuals came to know not only that they were networked but, to a considerable extent, why and how they were networked.