PDQ Public Health
Ebook, 447 pages, 4 hours

About this ebook

To introduce a firm understanding of public health, PDQ Public Health presents the history of how the tools of public health have evolved and how they are applied to the detection of, measurement of, and intervention in public health threats and risks.
Language: English
Release date: Dec 1, 2010
ISBN: 9781607951476
Author

David L. Streiner, PhD

David L. Streiner, PhD, is Professor of Psychiatry at the University of Toronto, Toronto, Ontario, Canada.

    Book preview

    PDQ Public Health - David L. Streiner, PhD

    1

    What is Public Health?

    Again and again in our planning we ponder the increasing scope of public health demands, the rapidity of change, the mounting pressures of new and growing health problems, the shortages of qualified personnel, the need to obtain more knowledge through research, and above all the increasing difficulty of coordination in the planning and execution of activities.

    —Roy J. Morton (1958)

    A BRIEF HISTORY OF PUBLIC HEALTH


    Those who don't study the past will repeat its errors; those who do study it will find other ways to err.

    —Charles Wolf, Jr. quoted in the Wall Street Journal, Feb. 26, 1976

    Like the weather, everyone seems to be talking about public health, and similarly, no one seems to be doing anything about it. Television, print media, and the Internet are replete with stories, articles, and commentaries about various threats and risks to public health, and with discussions of the political, policy, and programmatic challenges and of what society and individuals can do. Taking a historical perspective, it has always been thus. Society is concerned with its collective health and well-being. While the explosion of modern media sources provides more outlets, and therefore a greater volume of commentary on school closures and H1N1 influenza, a reader of newspapers or a viewer of newsreels in the last century would have seen the same messages about polio in the 1950s (Oshinsky, 2005) and virtually every other disease outbreak since the dawn of mass communication. Similarly, newspapers of the 19th century would have carried stories about cholera or yellow fever, and accounts of plague graced the news media of the century before that.

    These examples represent one view of public health. This public view focuses on the social, community, and organizational responses to perceived threats and real disease risks.

    FIGURE 1-1. Public health poster from Spanish flu era. 'Prevent Disease. Careless Spitting, Coughing, Sneezing, Spread Influenza and Tuberculosis'. Image courtesy Rensselaer County Tuberculosis Association, Troy, New York (Wiki Commons)

    FIGURE 1-2. Example of a public health poster from World War I.

    This public history of public health is like a scrapbook of oral instructions, religious and social guidance, pamphlets, broadsheets, news headlines, posters, notices and signs on trains, streetcars, and buildings, and now television, YouTube videos, blogs, and Twitter (see Figures 1-1 and 1-2).

    Behind the public history of public health there is another, parallel, history reflecting three basic factors. The first factor that has played an important role in developing what we know as public health derives from the observational skills of individuals who were able to look at specific situations or events and ask questions that made a difference. The story of public health is marked by the impact of such people, many of whom were not physicians but hat makers, lawyers, soldiers, mathematicians, and other strange beasts. A second important factor has been the science and technology of the time. Medicine's and the public's responses to leprosy and polio, for example, have changed dramatically (and not always for the better) as our understanding of them improved and new interventions became available. Finally, public health has been shaped by the long, tedious, and unglamorous process of standardizing definitions, classifications, and actions. Over time, these three factors have supported each other in developing public health, and that process continues today.

    Many of the important aspects of public health began with observation and example early in human history and predate written records. Linking survival to safety and health was an essential part of our evolutionary learning curve, as evidenced by the fact that we're here to write this book (and you to read it). As children, when we wanted to put some unknown plant or berry into our mouths, a parent would say, Don't eat that, it will make you sick! (often followed by the injunction, If you kill yourself, you won't get dessert tonight!). While the ancient societies that developed these food taboos did not leave written records, we can assume that learning what was harmful to eat came from the hard-won knowledge of those who had boldly gone where no one had gone before and gotten sick or died as a consequence. This is easy for items that cause immediate illness, but one can only imagine how much sickness and death occurred as humanity learned to process foods such as cassava or akee (Dufour, 1994), which are palatable but can be very toxic unless properly prepared.

    Early public health observations and practice extended far beyond dietary activities. One of the first described smallpox control practices is attributed to a Buddhist nun in the 1700s, who scraped up and dried flakes of smallpox-encrusted pustules, pulverized them to dust, and blew the powder up susceptible people's noses. This technique of stimulating protective immunity, known as variolation after the virus that causes smallpox, Variola major, was adopted in many centers in China, India, Turkey, and later in Europe. Voltaire said that Circassian women deliberately gave smallpox to their children, especially daughters, to protect their skin from disfiguring pox marks (Voltaire, 1733/2004). The wife of the British Ambassador to Istanbul, Mary Wortley Montagu(e), who herself had suffered from smallpox, had her children inoculated in this way.

    As could be expected, there were a few postmarketing complications among the nasal recipients of smallpox dust. One was full-blown smallpox with death occurring in about 2 to 3% of the recipients. This was an improvement over the 20 to 30% mortality rate associated with the disease itself (Gross & Sepkowitz, 1998). With the variolation technique, the overall smallpox disease case-fatality rate is said to have fallen by 90%.
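    As a rough back-of-the-envelope check (our own illustration, using the midpoints of the ranges quoted above, 2.5% and 25%, rather than figures from the cited sources), the reduction works out to roughly 90%:

    \[ \frac{25\% - 2.5\%}{25\%} = \frac{22.5}{25} \approx 0.9 \]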

    Variolation, of course, is not the only example of community-based disease management of its time that had public health overtones. For hundreds of years, mothers throughout Asia scraped the leading edges of the weeping ulcers of cutaneous leishmaniasis and inoculated their children's buttocks, protecting them against the cosmetically debilitating facial scarring of Oriental Ulcer.

    The next major step in smallpox control came from England and a physician named Edward Jenner (Riedel, 2005). Jenner observed that milkmaids rarely got smallpox or, if they did, it was very mild. He had also noted that milkmaids often had blisters on their hands similar to the skin lesions of smallpox. This was cowpox, or vaccinia (from the Latin for cow, vacca), acquired from the teats of the cows. In 1796, much as had been done previously by the Buddhist nun with Variola major and the mothers in Asia with Leishmania tropica, Jenner took the weeping, pus-laden fluid from a cow's teat blisters and vaccinated a young boy. Although the medical community resisted the technique of vaccination early on (starting a pattern of resistance to vaccination that persisted well into the 20th century), it rapidly achieved favor over variolation. The challenge was to maintain fresh vaccinia lesions to use as a source of inoculum material. In areas where the vaccinia virus was not available for protection against smallpox, such as parts of Africa and the Americas, the indigenous people suffered horribly from smallpox disease that had been introduced by Europeans during their periods of exploration, colonization, and conquest.

    DOGMA AND DISEASE: THE SOCIOLOGY OF PUBLIC HEALTH


    Many fledgling moralists in those days were going about our town proclaiming there was nothing to be done about it and we should bow to the inevitable.

    —Albert Camus, The Plague

    The basis of public health is trying to prevent or reduce future harm. Before people could record their history, these principles were passed orally from one generation to the next, explaining the risks of certain behaviors and activities. As social structures and religion evolved, many societies developed faith-based linkages between diet, cleanliness, decay, and decomposition and either health or social status. Beliefs about the positive health effects of diet and exercise, and of avoiding certain activities, behaviors, or objects, exist in many cultures, both ancient and modern, including the ancient Greeks and Egyptians, as well as ancient and current Chinese, Indian, Arab, Jewish, and many aboriginal populations. While some beliefs may have served to maintain social exclusion and positions of power for the wealthy, who could afford to live in cleaner or unfouled environments, they also provided a philosophy and examples showing that health could be affected by one's actions and environment. The fact that plagues, pestilence, and disease occurred more often among the poor, hungry, overcrowded, and poorly housed (what in public health would now be called health inequalities) (Commission on Social Determinants of Health, 2008) was often seen as the gods favoring the righteous or devoted. When disease did strike the penitent, it was often blamed on drifts from dogma.

    Over generations, the lessons learned from observation, bad experiences, and the instructions of teachers, prophets, and elders were codified. Rules of behavior in early human societies often included edicts that can be seen to have implications for public health, including protocols for dealing with the dead and cadavers. In Judaism, for example, members of the tribe of the high priests (the Kohanim), who are to remain pure so that they can bless others, are forbidden to touch a corpse or even to enter a cemetery. Similar codes of conduct were developed to deal with which animals can and cannot be eaten (e.g., what is kosher in Judaism or halal in Islam), and with the eating or use of animals that were found dead or that died on their own (forbidden). It has been hypothesized by some (e.g., Macht, 1953) that these and other prohibited foods are more likely to carry disease, such as shellfish (Vibrio), pork products (Trichinella), and scavengers (everything else).

    The development of writing allowed people to record their history, which often included the diseases they came down with (at least those they survived). Public health themes are recurrent components of those records (Pack & Grand, 1948). Descriptions of plagues and epidemics, because they affected large populations, cities, or entire states, recur in ancient and classical history, and these historical events continue to be referred to even in the modern public health literature (Morens et al., 2008).

    The waste and sanitation needs of small groups of people usually do not exceed the capacity of the environment to absorb or degrade what they leave behind. Larger mobile communities can manage waste and garbage simply by moving to new hunting or gathering grounds. The development of static towns and cities, however, required methods to avoid being swamped in the detritus of human and animal existence. Sewers and drains have been found in early cities, such as those of Minoan Crete (Corrigan, 1932). Bathing, physical fitness, and sport were activities and attributes of wealthy Greek and Roman cultures. Greek cities had regulations dealing with keeping streets and fountains clean (Arnaoutoglou, 1998). Cleanliness was often associated with religious activities and could involve ritual preparations for devotion or rites, and practices that needed to be performed after contamination or contact with evil, in either thought or deed (Oldenberg, 1993). Instructions regarding aspects of sanitation were provided by early religious texts. Deuteronomy 23:13 provides instructions on covering excreta:

    and you shall have a spade among your tools, and it shall be when you sit down outside, you shall dig with it and shall turn to cover up your excrement.

    The relationships between health, ritual, and religion have had a great impact on the sociological approaches to disease and illness, and some of those connections influenced how communities respond to sanitation and the risks, both real and perceived, posed by some infections. Clerical principles of cleanliness and purity could easily become transformed into an organized societal response to illness, particularly disfiguring, loathsome, or dirty diseases (Porter, 1999). Even before people understood the Germ Theory of disease, the belief that purity or religious status could be contaminated by contact with others of a particular occupation, status, or caste was based on the concept of risk of transmission of adverse outcomes (Laungani, 2005). Not only did religious guidance recommend how some diseases were to be dealt with, but many religions also recommended or required duty to the poor and sick, resulting in clerical institutions becoming involved in health care. These historical links between religious practice, dogma, and disease control underlie many aspects of modern disease control.

    Over time, the care of the ill by religious orders became associated with the development of what later became hospitals and hospices (Lewis, 2007). Some of these refuges for the sick also became involved in assisting those afflicted with a long-feared disease, leprosy. Because of its mistaken biblical associations (see Chapter 3), leprosy triggered a series of community actions sanctioned by the Christian church in an attempt to control the disease (Watts, 1997). Lepers were isolated, prevented from social contact, and housed or detained in specific institutions, known in Europe as leprosaria or lazarettos (Figure 1-3). The latter term owes its origin to Lazarus the Beggar, who, it was believed, suffered from leprosy (Luke 16:19-31).

    FIGURE 1-3. A leprosarium (from Wellcome Trust).

    FIGURE 1-4. Coins issued by a leprosarium in Colombia.

    By the 13th and 14th centuries there were thousands of leprosaria in Europe, with Britain alone having more than 100 (Brothwell, 1958). As seen in Figure 1-4, some leprosaria even issued their own currency.

    As people reflexively turn to the tools and practices that they know, it is not surprising that when Europe was faced with the plague in the mid-1300s, isolation and separation of the sick and the exposed were used to try to control the disease. Lazarettos and leprosaria provided the model, and in some cases the facilities, in which those arriving from plague-affected regions were detained. Experiments in timing, which we assume involved waiting until everyone who was going to get sick and die had done so, led to the determination that 40 days of isolation was most commonly sufficient (Clemow, 1929). Thus the process of quarantine (from the Italian quaranta giorni, forty days) was both named and developed.

    With the exception of a small number of infectious diseases of serious public health importance for which quarantine and isolation may be required, the sociological nature of public health has turned full circle. Many modern population-based public health principles recognize that isolation, marginalization, and lack of social contact actually create more problems than they solve. Public health activities and policies are now more focused on eliminating the economic, social, and environmental disparities that are known to affect health.

    BAD SMELLS, BODIES, AND BOARDS OF HEALTH: CONTROLLING DISEASE BECOMES A LEGISLATIVE RESPONSIBILITY


    The next bill was from the 23rd of May to the 30th, when the number of the plague was seventeen. But the burials in St Giles's were fifty-three—a frightful number!—of whom they set down but nine of the plague; but on an examination more strictly by the justices of peace, and at the Lord Mayor's request, it was found there were twenty more who were really dead of the plague in that parish, but had been set down of the spotted-fever or other distempers, besides others concealed.

    —Daniel Defoe, A Journal of the Plague Year

    The end of the medieval period was associated with many changes that affected the history of public health. Major epidemics, such as the plague, exceeded the abilities of religious orders to provide for the ill and destitute (see Figure 1-5). The number of cadavers that needed to be dealt with during the 17th-century bubonic plague in Europe challenged existing capacities in many locations (Byrne, 2006). To maintain local services, many municipalities, states, and, in some cases, entire nations became involved in assisting the poor and sick. Widespread poverty and illness were thus closely linked to the development of social support systems. Military forces required large contingents of men with at least a modicum of good health (after all, as a king, you wouldn't want to send out sick men to be killed, would you?), as did agriculture and other labor sectors. Maintaining economic and military services and capacities became more than a religious duty; it was a civic necessity. The ruling classes also knew that large numbers of sick, hungry, and unhoused people posed risks of social unrest that could threaten the political system (so much for charity as a noble endeavor).

    Much of the written history of public health has a European focus. Although many of the principles we recognize as core to public health were developed in Africa, Asia, and Europe, and shared among these regions, European scientific advances, coupled with imperial and colonial connections, provided much of the canvas on which public health as we know it today was painted. The relative political stability of Britain, combined with the growth of scientific methods, resulted in several important developments in public health being first observed, recorded, and then exported from there.

    Urbanization changed the character of many nations. Cities, with their large populations, had expanding linkages with other locations, increasing both the frequency and consequences of epidemic diseases. At the same time, the introduction of paper making, the development of the printing press, and greater educational opportunities provided tools that could be used to study health and the factors that affected it.

    FIGURE 1-5. Column in Vienna commemorating victims of the plague of 1605.

    The addition of some talented and inquisitive individuals who began to examine these relationships in more detail led to one of the major developments in public health—statistical analysis (although many students would not see this as a positive development).

    Church records routinely recorded births, marriages, and deaths. However, the more complex issues of taxation, managing debts and estates, and social support for the poor required additional methods of keeping track of who was where and how many of them there were. Parish death records, known in Britain as Bills of Mortality (Figure 1-7), served as an early epidemiologic record of the impact of plague and other epidemics.

    FIGURE 1-6. Plague doctor. Yale University, Harvey Cushing/John Hay Whitney Medical Library.

    FIGURE 1-7. The collected Bills of Mortality from London, England, December 1664. Yale University, Harvey Cushing/John Hay Whitney Medical Library.

    In England, a scientifically inclined clothing merchant named John Graunt examined local Bills of Mortality to try to understand the deaths of children and the nature of epidemics like the plague. His work, entitled The Observations upon the Bills of Mortality and published in 1662, is recognized as one of the earliest applications of statistical analysis in the field of health. A physician colleague of Graunt, Sir William Petty, was also interested in how quantitative measures could be used to study economic and social activity. He called this political arithmetic (which is to be differentiated from what happened in Florida in the 2000 U.S. Presidential elections) and discussed it with Graunt and others. In retrospect, the work of Graunt and Petty together represents one of the important beginnings of the science of statistical analysis. Using public records to quantify the impact of death, health, and poverty provided tools that governments and individuals could use in their political and commercial activities.

    In Europe, cities became larger and industry moved from the cottage or guild to the factory. Greater urbanization and industrialization generated wealth that improved many lives, particularly those of the rich, as the poor remained fodder for the sweatshops. However, as larger and larger numbers of people moved to rapidly expanding urban areas, they quickly overwhelmed municipal services that had been built for smaller populations. The results were urban congestion, frequent poor health, and bad social outcomes. The end of the long-standing worldwide conflict among England, France, and their allies also released large numbers of soldiers into a society that did not have jobs for many of them, further exacerbating the stress on existing social programs and services. Against this background of services and support systems that were less and less able to provide for those in dire need, the example of what a disenfranchised population could do, as demonstrated by the French Revolution, remained ever-present in the minds of those who governed. (There's nothing like a few beheadings to clarify one's mind.) Reducing the social and economic consequences of poverty and disease assumed increasing importance as the industrial revolution continued to change the face of many nations.

    In the 1830s, Britain began to look at what would now be called social assistance programs through a review of the Poor Laws, which had existed in one form or another since Elizabethan times. One of those involved in this process was the lawyer Edwin Chadwick, who began to appreciate the relationships between poor sanitation and poverty and their social consequences. (He must have been one of the 2.3% of lawyers who are given a bad name by the other 97.7%.) In 1842, he published The Report from the Poor Law Commissioners on an Inquiry into the Sanitary Conditions of the Labouring Population of Great Britain. This document represents one of the earliest national analyses of the relationships among poverty, economic development, sanitation, and health, and is considered a milestone in the development of public health. The report was delivered at a time when the Germ Theory of disease was not yet widely accepted in Europe. Much of the scientific and medical thought of the time associated the presence of many diseases with miasmas, bad smells, and poor social graces (which many parents still believe accounts for the behavior of their children). As such, some of the report's assumptions, such as the idea that health could be improved through better sanitation, were based on principles that were often incorrect. Despite this, improvements sometimes did occur. Cholera, while not a consequence of the bad odors and gases resulting from the lack of sewerage systems, was nonetheless reduced by the construction of effective sewers.

    In a situation that frequently recurs in the history of public health, both the political and social implications of Chadwick's report were too much for the government of the time to deal with. Change had to await the election of a new government in Britain, following which a Public Health Act was enacted in 1848, the first national legislation of its type (except for the U.S. legislation 50 years earlier, but we are being Euro-centric for the moment). As students of history know, 1848 was a period of great unrest and revolution in much of Europe, and attempts to improve the lot of the poor, so that they would not threaten the status quo of the wealthy, were common. The British legislation became a model of governance for many other nations, and many of its principles can be seen today in national approaches to dealing with public health...and stand-up comedy. The passage of legislation, however, does not necessarily lead to action, reinforcing the belief in the U.S. that the opposite of progress is Congress. In spite of the Public Health Act, real attention to cleaning up the sanitation of London did not start until a decade later, when a period of drought reduced the flow of the Thames River and its ability to clear sewage from the city. London suffered through what was called the Great Stink. Quickly, funds were allocated and an engineer, Joseph Bazalgette, began to build London's sewer system (Halliday, 1999). His work was copied in many places, including New York City.

    The frequent reference to British activities in public health does not mean that the British were alone in the field. In France, Alexandre Parent-Duchatelet and Louis-Rene Villerme studied and wrote about the health of the disadvantaged, the incarcerated, and those who worked in unsafe environments. Villerme established a prototypical medical journal
