The State of Health and Health Care in Mississippi
Ebook, 779 pages (10 hours)


About this ebook

In this multidisciplinary book, the editor and contributors provide the most accurate and most recent information on health and health care in the State of Mississippi. They explain why the state finds itself in precarious health conditions and reveal the prevailing circumstances as the state debates a path toward a comprehensive health care system for its citizens. They show who has had access to good health care in the state and celebrate the heroes who struggled to provide health care to all Mississippians, and contribute to the debate on how the health care system might be restructured, reconstructed, or adjusted to meet the needs of all people in the state, regardless of race, ethnicity, socioeconomic status, and national origin.

The issue of health disparities and socio-economic status leads to a relevant discussion of whether health and access to quality care are a right of all people, as the United Nations has proclaimed, or the privilege of a few who have the economic resources and the political clout to purchase first-rate care. The volume offers a clear understanding of health care trends in the state since the inception of its health system during the eighteenth and early nineteenth centuries up to the present and the prospects of transcending the obstacles of its own creation over the past two centuries. It likewise highlights the economic challenges that Mississippi, like other states, confronts; and how wise and realistic its priorities are in meeting the needs of its diverse populations, particularly racial and ethnic minorities.
Language: English
Release date: January 30, 2015
ISBN: 9781626743960


    Book preview

    The State of Health and Health Care in Mississippi - University Press of Mississippi

    PART A

    INTRODUCTION

    A Thematic Approach to Health and Health Care

    Mario J. Azevedo, PhD, MPH, MA

    The Global Origins of Public Health and Its Place in the United States

    The following introduction is an overview of the evolution of the field of public health in terms of its global origins, definition and disciplines, popular and scientific acceptance, and impact on global health, paying special attention to the United States from its colonial history and westward expansion to the present. The author hopes that this discussion sets the stage for an understanding of the state of health and health care in Mississippi, a state bedeviled by a history of discrimination and social inequality and by an apparent reluctance to accept the rapid scientific and educational advances clearly visible elsewhere on the North American continent. It is this lack of proactive vision and willingness, and, at times, outright refusal to accept change, that has set Mississippi apart from the rest of the United States.

    When the terms health and health care come up in the health literature and in ordinary and political conversation, two questions come to mind: What is health, and how does a nation or state achieve it for its citizens? This necessarily leads to the concepts of public health and medicine and their impact on people’s health and well-being. In this chapter, an attempt is made, first, to define or explain what health means and, second, to discuss the steps that Mississippi has taken over the centuries to ensure that all its citizens benefit from the medical and technological advances that have been made and from the growing successes achieved by public health agencies and individuals committed to a better and healthier state. Some experts have called utopian and unachievable the United Nations’ definition of health as a state of total physical, social, and psychological well-being and not simply the absence of disease. Writes Lawrence Gostin (2010, 1), professor at the Georgetown University Law Center and the Johns Hopkins Bloomberg School of Public Health: “Definitions of public health vary widely, ranging from the utopian conception of the World Health Organization of an ideal state of physical and mental health to a more concrete listing of public health practices.”

    The United Nations’ definition speaks, in fact, to the goal and the ideal at which society should aim; indeed, one may not be afflicted by any disease as we define it but still lead a life of misery stemming from one’s physical or psychological condition. Would we call such a person healthy? In this context, experts in public health, like many in other disciplines, sometimes disagree as to what public health means and what its functions might be. In reality, defining public health may not be that difficult or controversial. According to Aristotle and the followers of his thinking, such as the Thomism of Thomas Aquinas, defining means clarifying the genus (for public health, the field of sociobiological sciences) and the species (the population-focused discipline or disciplines) of an object, entity, or being, which is different from describing or analyzing its functions. Definitions are supposed to be clear, concise, and short, qualities that many definitions of public health lack. Consider, for example, the following definition, notwithstanding the fact that it was handed down to us by a famous epidemiologist, Charles-Edward A. Winslow, professor of public health at Yale University:

    The science and the art of preventing disease, prolonging life, and promoting physical health and efficiency through organized community efforts for the sanitation of the environment, the control of community infections, the education of the individual in principles of personal hygiene, the organization of medical and nursing services for the early diagnosis and preventive treatment of disease, and the development of the social machinery which will ensure to every individual in the community a standard of living adequate for the maintenance of health. (Gostin 2010)

    This long and broadly encompassing statement is not a definition but a general enumeration of the functions that public health serves. In this context, the statement by the Institute of Medicine (IOM) seems more appropriate and to the point: public health, now considered a field with several disciplines, aims at “fulfilling society’s interest in assuring conditions in which people can be healthy.” Winslow’s all-encompassing characterization of public health led Paul Starr, the now-famous professor of sociology at Princeton University and winner of the 1984 Pulitzer Prize in general nonfiction, to observe critically: “So broad—and downright subversive—a conception, if taken seriously, is an invitation to conflict. Public health cannot make all these activities its own without, sooner or later, violating private beliefs or private property or the prerogatives of other institutions” (Starr 1982, 180). In simpler terms, once we agree on a definition of health, public health means the branch of science (and art) that focuses on the health of populations rather than of the individual, with the physical well-being of communities as its ultimate goal rather than that of a single person or patient. The latter is, in fact, what medicine does in practice.

    It is generally accepted that public health’s core functions are health assessment, policy development, and assurance. In this context, Geoffrey Vickers, a British industrialist, put it aptly some time ago: “I believe that the history of Public Health might well be written as a record of successive re-definition of the unacceptable” (see Schneider 2011, 11). Indeed, as a field that encompasses a series of disciplines in the medical and natural as well as the socio-behavioral sciences (epidemiology, biostatistics, behavioral and environmental health, health policy and management, maternal and child health, nutrition, and, more recently, global health), public health’s emphasis may shift to target the prevailing disease environment and health changes, contingent upon scientific and technological advances and on society’s ability to overcome obstacles to productivity and longer life expectancy at birth.

    It is clear, therefore, that medical practice and public health, though related and dependent on each other, differ in their target and focus. As Gostin notes, whereas medical (clinical) practice aims at curing the individual, public health targets populations or segments of populations, trying to anticipate disease through assessment and surveillance and to prevent it through scientific and behavioral strategies, such as appropriate policies. It diagnoses the state of a population and finds the causes, risk factors, and associations that disturb its health equilibrium at a given period of time. Unlike the physician, who works at the level of the single patient, public health research collects, analyzes, and applies morbidity and mortality data on a population scale, making its findings available to the public; it helps develop policies that safeguard the health of the state, the nation, and the globe, with the ultimate goal of ensuring that access to basic health services is fairly distributed within the community, regardless of race, religion, origin, sexual orientation, and socioeconomic status. It is understandable, under these circumstances, that tension between public health and medical practice develops from time to time. As Starr aptly notes, “while diagnosis and educational services were expanded [in the US and elsewhere] under public health auspices, treatment typically was left to private physicians.” However, the line between the two fields was often difficult to draw: “Did a series of diagnostic tests, followed by advice about hygiene and diet, constitute health education or the practice of medicine?” (Starr 1982, 194).

    On the level of relevance and impact, it is also obvious that public health professionals cannot fulfill their mission and function without the committed assistance and the power of enforcement and persuasion of the state, community organizations, and even corporations. This complex relationship led to a confused understanding of what public health was in its years of early evolution and of what it is now. Starr notes that the blurred distinction between preventive and curative medicine that marked public health in its formative years, during the nineteenth and early twentieth centuries, has come to be known as fragmentation.

    The Impact of Epidemics in North America: The United States

    The American colonial effort to establish a system that later became known as public health was a response to the hundreds of epidemics that periodically devastated Europe and that were often carried across the Atlantic, or threatened to cross it, to infect the new settlements of Anglo-Saxon North America. Epidemics have been striking populations since time immemorial, beginning with Egypt (Africa), then spreading to Asia (India, China, and Palestine among the Hebrews) and Europe (Greece, Crete, and Rome), from the time when man led the life of a hunter to his evolution into a farmer of such crops as wheat, barley, and rice during the Neolithic Age (3000 BC). As the history of man on the planet changed radically over the centuries, fighting household diseases through latrines and baths was always associated with sanitation (clearing the grounds of refuse and building underground sewage systems); with clean water and personal cleanliness; with town planning for better ventilation and comfort; and with the provision of what we now might call primitive medical care, which included the administration of herbal treatments or home (folk) remedies, often associated with religious incantations, psychological manipulations, and primitive hospitals, as was the case in Egypt during the Middle Kingdom, ca. 2100–1700 BC (Rosen 1993, 1). The evolution of medicine during these early days resulted in some sort of specialization. Slowly, the age of superstition gave way to the age of reason, which embraced the theory of natural disease causation beginning with the most celebrated physician in Western civilization, sometimes called the father of Western medicine: Hippocrates of Greece (ca. 460–370 BC), founder of the method of clinical observation. Hippocrates wrote of three elements that he thought affected our health: water, air, and place (space) (Tulchinski and Varavikova 2000, 8).

    Thucydides (ca. 460–400 BC), the Greek historian and renowned author from Alimos, wrote of a devastating and unusual epidemic in Greece. The Romans had already suspected the adverse physical impact of lead (poisoning). The famous physician Galen, who practiced in Rome, believed in the miasmic explanation of disease, based on the theory that diseases were caused by unhealthful fog or vapor rising from the ground. This theory began losing traction during the Renaissance (1500–1750), which essentially propagated Hippocrates’s theories of disease etiology, basing medical assessment on the four humors of man (sanguine, phlegmatic, choleric, and melancholic) (Tulchinski and Varavikova 2000, 9). During the Middle Ages (1065–1350), people generally interpreted disease as the result of man’s sinfulness, befalling him as punishment from God. This explanation was often associated with witchcraft. Only prayer and penance were advanced as the most effective remedies for people’s misbehavior on earth.

    The monasteries ran many of the hospitals, which, led by religious orders and monks, were designed primarily to help the sick and the poor in an age when health conditions were extremely poor. At that time, 75 percent of children died before the age of five, mainly due to improper hygiene and sanitation and poor housing, all related to illiteracy and poverty. Leprosy, malaria, measles, scurvy, and smallpox outbreaks were commonplace. However, by the late Middle Ages, sociologist Starr and many historians tell us, public and private health had improved, and the first medical schools began to spring from the new universities, especially after the decrees promulgated by Emperor Frederick II of Sicily during the thirteenth century. These included licensing requirements and medical training consisting of three years of philosophy, five years of medical studies, and one year of supervised practice, followed by an examination and licensure; similar requirements were adopted by Spain in 1238 and Germany in 1347. Many of the medical schools became attached to old and new universities, including Salerno (tenth century); the University of Paris (1100); Bologna (1158); Oxford (1167); Montpellier (1181); Cambridge (1209); Padua (1222); Toulouse (1233); Seville (1254); Prague (1348); Krakow (1364); Vienna (1365); Heidelberg (1386); Glasgow (1457); Basel (1460); and Copenhagen (1478) (Tulchinski and Varavikova 2000, 12). Medical science and new health techniques were enhanced by the impact of Islamic science, which spread through Western Europe when the Muslims invaded and occupied parts of France, Spain, and Portugal, making Cordova one of the major centers of Islamic learning and scientific pursuit.

    In 1346–1350, the Black Death (pneumonic-bubonic plague), apparently introduced during the Mongol invasions from the steppes of Central Asia and through accelerated trade and commerce, decimated some 24–30 million people, or one-third of Europe’s population, in those four years. Subsequent plagues, spanning the seventeenth through the nineteenth centuries, devastated such cities as Paris, Marseilles, London, and Moscow, and parts of India. The plague was so virulent that cities such as Novgorod banned public funerals, and Czar Boris Godunov went so far as to ban trade in Russia during the seventeenth century. In England, the enclosure movement was partly a response to the frequent deadly plague outbreaks. However, the developments that finally and drastically stimulated the improvement of health conditions in Europe were the meticulous recording of mortality rates and their causes by John Graunt in his Natural and Political Observations upon the Bills of Mortality (1662), considered the first statistical and systematic approach to the study of disease; Antony van Leeuwenhoek’s microscopic observation of microorganisms in 1676, which strengthened the theory of the germ as the etiology of infectious disease; the advent of the modern hospital and its bureaucracy, which began to emerge in Europe after 1750 and which the first colonizers transplanted to the New World; the emphasis on public sanitation and individual cleanliness; new notions of surgery; Edward Jenner’s 1796 work on a vaccine against smallpox (a disease that, incidentally, had almost decimated the entire Inca population of South America); John Snow’s observational study of two Thames River water companies in London, whose pumped water he suspected of being the source of a deadly cholera epidemic among the residents; and the germ and bacteriological theory advanced by Louis Pasteur in Paris in the 1850s–1870s, culminating in Robert Koch’s success in culturing the tubercle bacillus in 1882.

    Setting aside the earliest epidemics of the ancient world, including the bubonic plague that hit the Roman Empire in AD 592, we may recall here the famous 1346–1350 bubonic plague, or Black Death, already noted, responsible for the death of over 100 million people during the following hundred years (see History of Health, www.relfe.com/history_1.html). In 1557, a devastating influenza outbreak hit Europe, and a subsequent one, in 1563, killed 20,000 people in London alone. In 1667, the first clearly recorded smallpox epidemic and a dysentery outbreak spread throughout Europe, followed by a malaria epidemic in England in 1675, which led to the acceptance of quinine, a product of the bark of a tree made known to the Spaniards by Peruvian Indians in 1638 and which turned out to be the best medication against the disease. In North America, yellow fever appeared in Philadelphia in 1699, with further outbreaks in 1702 and 1762 and an epidemic in 1793 that spread much more widely across the continent; several episodes of influenza also struck New England in 1793. Tuberculosis also became a major curse, with a death rate of 700 per 100,000 in 1812, prompting the United States to begin rigorous control efforts by 1880. A typhoid outbreak killed 25 people in Bridgewater, Connecticut, and 20 in Concord, New Hampshire (a vaccination campaign against typhoid started in the US in 1911). While the South usually followed the North in strategies designed to contain and eradicate diseases, especially infectious ones, the North followed the advances and treatments adopted in Western Europe, especially in England and France.

    In Mississippi, hookworm disease was common, particularly among the African slaves who carried the parasite, which hatched in the fields where the slaves worked. Beriberi, a vitamin B deficiency disease, was a common paralytic disease in colonial America. Acute bacillary dysentery (bloody stools occurring at least three times a day) and typhoid also caused havoc throughout the South, killing many people in the process. Typhoid usually manifests itself in a prolonged, burning, debilitating fever, occurs most often in the hot months of the year, and causes death more often than not. In North America, this disease was first spotted in Virginia by Reverend Robert Hunt. It is said that typhoid killed more soldiers than any other disease in conflicts such as the American Civil War, in places like Vicksburg, sometimes causing a serious shortage of men to fight (Benenson 1984, 1–12). Smallpox, caused by the variola virus, was very common in Mississippi. It is an extremely contagious disease, which leaves scabs and scars on the victims’ faces and other parts of the body. The colonies tried to prevent infectious diseases from spreading by isolating, quarantining, and inoculating people. Smallpox has been eradicated since 1977, and poliomyelitis has been eliminated from most of the globe, including most of the developing world (persisting in Pakistan and northern Nigeria); yet either could one day reemerge, as polio threatens to do in some developing countries, where preventive measures, such as childhood immunization, remain inadequate. All over the globe, the eradication of smallpox has been heralded as one of the most notable accomplishments of modern public health.

    Yellow fever was rampant in Mississippi and other parts of the South and Southeast. Its debilitating symptoms (chills and fevers) strike the victim within a week, and the disease killed over 50 percent of its victims. Yellow fever seems to have hit Philadelphia for the first time in 1693, apparently brought in from Barbados. Outbreaks were common in New York, Philadelphia, New Hampshire, and such southern territories as Texas, Florida, and Mississippi, in areas along the Mississippi River, and in the city of St. Louis, Missouri. In fact, Philadelphia, capital of the United States from 1790 to 1800, lost that status partly because of its vulnerability to frequent yellow fever outbreaks. As in ancient times, when people ran away from lepers, during yellow fever outbreaks, which saw the number of deaths and burials rise, neighbors isolated themselves from one another, closed their doors to strangers, and avoided crowds at the workplace. Malaria, caused by the Plasmodium parasite (most dangerously Plasmodium falciparum) and carried by the female Anopheles mosquito, was another deadly disease that afflicted the first European and African inhabitants of North America. The path of the disease seems to have run from Maryland, probably from such busy ports as Baltimore, through Georgia, Alabama, and Florida, and inland to Ohio and Missouri and down to the Gulf of Mexico.

    In 1933, encephalitis showed its ugly face in St. Louis, Missouri, spreading to Mississippi much later, during the 1970s. Throughout the eighteenth and nineteenth centuries, continued suspicion of the harmful effects of the smallpox vaccine, and alleged deaths from it, seems to have slowed its use in Europe and North America as a movement against it grew on both sides of the Atlantic. In 1871, for example, a special Privy Council Committee in England was convened to review the Vaccination Act of 1867 because, it was claimed, some 97.5 percent of those inoculated against smallpox had died. In 1938, 58 physicians signed a letter against mandated vaccination against diphtheria because the disease had virtually disappeared in Sweden, where no vaccination program had been adopted. More recently, the last scare over compulsory flu vaccine occurred during the administration of President Gerald Ford in 1976. The controversy over the effects of vaccines, of course, is still a topic of conversation in the United States. Mississippi followed closely what was happening in the rest of the country, particularly in the North, and adopted the recommended preventive steps, even though poverty always curtailed much of what the state could do to protect its citizens adequately against the frequent outbreaks of malaria, yellow fever, smallpox, diphtheria, syphilis, and encephalitis, the last often caused by the West Nile virus, with residents of the swampy areas of the state being more vulnerable.

    War, Medicine, and Public Health

    Throughout history, the treatment of war casualties and victims has taught societies many medical lessons—not that war is needed for the advancement of medical and public health science and practice. However, just as the world learned from the Crimean War (1853–1856) and the work of Florence Nightingale, which revolutionized the way hospitals are run, so did Mississippi learn from major war scenes such as the siege of Vicksburg, when the Confederate army defended the city after it came under attack from Union troops under General Ulysses Grant in 1862–1863. This topic leads to a discussion of the epidemiology of injury and the spread of infectious diseases, using the Vicksburg siege as an example. Throughout this famous siege, disease conditions in Vicksburg were so dire that, at one point, one side could have as many as one-third of its soldiers ill (as happened in November 1862 in the Confederate army), rendering them unable to continue fighting. Sometimes about half the men on either side were debilitated by disease. The illnesses were usually the same: malaria, diarrhea (chronic, acute, and dysentery), and typhoid, and, at times, yellow fever, smallpox, and measles. Diarrhea was usually caused by contaminated water and food.

    Ordinarily, while diarrhea and malaria made up half the sick and wounded, the other half consisted of a large number of injuries and other illnesses (Freemon 1991, 433), prompting one expert to write that “about half of the deaths from disease during the Civil War were caused by intestinal disorders, mainly typhoid fever, diarrhea, and dysentery,” followed by pneumonia, tuberculosis, chickenpox, mumps, whooping cough, and scurvy. Measles became a major concern for the Union army, affecting, at one point in November 1862, 1 percent of the troops. As noted above, smallpox was also common, sometimes peaking on the Confederate side and then moving to the Union ranks and vice versa. During the siege of Vicksburg, malaria attacks struck a million soldiers on both sides. Thus wrote soldier Osborn Oldroyd in his diary at Vicksburg, after half of his companions had been decimated by disease and the war: “Only half of our company is left now, and after two years more, what will have become of the rest?” (Retrieved June 25, 2011, from http://www.cw-chronicles.com/blog/category/asoldier%E2%80%99s-story-of-the-siege-of . . . ).

    Typhoid fever seems to have taken its toll on the combatants of both sides as well, accounting for the deaths of one-quarter of the noncombatants in the Confederacy, salmonella bacteria being the major culprit. The major killers (typhoid fever, diarrhea, pneumonia, and poor diet) were exacerbated by the swampy area around Vicksburg, a breeding ground for malaria and yellow fever mosquitoes. Lack of hygiene and short supplies of food and balanced nutrition became a concern on both sides, though more so for the Confederates. One student of the siege wrote that famine was so severe that soldiers invaded civilian farms and ate almost any animal they saw moving in the streets: “As the siege wore on, fewer and fewer horses, mules, and dogs were seen wandering about Vicksburg, and even shoe leather became a last resort of sustenance for many adults” (Korn 1985, 149–52). The stench of soldiers’ corpses, at times left to rot for three days in the Mississippi heat before burial, was horrific, as the burial grounds could not hold so many so fast. The attending doctors tried their best, but the wounded and the sick often overwhelmed the health facilities. Bloodletting and the amputation of limbs were the most common surgical procedures for many of the wounded. The estimated number of soldiers wounded in their extremities seems to have been 175,000 on each side, resulting in 30,000 amputees, with copious use of chloroform to reduce the pain. Thus we learn that soldiers were heard screaming desperately not from the pain of amputation but from the frightening news that their limbs would have to be amputated.

    The operations took place in crowded and unclean makeshift facilities or field hospitals behind the front lines. The operating instruments were not sterilized, as this method of preventing infection was still unknown, and, with water scarce, surgeons operated on one wounded man after another without ever washing their hands (Retrieved June 25, 2011, from http://www.cw-chronicles.com/blog/category/asoldier%E2%80%99s-story-of-the-siege-of . . . ). The frequent amputations, the carelessness of surgeons overwhelmed by the great number of patients, the unhygienic treatment of the infected, and the overcrowding of the field hospitals helped spread disease almost immediately after treatment, resulting in gangrene and in staphylococcus and Streptococcus pyogenes infections, while pus oozed from the wounds of just-treated patients. Sadly, the soldiers knew that much of their postoperative illness and suffering came from the operations themselves, which they called “surgical fevers” (http://civilwarhome.com/civilwarmedicine.htm). Interestingly, it appears that 75 percent of the amputees survived. Historical records of the Civil War reveal that, altogether, from 1861 to 1865, the medical officers who served alongside the troops numbered 4,000 for the Confederacy and 13,000 for the Union, the latter assisted by 4,000 nurses.

    It is important to note for this chapter that, despite the awful circumstances surrounding treatment, the physicians, surgeons, nurses, and all other medical personnel learned invaluable lessons from the Civil War, both in combating disease and in operating on the injured. The efficacy of quinine, used sparingly against malaria because supplies were often insufficient, and of vaccines against infectious diseases showed the medical personnel methods and treatments that could be improved in a quieter setting—in civilian hospitals and health care centers. The Encyclopedia of the Civil War summarizes the medical lessons learned toward the improvement of health care:

    Throughout the war, both the South and the North struggled to improve the level of medical care given to their men. In many ways, their efforts assisted in the birth of modern medicine in the United States. More complete records on medical and surgical activities were kept during the war than ever before, doctors became more adept at surgery and at the use of anesthesia, and perhaps most importantly, a greater understanding of the relationship between cleanliness, diet, and disease was gained not only by the medical establishment but by the public at large. Another important advance took place in the field of nursing, where respect for the role of women in medicine rose considerably among both doctors and patients. (Retrieved June 25, 2011, from http://www.civilwarhome.com/civilwarmedicine.htm.)

    Interestingly, addressing the impact of the disease factor on the siege of Vicksburg, some historians and public health experts have advanced the theory that “as the siege [of Vicksburg] progressed, the Confederate Army grew more ill and lost a greater proportion of its fighting force. It seems reasonable,” continues Freemon, “to conclude that medical care affected the course and possibly the outcome of the Vicksburg Campaign” (Freemon 1991, 438). Obviously, the two World Wars must also have had a tremendous impact on health and the conduct of medical practice in America. Suffice it to point out a few important developments from these two Great Wars (1914–1918 and 1939–1945, respectively) that also affected Mississippi.

    According to the experts, World War I (1914–1918) forced the Allies to improve their administrative operations in health, appreciate the importance of hygiene and sanitation, and modify or enhance their surgical techniques. Health services on the battlefield and at home had to be increased twenty-fold, compelling the US to deploy 29,602 physicians as reserve officers; to streamline and better organize doctors for the effective and quick treatment of soldiers; to place some physicians in charge of wound care; to create fracture clinics in general hospitals by 1917; and to perform reconstructive surgery of bones and joints, all while enhancing practice with the accelerated and improved use of X-rays. In England, at the urging of Prime Minister Lloyd George, public health and health services received a major boost as health centers became widespread and the emphasis shifted to the need to bring preventive and curative medicine together (Bennett 1990, 741). Antiseptic practices to treat wounds were upgraded, and only physicians with the necessary training and experience in the substitution of hypertonic salt solutions for antiseptics, in auto-inoculations, in vaccines, and the like could provide such services, accompanied by techniques for the cleansing and removal of debris and loose devitalized tissues from the wound (Bennett 1990, 739).

    This resulted in complete agreement among the armies and their governments on the need for thorough excision of dead, badly damaged, or grossly infected tissue. Hygiene continued to be refined with better ways of purifying water, disposing of waste, and maintaining field sanitation, and with the administration of vaccines against tetanus, typhoid, and paratyphoid ailments. Overall, the number of deaths from typhoid decreased by 90 percent, the disease claiming 30 to 51 times more victims among the nonvaccinated on the battlefield than among the vaccinated. We may recall the historical fact that typhoid fever killed more soldiers in the Crimean War (1853–1856) than combat itself. Among the British soldiers, who numbered some 1,200,000 troops, typhoid cases fell to 7,423 and caused only 266 deaths. Blood transfusion, an indispensable medical feature of our times, was used for the first time in World War I (1914–1918); trench fever, caused by lice infestation, was discovered and treated; trench foot, whose symptoms were cold, swollen, red feet, rendering them numb and blistered, was resolved through dry boots, anti-frostbite grease, and foot powder; and gout was studied and treated more effectively. All the improvements in health brought about by the harsh and deadly conditions of the war contributed to concrete health results, including much lower infant mortality rates in Europe and America.

    World War II (1939–1945) unwittingly helped perfect the techniques and discoveries that had emerged from treating the wounds, illnesses, and deaths of soldiers in the previous war. The use of M&B in adequate dosages, produced for the first time in great quantities by pharmaceutical companies to treat sore throat, pneumonia, and gonorrhea, from which many soldiers suffered, saved many lives. Alexander Fleming had already discovered penicillin, but companies were urged to produce it on a larger scale; by 1945, it had become 20 times more potent and more effective and could be administered faster than ever before. Burn and skin-graft centers were established, and blood transfusion, according to one writer, had become a sophisticated, well-oiled machine by the end of the war for treating wounded soldiers. The changes included blood storage and distribution, as well as the massive inoculations against tetanus begun in WWI, all of which proceeded at an accelerated pace. It was also during the Second World War that defense against chemical warfare brought about the development of face and head masks against poison gas. Earnest research on preventing mosquito bites was another legacy of the second of the two Great Wars. Thus, man's actions are sometimes able to draw good from the consequences of evil by meeting challenges head-on, voluntarily or involuntarily. As for Mississippi, in all, 157,607 men and women eventually registered for service in the Second World War, 75,977 whites and 81,548 blacks, and, by the end of the war, some 19,296 whites and 24,066 blacks had been inducted into the military. Soldiers brought home many lessons from the war.

    Pre-Twentieth-Century Medical Practices and Public Health in America

    Prior to the nineteenth century, medicine and public health in America were thoroughly underdeveloped fields. The best cure, apart from the proven power of plants and concoctions of herbs, powders, and other exotic mixtures, was bloodletting, since there was no accurate understanding of disease etiology establishing convincingly that bacteria, germs, and viruses had much to do with infection. Quite often, in the process of bloodletting intended to purge the body of the harmful presence of unexplainable physical conditions, people lost so much blood that they bled to death. As someone wrote of colonial life and cures, "No medical college existed in the colonies before the Revolution. The practice of blood letting for almost any disease was universal; and if the physician was not at hand, this was done by the barber, the clergymen, or any medical amateur." As a result, back then, anyone could claim to be a doctor: witches set themselves up as physicians; governors and teachers claimed medical expertise; and quacks and self-made healers went door to door to cure the sick, receiving a few dollars for their work. Those who knew something about medicine had served brief apprenticeships. As we know, there was no distinction between physicians and surgeons. Many still claimed, like the English physician Nicholas Culpeper (1616–1654), who blended herbalism with a touch of alchemy, that they could cure cases of lustfulness, melancholy, intemperate dreams, the bites of vipers, serpents, and mad dogs, and the plague (Colonial Gazette 2011).

    Culpeper spoke and wrote of the potent healing effects of adder's tongue (leaf juices taken with distilled water) for wounds of the breast and bowels; angelica for the heart; wild and stinking arrach for womb problems; prickly asparagus (sparagus or sperage), boiled in white wine, for kidney stones; wood betony against epidemic disease and witchcraft; celandine against obstructions of the liver and gall and against yellow fever; cowslips against skin wrinkles; eyebright herb in white wine for eye problems; foxglove to heal fresh wounds; garlic for female ailments; hemlock against body swelling; mint to repress milk in women's breasts; rosemary for stomach disease; sage to help expel a dead child; wood sorrel to fight ulcers; melancholy thistle to expel melancholy from a person; and viper's bugloss against serpent venom. Throughout the seventeenth century, most households practiced home medicine and took care of their sick family members by following what was commonly thought to be curative at the time. Popular, for example, was the treatment consisting of toads burned to a crisp and powdered, then taken in small doses for diseases of the blood.

    A summary of two articles on bloodletting and purging, at times called "depletion" in the US (one by Lewallan and the other by Hitti, appearing in 2001 and 2004, respectively), notes that this medical practice had become the panacea, the state of the art of disease cures, from the early colonial days up to the late eighteenth century. Indeed, many experts believe that George Washington died on December 14, 1799, from over-bleeding when treated by Benjamin Rush, MD, a 1768 graduate of the School of Medicine at Edinburgh. Rush had drained some 80 ounces of blood from the ex-president's body after he had caught what is thought to have been acute pneumonia, apparently contracted from exposure to sleet and snow as he rode a horse on his farm. Rush believed that blood depletion was the best remedy against all kinds of illnesses, especially infectious diseases such as the yellow fever epidemics that were very common in Philadelphia. He taught his more than 3,000 students and followers that bloodletting was required whenever there was an unusual [or slow] pulse, or when stool and vomit, and sometimes cold hands and feet, afflicted a patient. Rush, whom the English pamphleteer William Cobbett sarcastically called the "Pennsylvania Hippocrates," was such a believer in the theory of bloodletting that he invoked God and his sense of patriotism against his opponents. Rush once told his listeners:

    By the proximate cause of fever I have attempted to prove that the inflammatory state of fever depends upon morbid and excessive action in the blood-vessels. It is connected, of course, with preternatural sensibility in their muscular fibers. The blood is one of the most powerful stimuli which act upon them. By abstracting a part of it, we lessen the principal cause of fever. The effect of bloodletting is as immediate and natural in removing fever, as the abstraction of a particle of sand is to cure an inflammation of the eye, when it arises from that cause. (See Lewallan 2001, 1)

    Rush warned his opponents that God had predestined that illnesses be treated that way and advised his fellow physicians to dissociate themselves from European practices and idiosyncrasies. Thus, when his own family members were fatally struck by yellow fever in 1794 while he survived the epidemic unscathed, Rush interpreted the contrast as a miracle. In the course of his lifetime as a physician, Rush is said to have administered more bloodlettings than any other known physician of his era. For example, from a lady by the name of Mrs. Fries, Rush drained eight bleedings totaling 30 ounces in 17 hours, the induced bleedings eventually numbering 15 in a week. He caused one patient to bleed 100 ounces, and another 114 ounces, in five days. Despite the frequency of the bloodlettings, Philadelphia was hit by unprecedented death rates from yellow fever during the 1790s because the frequency and volume of blood removed from patients either resulted in death or chronic morbidity (Lewallan 2001, 2) that led to more deaths.

    This intrepid defender of bloodletting, simultaneously a very influential member of the College of Physicians at the time, preached that his treatment gave physicians the ability to choose the time and place of extracting the bad blood that the body would otherwise naturally discharge through other orifices via diarrhea, vomiting, and menstruation, as quoted by the same author above (2001, 2). As a physician, Rush and his followers favored cure and treatment, the true responsibility of the physician, over emphasis on pathology, laboratories, and experimentation. But his stance and arrogance provoked such an adverse reaction from his colleagues that he was denounced for the deaths bloodletting seemed to have caused and was ridiculed in public. Cobbett compared him to an ugly old hag that despised beauty, while characterizing his attitude as the most insolent pretension to superiority ever set up by mortal man. The attacks on his position and personality were such that Rush sued his enemies, which was the only way the vitriolic attacks eventually stopped.

    Looking at the dispute over bloodletting with hindsight and in light of the medical advances achieved against infectious diseases, Hitti concludes that there might currently be cases where bloodletting can be a remedy, as with Staphylococcus aureus (staph). Staph apparently thrives on heme iron in the human body, and bloodletting may be a treatment (Hitti 2004, 1). However, in another interesting study, Lewallan (2001, 1) stresses that Rush was more wrong than right; bleeding did not cure the body: it merely reduced the black color and relaxed the brain. Obviously, without blood for the body to discharge [he adds], there were no crucial white blood cells and platelets to fight the disease. Interestingly, in many villages of the developing world, especially in Africa, bloodletting, with the blood drawn by a traditional healer through a small hollow horn or horn-shaped device, is still a popular method of treatment. The healer makes an incision in the patient's target area, usually the leg, and initially sucks on the horn to start the blood flowing, without, however, swallowing it, much as someone might start siphoning gasoline from a vehicle's gas tank using a hose.

    Native Americans, notably the Choctaw in Mississippi, were usually considered well versed in the diagnosis and treatment of disease. Colonists admired the Choctaw people and at times attributed their knowledge and medical power to God, who had endowed them with wisdom and with land rich in curative plants. Some settlers even advocated marriages with American Indians to learn and practice their secret medical power. In this context, persons of mixed Indian and European descent were at times feared and asked to share their secrets or help treat incurable diseases. However, at that time, between the Middle Ages and the discovery of viruses in the nineteenth century, the soundest remedies against influenza, diphtheria, tuberculosis, and gastrointestinal infections remained sanitation, hygiene, and isolation, as in ancient Egypt in Africa and Greece and Rome in Europe. Isolation in the form of quarantine (a word derived from the Italian quaranta giorni, forty days, adopted once thirty days were considered not long enough) was still the major frontline defense against epidemics, forcing the infected person to live almost incommunicado.

    References

    Benenson, Abraham S. 1984, January–February. Immunization and Military Medicine. Reviews of Infectious Diseases 6(1): 1–12.

    Bennett, J. D. 1990. Medical Advances Consequent to the Great War 1914–1918. Journal of the Royal Society of Medicine 83(11): 738–42.

    Bloodletting’s Benefits. 2013. Retrieved December 13, 2011, from http://men.webmed.com/news/20040910/bloodlettings-benefits?printe=true.

    Civil War Medical Notes. n.d. Retrieved June 25, 2011, from http://www.civilwarhome.com/civilwarmedicine.htm.

    Colonial Gazette. Retrieved June 18, 2011, www.mayflowerfamilies.com/enquirer/nicholas_culpeper.html.

    Freemon, Frank R. 1991, September–October. Care at the Siege of Vicksburg, 1863. Bulletin of the New York Academy of Medicine 67(5): 429–38.

    Gostin, Lawrence O. 2010. Public Health Law and Ethics: A Reader. Los Angeles: University of California Press, Center for Law and the Public's Health.

    The History of Health. 1994. Retrieved June 9, 2011, from www.relfe.com/history_1.html.

    Hitti, Miranda. 2004. Bloodletting’s Benefits: Ancient Medical Practice Worked, Study Shows. WebMed Health News.

    Korn, Jerry. 1985. War on the Mississippi: Grant’s Vicksburg Campaign. Alexandria, VA: Time-Life Books.

    Lewallan, James B. 2001. Dr. Benjamin Rush: Progressive Patriot in the War against Yellow Fever Epidemic of the 1790s. Retrieved October 25, 2001, from http://ww4.samford.edu/schools/artsc/lewallan.htm.

    Rosen, George. 1993. A History of Public Health. Baltimore: Johns Hopkins University Press.

    Schneider, Mary-Jane. 2011. Introduction to Public Health. 3rd ed. Boston: Jones and Bartlett Publishers.

    A Soldier’s Story of the Siege of Vicksburg. 1885. Retrieved June 25, 2011, from http://www.cw-chronicles.com/blog/category/asoldier%E2%/080%/99s/story-of-the-siege-of . . . .

    Starr, Paul. 1982. The Social Transformation of American Medicine: The Rise of a Sovereign Profession and the Making of a Vast Industry. New York: Basic Books.

    Tulchinsky, Theodore, and Elena Varavikova. 2000. The New Public Health. New York: Academic Press.

    Chapter 1

    A HISTORY OF HEALTH AND HEALTH CARE IN MISSISSIPPI

    Mario J. Azevedo, PhD, MPH, MA

    Introduction: Poverty, Race, and Health Disparities in Mississippi

    The State of Mississippi was originally explored and colonized by the Spaniards, and successively by the French and the British, from the sixteenth to the eighteenth centuries. It became the Territory of Mississippi by an act of Congress in 1798 and the twentieth state of the United States in 1817. Mississippi currently covers a surface area of 46,907 square miles and has approximately 2,967,297 people, of whom 1,636,272 live in rural areas (USDA-ERS). The state's largest cities are Jackson, Gulfport, Hattiesburg, and Southaven. The Census Bureau estimates that 59.1 percent of Mississippi's population is Caucasian, 37.0 percent African American, and 2.7 percent Hispanic/Latino. In terms of health facilities, Mississippi has 99 hospitals, 72 of which are located in rural areas (North Carolina Rural Health Research and Policy Analysis Center December 2008). The state has 27 hospitals identified by the Flex Monitoring Team as Critical Access Hospitals, and rural health clinics number 167 (Kaiser Family Foundation 2011). Some 21 federally qualified health centers provide services at 170 sites. In an attempt to provide more efficient health care to its citizens, health officials have divided the state into nine public health districts: District 1 (northwest Mississippi); District 2 (northeast Mississippi); District 3 (Delta counties); District 4 (Tombigbee area); District 5 (west-central Mississippi); District 6 (east-central Mississippi); District 7 (southwest Mississippi); District 8 (southeast Mississippi); and District 9 (Gulf Coast).

    Studies conducted on poverty and ill health have shown an association, or the likelihood of an association, between the two. In their study of this issue, Eastwood and Lipton (1999) noted that the association between poverty and ill health reflects causality running in both directions. The World Health Organization (WHO) called this relationship a vicious circle, whereby poverty breeds ill health and ill health maintains poverty (Wagstaff 2002). Most experts point out that one of the primary causes of ill health, poverty, or both is the socioeconomic inequality or disparity that our society continues to allow between population segments, sometimes aggravated by the preference for one race or ethnicity over another. As McCally (2004, 1) put it:

    A growing body of research confirms the existence of a powerful connection between socio-economic status and health . . . Absolute poverty, which implies a lack of resources deemed necessary for survival, is self-evidently associated with poor health, particularly in less developed countries . . . Strong evidence now indicates that relative poverty, which is defined in relation to the average resources available in a society, is also a major determinant of health in industrialized countries [such as the US].

    In 2008, the Centers for Disease Control and Prevention (CDC), in partnership with schools of public health, was quite clear in its statement on health and poverty:

    Health may be considered a multi-sectoral issue, involving access to care and services, transportation, health insurance of some type, education, individual and family well-being, housing, and community-level issues such as neighborhood safety. Concomitantly, many health problems are exacerbated by the poverty that impacts family and community well-being. While poverty is associated with increased risk for multiple adverse health outcomes, it is typically not directly addressed in public health interventions. Similarly, whereas micro-enterprise is a fairly widespread approach to poverty alleviation, it is not generally considered a public health intervention. (CDC Public Health Reports 2008, 9)

    The CDC has also made it known, with HIV/AIDS for example, that poverty has much to do with the chances of the poor becoming infected and remaining in ill health. The CDC declared of heterosexual individuals in January 2010: Heterosexuals living below the poverty line in US cities are five times as likely as the nation's general population to be HIV-positive, regardless of their race or ethnicity (Maugh II 2010, 1).

    Most of the studies conducted during the past decades point to income inequalities or disparities as the precursors and factors most responsible for the existence of abject or absolute poverty. Low income in a large population may result, for example, from unemployment or underemployment, often contributing to ill health and poverty, aggravated by lack of adequate education; nonavailability of clean drinking water; inadequate health insurance coverage; unsanitary conditions; long travel times or distances to local primary health care facilities; and the absence of essential drugs. Several studies have demonstrated that household income is inexorably linked to health costs, health disparities, and, ultimately, poverty, and is probably a larger cause of impoverishment than out-of-pocket payments for health services (WHO 2002, 98). These conditions, in turn, adversely affect people's ability to afford and enjoy the necessities of life, namely shelter, food, clothing, and health care, access to which depends on acceptable infrastructure and on taming adverse geographical factors. For centuries, income distribution in the household has favored the male, one reason why women and children in general have tended to be more disadvantaged economically and more susceptible to ill health. Indeed, higher income is associated with more frequent and more intensive use of health services in both the private and public sectors. On lack of insurance coverage, the same body of research shows, as noted above, that the poor, who are the most price-sensitive users of health services, frequently face a higher price at the point of use because they are less likely to have insurance coverage, whether private or public (Norton et al. 2000). Experience has also shown that illness tends to perpetuate absenteeism, which may result in firings or reduced earnings for the individual or the family, making both less able to make ends meet. Thus, many sociologists and public health professionals believe that eliminating socioeconomic disparities would contribute to better health, hence the importance of targeted, enlightened national, regional, and state policies.

    Further, as Krieg et al. (2003, 312) note, the disparity gap in health is wide, and the net effect has been to remove from view, and from policy discourse, the pervasive patterning of US health disparities by socioeconomic position within and across racial/ethnic groups, as well as to retard understanding of the contribution of economic and noneconomic aspects of racial discrimination to US racial/ethnic disparities. Obviously, the concept of poverty, a relative term associated with country and class, must be defined appropriately. The United Nations and the World Bank define a poor person in the developing world as someone who lives on less than $2.00 a day. The 1990 US Census defined a poor household as a couple and two children making only $12,647 a year, or $34,417 for a family of nine (1999). Applying this definition, 14 of Mississippi's 82 counties, as shown below, fit the poverty criterion established by the census: Bolivar (30.1 percent); Claiborne (29.2 percent); Coahoma (30.6 percent); Holmes (33.7 percent); Humphreys (32.7 percent); Issaquena (34.7 percent); Jefferson (29.6 percent); Leflore (31.6 percent); Quitman (30.0 percent); Sharkey (32.0 percent); Sunflower (34.3 percent); Washington (29.5 percent); Wilkinson (30.8 percent); and Yazoo (29.1 percent). Greenville, a major city in the state, has a poverty rate of 29.6 percent.

    From its inception, the State of Mississippi knew that poverty and racial discrimination had had, and would continue to have, a negative impact on black people's access to health care. It was no secret, therefore, even to segregationists, that legal separation accentuated the health disparities in the state. Blacks had to travel far to be treated at black hospitals only, even if they lived closer to one built for whites. In the counties, the average distance to the nearest black hospital was about 50 miles. Prior to the civil rights movement, black hospitals totaled only 11 in the state: Dumas Infirmary in Natchez; Afro-American Sons and Daughters Hospital in Yazoo City; Colored Hospital in Lexington; Dr. Miller's Hospital in Yazoo City; King's Daughters Hospital in Greenville; Mound Bayou Community Hospital in Mound Bayou; Taborian Hospital in Mound Bayou; Plantation Hospital in Scott; Rosedale Hospital in Rosedale; Yazoo Clinic Hospital in Yazoo City; and Greenwood Colored Hospital in Greenwood. It often happened that, by the time a patient arrived at the hospital, his condition had deteriorated, and he was then admitted to a facility that had fewer resources and staff than a hospital reserved for whites. A 2008 study by Zheng and Zhou on the impact of distance to the hospital on the treatment of injuries from vehicle accidents for black Mississippians concluded:

    On the average, distance to nearest hospital fell by 50 miles for blacks after integration. We also show that distance and accident mortality were positively correlated: Increases in distance to the nearest hospital were associated with higher mortality. Combining the treatment effects of distance with integration, we conclude that hospital integration reduced African-American mortality from car accidents by 15 percent. (Zheng and Zhou 2008, 1)

    Studies of health disparities also point to the prevalent racial attitudes and misconstrued scientific theories of white physicians as having contributed to the poor health conditions of blacks in the South. While some whites attributed the various ailments among blacks to genetic susceptibility and insensitivity to pain, others claimed that beliefs in voodoo in the black community made blacks distrustful of the science associated with medicine. As one prominent journalist and student of ethics has written recently:

    Blacks have been forced to undergo painful, risky experimental surgery, dosed with radiation and singled out for experiments aimed at finding brain abnormalities linked to violence. They have been falsely assumed to feel less pain than whites and to require higher X-ray doses for a readable film. (Grady 2007, 1)

    In Mississippi and other southern states, there were apparently instances in which black women were sterilized without their knowledge during surgery, a practice allegedly so common that it was known as the "Mississippi appendectomy" (New York Times, www.nytimes.com/2007/01/23/health/23book.html). Of course, the 1932–1972 Tuskegee study of 400 black men with syphilis, purposely left untreated so that the disease's pathogenesis and progression could be observed, rendered similar stories all the more believable, especially within the black community, under conditions that have been described as medical apartheid (Grady 2007). Others were quick to claim that black people would never adhere to hygienic practices, considered even then one of the best defenses against disease morbidity and mortality.

    To understand this situation in the antebellum South, including Mississippi, one must go back to the time of slavery and study whites' perceptions of the condition of black America. Because of distrust and negative racial attitudes, most physicians refused to treat slaves without payment in cash, even though many slaves were used for medical experiments. As a result of this lack of access to treatment, the mortality rate in the black population was high. After the Civil War (1861–1865), for example, had it not been for the Freedmen's Bureau created by the federal government, the mortality rate among blacks would have been even higher: the overall death rate fell from 30 percent in 1865 to 2.03 percent in 1869 (Charatz-Litt 1992, 71). Undoubtedly, during slavery, slave masters were keen on ensuring that their enslaved servants remained in good health so as to be productive. They therefore sought the assistance of physicians of the era to ensure at least minimal care for their slaves.

    This has led some to conclude that blacks were well cared for, overlooking the facts that physicians were rare at the time; that those who practiced medicine avoided treating black people if they could help it; and that no black doctors were available, often forcing the masters themselves to treat their own chattel property. Indeed, one physician of the time wrote, awkwardly blaming the black person as a victim of his or her own lifestyle: "His diet is fatty; he revels in fat; every pore of his sleek, contented face reeks [sic] with unctuousness. To him, the force-producing quality of the fats has the seductive fascination that opium leaves about the Oriental" (quoted by Charatz-Litt
