What Is Epidemiology?

Epidemiology is the study of how disease is distributed in populations and the factors that influence or determine this distribution. Why does a disease develop in some people and not in others? The premise underlying epidemiology is that disease, illness, ill health, and excellent health status are not randomly distributed in human populations. Rather, each of us has certain characteristics that predispose us to, or protect us against, a variety of different diseases. These characteristics may be primarily genetic in origin, the result of exposure to certain environmental hazards, or the behaviors (good and bad) that we engage in. Perhaps most often, we are dealing with an interaction of genetic, environmental, and behavioral and social factors in the development of disease.

A broader definition of epidemiology than that given previously has been widely accepted. It defines epidemiology as "the study of the distribution and determinants of health-related states or events in specified populations and the application of this study to control of health problems." What is noteworthy about this definition is that it includes both a description of the content of the discipline and why epidemiologic investigations are carried out.

Objectives of Epidemiology

What are the specific objectives of epidemiology? First, to identify the etiology, or cause, of a disease and its relevant risk factors (i.e., factors that increase a person's risk for a disease). We want to know how the disease is transmitted from one person to another or from a nonhuman reservoir to a human population, or why it arises from risk behaviors the person engages in. Our ultimate aim is to intervene to reduce morbidity and mortality from the disease. We want to develop a rational basis for prevention programs. If we can identify the etiologic or causal factors for disease and reduce or eliminate exposure to those factors, we can develop a basis for prevention programs. In addition, we can develop appropriate vaccines and treatments, which can prevent the transmission of the disease to others.

The second objective of epidemiology is to determine the extent of disease found in the community. What is the burden of disease in the community? This question is critical for planning health services and facilities and for estimating how many future health care providers should be trained.

A third objective is to study the natural history and prognosis of disease. Clearly, certain diseases are more severe than others; some may be rapidly lethal, whereas others may have extended durations of survival. Many diseases are not fatal but may affect quality of life or be associated with disability. We want to define the baseline natural history of a disease in quantitative terms so that, as we develop new modes of intervention, either through treatments or through new ways of preventing complications, we can compare the results of using these new modalities with the baseline data to determine whether our new approaches have truly been effective.

Fourth, we use epidemiology to evaluate both existing and newly developed preventive and therapeutic measures and modes of health care delivery. For example, does screening men for prostate cancer using the prostate-specific antigen (PSA) test improve survival in people found to have prostate cancer?
Has the growth of managed care and other new systems of health care delivery and health care insurance had an impact on the health outcomes of the patients involved and on their quality of life? If so, what has been the nature of this impact and how can it be measured?

Finally, epidemiology can provide the foundation for developing public policy relating to environmental problems, genetic issues, and other social and behavioral considerations regarding disease prevention and health promotion. For example, is the electromagnetic radiation emitted by cell phones, electric blankets and heating pads, and other household appliances a hazard to human health? Are high levels of atmospheric ozone or particulate matter a cause of adverse acute or chronic health effects in human populations? Is radon in homes a significant risk to human beings? Which occupations are associated with increased risks of disease in workers, and what types of regulation are required to reduce these risks?

CHANGING PATTERNS OF COMMUNITY HEALTH PROBLEMS

A major role of epidemiology is to provide clues to changes that take place over time in the health problems presenting in the community. Fig. 1.1 shows a sign in a cemetery in Dudley, England, in 1839. At that time, cholera was the major cause of death in England; the churchyard was so full that no burials of persons who died of cholera would henceforth be permitted. The sign conveys an idea of the importance of cholera in the public's consciousness and in the spectrum of public health problems in the early 19th century. Clearly, cholera is no longer a major problem in the United States today, but in many low-income and war-torn countries of the world it remains a serious threat, with many countries periodically reporting outbreaks of cholera that are characterized by high death rates, often as a result of inadequate or inaccessible medical care.

Let us compare the major causes of death in the United States in 1900 and 2014 (Fig. 1.2). The categories of causes have been color coded as described in the caption for this figure. In 1900 the leading causes of death were pneumonia and influenza, followed by tuberculosis and diarrhea and enteritis. In 2014 the leading causes of death were heart disease, cancer, chronic lower respiratory diseases, and unintentional injuries. What change has occurred? During the 20th century there was a dramatic shift in the causes of death in the United States. In 1900 the three leading causes of death were infectious diseases; however, now we are dealing with chronic diseases that in most situations are not communicable or infectious in origin. Consequently, the kinds of research, intervention, and services we need today differ from those that were needed in the United States in 1900.

The pattern of disease occurrence currently seen in developing countries is often similar to that which was seen in the United States in 1900: infectious diseases remain the leading causes of death. However, as countries become industrialized they increasingly manifest the mortality patterns currently seen in developed countries, with mortality from chronic diseases becoming the major challenge (this is commonly referred to as the "epidemiologic transition"). However, even in industrialized countries, as human immunodeficiency virus (HIV) infection has emerged and the incidence of tuberculosis has increased, infectious diseases are again becoming major public health problems. Table 1.1 shows the 10 leading causes of death in the United States in 2014.
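Given a table of deaths by cause like Table 1.1, the share of deaths accounted for by the leading causes (the observation made in the next paragraph) is simple arithmetic. Here is a minimal sketch in Python; the counts are hypothetical placeholders, chosen only so that the three leading causes reproduce roughly the combined share cited below.

```python
# A minimal sketch of proportionate mortality: each cause's share of all
# deaths. The counts are illustrative placeholders, not the actual figures
# behind Table 1.1; only the approximate 55% combined share of the top
# three causes comes from the text.

deaths_by_cause = {
    "Heart disease": 614_000,
    "Cancer": 592_000,
    "Chronic lower respiratory diseases": 147_000,
    "All other causes": 1_130_000,
}

total_deaths = sum(deaths_by_cause.values())

for cause, deaths in deaths_by_cause.items():
    print(f"{cause}: {deaths / total_deaths:.1%} of all deaths")

# Combined share of the three leading causes
top_three = ["Heart disease", "Cancer", "Chronic lower respiratory diseases"]
combined = sum(deaths_by_cause[c] for c in top_three) / total_deaths
print(f"Top three combined: {combined:.1%}")
```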
The three leading causes (heart disease, cancer, and chronic lower respiratory diseases) account for almost 55% of all deaths, an observation that suggests specific targets for prevention if a significant reduction in mortality is to be achieved.

Another demonstration of changes that have taken place over time is seen in Fig. 1.3, which shows the remaining years of expected life in the United States at birth and at age 65 years for the years 1900, 1950, and 2014, by race and sex. The number of years of life remaining after birth has dramatically increased in all of these groups, with most of the improvement having occurred from 1900 to 1950 and much less having occurred since 1950. If we look at the remaining years of life at age 65 years, very little improvement is seen from 1900 to 2014. What primarily accounts for the increase in remaining years of life at birth are the decreases in infant mortality and in mortality from childhood diseases. In terms of diseases that afflict adults, especially those 65 years and older, we have been much less successful in extending the span of life, and this remains a major challenge.

Epidemiology and Prevention

A major use of epidemiologic evidence is to identify subgroups in the population who are at high risk for disease. Why should we identify such high-risk groups? First, if we can identify these high-risk groups, we can direct preventive efforts, such as screening programs for early disease detection, to populations who may not have been screened before and are most likely to benefit from any interventions that are developed for the disease. In sub-Saharan Africa, targeted HIV counseling and testing of men who are not aware of their status can effectively reduce epidemics if those who test positive are linked to care, started on antiretroviral therapy, and continued in care.

Second, if we can identify such groups, we may be able to identify the specific factors or characteristics that put them at high risk and then try to modify those factors. It is important to keep in mind that such risk factors may be of two types. Characteristics such as age, sex, and race, for example, are not modifiable, although they may permit us to identify high-risk groups. On the other hand, characteristics such as obesity, smoking, diet, sexual practices, and other lifestyle factors may be potentially modifiable and may thus provide an opportunity to develop and introduce new prevention programs aimed at reducing or changing specific exposures or risk factors.

PRIMARY, SECONDARY, AND TERTIARY PREVENTION

In discussing prevention, it is helpful to distinguish among primary, secondary, and tertiary prevention (Table 1.2). Primary prevention denotes an action taken to prevent the development of a disease in a person who is well and does not (yet) have the disease in question. For example, we can immunize a person against certain diseases so that the disease never develops or, if a disease is environmentally induced, we can prevent a person's exposure to the environmental factor involved and thereby prevent the development of the disease. Primary prevention is our ultimate goal. For example, we know that most lung cancers are preventable. If we can help to stop people from ever smoking, we can eliminate 80% to 90% of lung cancer in human beings.
However, although our aim is to prevent diseases from occurring in human populations, for many diseases, such as prostate cancer and Alzheimer disease, we do not yet have the biologic, clinical, or epidemiologic data on which to base effective primary prevention programs.

Secondary prevention involves identifying people in whom a disease process has already begun but who have not yet developed clinical signs and symptoms of the illness. This period in the natural history of a disease is called the preclinical phase of the illness and is discussed in Chapter 18. Once a person develops clinical signs or symptoms, it is generally assumed that under ideal conditions the person will seek and obtain medical advice. Our objective with secondary prevention is to detect the disease earlier than it would have been detected with usual care. By detecting the disease at an early stage in its natural history, often through screening, it is hoped that treatment will be easier and/or more effective. For example, most cases of breast cancer in older women can be detected through mammography. Several recent studies indicate that routine testing of the stool for occult blood can detect treatable colon cancer early in its natural history, although colonoscopy is a better test, albeit far more expensive and invasive. The rationale for secondary prevention is that if we can identify disease earlier in its natural history than would ordinarily occur, intervention measures may be more effective and life prolonged. Perhaps we can prevent mortality or complications of the disease and use less invasive or less costly treatment to do so. Evaluating screening for disease and the place of such intervention in the framework of disease prevention are discussed in Chapter 18.

Tertiary prevention denotes preventing complications in those who have already developed signs and symptoms of an illness and have been diagnosed (i.e., people who are in the clinical phase of their illness). This is generally achieved through prompt and appropriate treatment of the illness combined with ancillary approaches, such as physical therapy, that are designed to prevent complications such as joint contractures.

TWO APPROACHES TO PREVENTION: A DIFFERENT VIEW

Two possible approaches to prevention are a population-based approach and a high-risk approach. In the population-based approach, a preventive measure is widely applied to an entire population. For example, prudent dietary advice for preventing coronary disease or advice against smoking may be provided to an entire population using mass media and other health education approaches. An alternative approach is to target a high-risk group with the preventive measure. Thus screening for cholesterol in children might be restricted to children from high-risk families. Clearly, a measure applied to an entire population must be relatively inexpensive and noninvasive. A measure that is to be applied to a high-risk subgroup of the population may be more expensive and may be more invasive or inconvenient, but it also has to be able to correctly identify individuals with the disease. More on screening tests is discussed in Chapter 18. Population-based approaches can be considered public health approaches, whereas high-risk approaches more often require a clinical action to identify the high-risk group to be targeted. In most situations, a combination of both approaches is ideal. Often a high-risk approach, such as prevention counseling, is limited to brief encounters with physicians.
These approaches are discussed further in Chapter 19.

Epidemiology and Clinical Practice

Epidemiology is critical not only to public health but also to clinical practice. The practice of medicine is dependent on population data. For example, if a physician hears an apical systolic murmur, a heart sound produced when blood flows across the heart valves, how does he or she know whether it represents mitral regurgitation? Where did this knowledge originate? The diagnosis is based on correlation of the clinical findings (such as the auscultatory findings, the sounds heard using a stethoscope) with the findings of surgical pathology or autopsy and with the results of echocardiography, magnetic resonance, or catheterization studies in a large group of patients. Thus the process of diagnosis is population based (see Chapter 5).

The same holds for prognosis. For example, a patient asks his physician, "How long do I have to live, doctor?" and the doctor replies, "Six months to a year." On what basis does the physician prognosticate? He or she does so on the basis of experience with large groups of patients who have had the same disease, were observed at the same stage of disease, and received the same treatment. Again, prognostication is based on population data (see Chapter 6).

Finally, selection of appropriate therapy is also population based. Randomized clinical trials that study the effects of a treatment in large groups of patients are the ideal means (the so-called gold standard) for identifying appropriate therapy (see Chapters 10 and 11). Thus population-based concepts and data underlie the critical processes of clinical practice, including diagnosis, prognostication, and selection of therapy. In effect, the physician applies a population-based probability model to the patient on the examining table.

Fig. 1.4 shows a physician demonstrating that the practice of clinical medicine relies heavily on population concepts. What is portrayed humorously here is a true commentary on one aspect of pediatric practice: a pediatrician often makes a diagnosis based on what the parent tells him or her over the telephone and on what he or she knows about which illnesses, such as viral and bacterial infections, are "going around" in the community. Thus the data available about illness in the community can be very helpful in suggesting a diagnosis, even if they are not conclusive. Data regarding the etiology of sore throats according to a child's age are particularly relevant (Fig. 1.5). If the infection occurs early in life, it is likely to be viral in origin. If it occurs at ages 4 to 7 years, it is likely to be streptococcal in origin. In an older child, Mycoplasma becomes more typical. Although these data do not make the diagnosis, they do provide the physician or other health care provider with a good clue as to what agent or agents to suspect.
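To make this concrete, here is a minimal sketch (in Python) of how such age-specific etiology data might be used as a first clue. The age cutoffs follow the rough pattern described above and are illustrative assumptions, not diagnostic rules.

```python
# Illustrative sketch: map a child's age to the sore-throat agent that is
# most typical for that age group, following the pattern in the text.
# The cutoffs are assumptions for illustration, not clinical guidance.

def likely_sore_throat_agent(age_years: float) -> str:
    if age_years < 4:
        return "viral"          # early in life, most often viral
    elif age_years <= 7:
        return "streptococcal"  # ages 4 to 7, streptococcal more likely
    else:
        return "Mycoplasma"     # older children

for age in (2, 5, 12):
    print(f"Age {age}: most likely {likely_sore_throat_agent(age)}")
```

Such a lookup does not make the diagnosis; like the community data it encodes, it only suggests which agent to suspect first.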
Epidemiologic Approach

How does the epidemiologist proceed to identify the cause of a disease? Epidemiologic reasoning is a multistep process. The first step is to determine whether an association exists between exposure to a factor (e.g., an environmental agent) or a characteristic of a person (e.g., an increased serum cholesterol level) and the presence of the disease in question. We do this by studying the characteristics of groups and the characteristics of individuals. If we find there is indeed an association between an exposure and a disease, is it necessarily a causal relationship? No, not all associations are causal. The second step therefore is to try to derive appropriate inferences about a possible causal relationship from the patterns of the associations that have been found. These steps are discussed in detail in Chapter 14.

Epidemiology often begins with descriptive data. For example, Fig. 1.6 shows rates of gonorrhea in the United States in 2015 by state. Clearly, there are marked regional variations in reported cases of gonorrhea. The first question to ask when we see such differences between two groups or two regions or at two different times is, "Are these differences real?" In other words, are the data from each area of comparable quality? Before we try to interpret the data, we should be satisfied that the data are valid. If the differences are real, then we ask, "Why have these differences occurred?" Are there differences in potential exposures between high-risk and low-risk areas, or are there differences in the people who live in those areas? This is where epidemiology begins its investigation.

Many years ago, it was observed that communities in which the natural level of fluoride in the drinking water varied also differed in the frequency of dental caries in the permanent teeth of residents. Communities that had low natural fluoride levels had high levels of caries, and communities that had higher levels of fluoride in their drinking water had low levels of caries (Fig. 1.7). This finding suggested that fluoride might be an effective preventive intervention if it were artificially added to the drinking water supply. A trial was therefore carried out to test the hypothesis. Although, ideally, we would like to randomize a group of people either to receive fluoride or to receive no fluoride, this was not possible to do with drinking water because each community generally shares a common water supply. Consequently, two similar communities in upstate New York, Kingston and Newburgh, were chosen for the trial. The DMF index, a count of decayed, missing, and filled teeth, was used. Baseline data were collected in both cities, and at the start of the study, the DMF indices were comparable in each age group in the two communities. The water in Newburgh was then fluoridated, and the children were reexamined a decade later. Fig. 1.8 shows that, in each age group, the DMF index in Newburgh had dropped significantly 10 years or so later, whereas in Kingston there was no change. This is strongly suggestive evidence that fluoride was preventing caries.

It was possible to go one step further in trying to demonstrate a causal relationship between fluoride ingestion and low rates of caries. The issue of fluoridating water supplies has been extremely controversial, and in certain communities in which water has been fluoridated, there have been referenda to stop the fluoridation. It was therefore possible to look at the DMF index in communities such as Antigo, Wisconsin, in which fluoride had been added to the water supply and then, after a referendum, fluoridation had been stopped. As seen in Fig. 1.9, after the fluoride was removed, the DMF index rose. This provided yet a further piece of evidence that fluoride acted to prevent dental caries.
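The logic of the Kingston-Newburgh comparison is a before-and-after contrast between an intervention community and a control community. A minimal sketch, with hypothetical DMF values standing in for the actual study data:

```python
# Illustrative before/after comparison for the fluoridation trial.
# The DMF index values are hypothetical placeholders; only the design
# (comparable baselines, Newburgh fluoridated while Kingston was not,
# reexamination about a decade later) follows the text.

baseline_dmf  = {"Newburgh": 7.0, "Kingston": 7.2}
follow_up_dmf = {"Newburgh": 3.5, "Kingston": 7.1}

for city in baseline_dmf:
    change = follow_up_dmf[city] - baseline_dmf[city]
    print(f"{city}: DMF {baseline_dmf[city]} -> {follow_up_dmf[city]} "
          f"(change {change:+.1f})")
```

A large drop in the fluoridated community alongside a flat trend in the comparison community is what makes the evidence strongly suggestive rather than a mere before-and-after anecdote.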
From Observations to Preventive Actions

In this section, three examples from history are discussed that demonstrate how epidemiologic observations have led to effective preventive measures in human populations.

IGNÁZ SEMMELWEIS AND CHILDBED FEVER

Ignáz Semmelweis (Fig. 1.10) was born in 1818 and began as a student of law until he left his studies to pursue medical training. He specialized in obstetrics and became interested in a major clinical and public health problem of the day: childbed fever, also known as puerperal fever (the word "puerperal" means related to childbirth or to the period after the birth). In the early 19th century, childbed fever was a major cause of death among women shortly after childbirth, with mortality rates from childbed fever as high as 25%. Many theories of the cause of childbed fever were popular at the time, including atmospheric toxins, "epidemic constitutions" of some women, putrid air, or solar and magnetic influences. This period was a time of growing interest in pathologic anatomy. Because the cause of childbed fever remained a mystery, great interest arose in associating the findings at autopsies of women who had died of the disease with the clinical manifestations that characterized them while ill after childbirth.

Semmelweis was placed in charge of the First Obstetrical Clinic of the Allgemeine Krankenhaus (General Hospital) in Vienna in July 1846. At that time there were two obstetrical clinics, the First and the Second. Pregnant women were admitted for childbirth to the First Clinic or to the Second Clinic on an alternating 24-hour basis. The First Clinic was staffed by physicians and medical students and the Second Clinic by midwives. Physicians and medical students began their days performing autopsies on women who had died from childbed fever; they then proceeded to provide clinical care for women hospitalized in the First Clinic for childbirth. The midwives staffing the Second Clinic did not perform autopsies. Semmelweis had been impressed by mortality rates in the two clinics in 1842 (Fig. 1.11). Mortality in the First Clinic was more than twice as high as in the Second Clinic: 16% compared with 7%.

Semmelweis surmised that mortality was higher in the First Clinic than in the Second because the physicians and medical students went directly from the autopsies to their patients. Many of the women in labor had multiple examinations by physicians and by medical students learning obstetrics. Often these manual examinations traumatized the tissues of the vagina and uterus. Semmelweis suggested that the hands of physicians and medical students were transmitting disease-causing particles from the cadavers to the women who were about to deliver. His suspicions were confirmed in 1847 when his friend and colleague Jakob Kolletschka died from an infection contracted when he was accidentally punctured with a medical student's knife while performing an autopsy. The autopsy on Kolletschka showed pathology very similar to that of the women who were dying from childbed fever. Semmelweis concluded that physicians and medical students were carrying the infection from the autopsy room to the patients in the First Clinic and that this accounted for the high mortality rates from childbed fever in the First Clinic. Mortality rates in the Second Clinic remained low because the midwives who staffed the Second Clinic had no contact with the autopsy room.

Semmelweis then developed and implemented a policy for the physicians and medical students in the First Clinic, a policy designed to prevent childbed fever. He required the physicians and medical students in the First Clinic to wash their hands and to brush under their fingernails after they had finished the autopsies and before they came in contact with any of the patients.
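The comparison that first caught Semmelweis's attention is a simple contrast of mortality rates. A minimal sketch, assuming hypothetical delivery counts; only the approximate percentages (16% versus 7%) come from the text:

```python
# Illustrative rate contrast for the two obstetrical clinics in 1842.
# Delivery counts are hypothetical; only the approximate mortality
# percentages (about 16% vs. 7%) come from the text.

def mortality_pct(deaths: int, deliveries: int) -> float:
    """Childbed fever deaths as a percentage of deliveries."""
    return 100 * deaths / deliveries

first_clinic  = mortality_pct(deaths=160, deliveries=1_000)  # physicians/students
second_clinic = mortality_pct(deaths=70,  deliveries=1_000)  # midwives

print(f"First Clinic:  {first_clinic:.0f}%")
print(f"Second Clinic: {second_clinic:.0f}%")
print(f"Rate ratio: {first_clinic / second_clinic:.1f}")
```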
As seen in Fig. 1.12, in 1848, mortality in the First Clinic dropped from 12.2% to 2.4%, a rate comparable to that seen in the Second Clinic for the same year. When Semmelweis was later replaced by an obstetrician who did not subscribe to Semmelweis's theories, and who therefore eliminated the policy of required handwashing, mortality rates from childbed fever rose again in the First Clinic, further evidence supporting a causal relationship.

Unfortunately, for many years Semmelweis refused to present his findings at major meetings or to submit written reports of his studies to medical journals. His failure to provide supporting scientific evidence was at least partially responsible for the failure of the medical community to accept his hypothesis of causation of childbed fever and his further proposed intervention of handwashing before examining each patient. Among other factors that fostered resistance to his proposal was the reluctance of physicians to accept the conclusion that, by transmitting the agent responsible for childbed fever, they had been inadvertently responsible for the deaths of large numbers of women. In addition, physicians claimed that washing their hands before seeing each patient would be too time consuming. Another major factor is that Semmelweis was, to say the least, undiplomatic and had alienated many senior figures in medicine. As a consequence of all of these factors, many years passed before a policy of handwashing was broadly adopted. An excellent biography of Semmelweis by Sherwin Nuland was published in 2003. The lessons of this story for successful policy making are still relevant today to the challenge of enhancing both public and professional acceptance of evidence-based prevention policies. These lessons include the need for clearly presenting supporting scientific evidence for a proposed intervention, the need for implementation of the proposed intervention to be perceived as feasible and cost-effective, and the need to lay the necessary groundwork for the policy, including garnering professional as well as community and political support.

Years later, the major cause of childbed fever was recognized to be a streptococcal infection. Semmelweis's major findings and recommendations ultimately had worldwide effects on the practice of medicine. Amazingly, his observations and suggested interventions preceded any knowledge of germ theory and thus proved that it is possible to implement a prevention strategy even when the exact cause of the disease is not known. However, it is also of interest that, although the need for handwashing has now been universally accepted, recent studies have reported that many physicians in hospitals in the United States and in other developed countries still fail to wash their hands.

EDWARD JENNER AND SMALLPOX

Edward Jenner (Fig. 1.13) was born in 1749 and became very interested in the problem of smallpox, which was a worldwide scourge. For example, in the late 18th century, 400,000 people died from smallpox each year and one-third of survivors were blinded as a result of corneal infections. It was known that those who survived smallpox were subsequently immune to the disease, and consequently it became a common preventive practice to infect healthy individuals with smallpox by administering to them material taken from smallpox patients, a procedure called variolation.
However, this was not the optimal method: some variolated individuals died from the resulting smallpox, infected others with smallpox, or developed other infections. Jenner was interested in finding a better, safer approach to preventing smallpox. He observed, as had other people before him, that dairy maids, the young women whose occupation was milking cows, developed a mild disease called cowpox. Later, during smallpox outbreaks, smallpox appeared not to develop in these young women. In 1768 Jenner heard a claim from a dairy maid, "I can't take the smallpox for I have already had the cowpox." These data were observations and were not based on any rigorous study, but Jenner became convinced that cowpox could protect against smallpox and decided to test his hypothesis.

Fig. 1.14 shows a painting by Gaston Melingue of Edward Jenner performing the first vaccination in 1796. (The term "vaccination" is derived from vacca, the Latin word for "cow.") In this painting, a dairy maid, Sarah Nelmes, is bandaging her hand after just having had some cowpox material removed. The cowpox material is being administered by Jenner to an 8-year-old "volunteer," James Phipps. Jenner was so convinced that cowpox would be protective that 6 weeks later, to test his conviction, he inoculated the child with material that had just been taken from a smallpox pustule. The child did not contract the disease. We shall not deal in this chapter with the ethical issues and implications of this experiment. (Clearly, Jenner did not have to justify his study before an institutional review board!) In any event, the results of the first vaccination and of what followed eventually saved literally millions of human beings throughout the world from disability and death caused by the scourge of smallpox. The important point is that Jenner knew nothing about viruses and nothing about the biology of the disease. He operated purely on observational data that provided him with the basis for a preventive intervention.

In 1967 the World Health Organization (WHO) began international efforts to eradicate smallpox using vaccinations with vaccinia virus (cowpox). It has been estimated that, until that time, smallpox afflicted 15 million people annually throughout the world, of whom 2 million died and millions of others were left blind or disfigured. In 1980 the WHO certified that smallpox had been eradicated. The smallpox eradication program, directed at the time by Dr. D.A. Henderson (Fig. 1.15), is one of the greatest disease prevention achievements in human history. The WHO estimated that 350 million new cases had been prevented over a 20-year period. However, after the terrorist attacks that killed nearly 3,000 people in the World Trade Center in New York City on September 11, 2001, worldwide concern developed about potential bioterrorism in the wake of the 2001 anthrax attacks. Ironically, the possibility that smallpox virus might be used for such a purpose reopened issues regarding smallpox and vaccination that many thought had been permanently relegated to history by the successful efforts at eradication of the disease. The magnitude of the smallpox bioterrorism threat, together with issues of vaccinia risk (both to those vaccinated and to those coming in contact with vaccinees, especially in hospital environments), is among the many questions that have had to be addressed.
However, often only limited or equivocal data are available on these issues to guide the development of relevant public health prevention policy relating to a potential bioterrorism threat of using smallpox as a weapon.

JOHN SNOW AND CHOLERA

Another example of the translation of epidemiologic observations into public policy immortalized John Snow, whose portrait is seen in Fig. 1.16. Snow lived in the 19th century and was well known as the anesthesiologist who administered chloroform to Queen Victoria during childbirth. Snow's true love, however, was the epidemiology of cholera, a disease that was a major problem in England in the middle of the 19th century. In the first week of September 1854, approximately 600 people living within a few blocks of the Broad Street pump in London died of cholera. At that time, the Registrar General was William Farr. Snow and Farr had a major disagreement about the cause of cholera. Farr adhered to what was called the miasmatic theory of disease. According to this theory, which was commonly held at the time, disease was transmitted by a miasm, or cloud, that clung low on the surface of the earth. If this were so, we would expect that people who lived at lower altitudes would be at greater risk of contracting a disease transmitted by this cloud than those living at higher elevations. Farr collected data to support his hypothesis (Table 1.4). The data are quite consistent with his hypothesis: the lower the elevation, the higher the mortality rate from cholera.

Snow did not agree; he believed that cholera was transmitted through contaminated water (Fig. 1.17). In London at that time, water was obtained by signing up with one of the water supply companies. The intakes for the water companies were in a very polluted part of the Thames River. At one point in time, one of the companies, the Lambeth Company, for technical, non-health-related reasons, shifted its water intake upstream in the Thames to a less polluted part of the river; the other companies did not move the locations of their water intakes. Snow reasoned, therefore, that based on his hypothesis that contaminated water caused cholera, the mortality rate from cholera would be lower in people getting their water from the Lambeth Company than in those obtaining their water from the other companies. He carried out what we currently call "shoe-leather epidemiology": going from house to house, counting all deaths from cholera in each house, and determining which company supplied water to each house.

Snow's findings are shown in Table 1.5. The table shows the number of houses, the number of deaths from cholera, and the deaths per 10,000 houses. Although deaths per 10,000 houses is not an ideal rate, because a house can contain different numbers of people, it is not a bad approximation. We see that in houses served by the Southwark and Vauxhall Company, which obtained its water from a polluted part of the Thames, the death rate was 315 deaths per 10,000 houses. In homes supplied by the Lambeth Company, which had relocated its water intake upstream, the rate was only 38 deaths per 10,000 houses. His data were so convincing that they led Farr, the Registrar General, to require the registrar of each district in south London to record which water company supplied each house in which a person died of cholera. Snow's demonstration that cholera mortality was associated with contaminated water was based entirely on observational data.
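The house-based rates in Table 1.5 are straightforward to reproduce. In the minimal sketch below, the house and death counts are the figures commonly cited from Snow's study and should be treated as assumptions here, since the text reports only the resulting rates (315 versus 38 per 10,000 houses).

```python
# Illustrative reconstruction of Snow's house-based comparison. The house
# and death counts are commonly cited figures from Snow's study (assumed
# here); the text itself gives only the resulting rates.

def deaths_per_10k_houses(deaths: int, houses: int) -> float:
    return 10_000 * deaths / houses

southwark_vauxhall = deaths_per_10k_houses(deaths=1_263, houses=40_046)
lambeth            = deaths_per_10k_houses(deaths=98,    houses=26_107)

print(f"Southwark and Vauxhall: {southwark_vauxhall:.0f} per 10,000 houses")
print(f"Lambeth:                {lambeth:.0f} per 10,000 houses")
print(f"Rate ratio: {southwark_vauxhall / lambeth:.1f}")
```

The rate ratio of roughly 8 is what made the observational evidence so persuasive despite the crude denominator of houses rather than people.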
The point is that, although it is extremely important for us to maximize our knowledge of the biology and pathogenesis of disease, it is not always necessary to know every detail of the possible pathogenic mechanisms to prevent disease. For example, we know that virtually every case of rheumatic fever and rheumatic heart disease followed a streptococcal infection. Though Streptococcus has been studied and analyzed extensively, we still do not know how and why it causes rheumatic fever. We do know that after a severe streptococcal infection, as seen in military recruits, rheumatic fever does not develop in 97 of every 100 infected persons. In civilian populations, such as schoolchildren, in whom the infection is less severe, rheumatic fever develops in only 3 of every 1,000 infected schoolchildren but not in the remaining 997. Why does the disease not develop in those 97 recruits and 997 schoolchildren if they are exposed to the same organism? We do not know. Is the illness the result of an undetected difference in the organism, or is it caused by a cofactor that may facilitate the adherence of streptococci to epithelial cells? What we do know is that, even without fully understanding the chain of pathogenesis from infection with Streptococcus to rheumatic fever, we can prevent virtually every case of rheumatic fever if we either prevent or promptly and adequately treat streptococcal infections, as has been the case in the United States. The absence of biologic knowledge about pathogenesis should not be a hindrance or an excuse for not implementing effective preventive services.
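The attack rates just cited can be put side by side to show how sharply the risk differs by infection severity. A minimal sketch using only the figures given in the text:

```python
# Attack rates cited in the text: rheumatic fever follows severe
# streptococcal infection in about 3 of every 100 military recruits,
# but in only about 3 of every 1,000 infected schoolchildren.

recruit_attack_rate     = 3 / 100
schoolchild_attack_rate = 3 / 1_000

print(f"Recruits:       {recruit_attack_rate:.1%}")
print(f"Schoolchildren: {schoolchild_attack_rate:.1%}")
ratio = recruit_attack_rate / schoolchild_attack_rate
print(f"Risk ratio (recruits vs. schoolchildren): {ratio:.0f}")
```

The tenfold difference in risk, despite exposure to the same organism, is precisely the unexplained gap the text describes.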
Consider cigarette smoking and lung cancer. We do not know what specific carcinogenic agents in cigarettes cause cancer, but we do know that more than 80% of cases of lung cancer are caused by smoking. That does not mean that we should not be conducting laboratory research to better understand how cigarettes cause cancer. But again, in parallel with that research, we should be mounting effective community and public health programs to discourage smoking based on available observational data.

Fig. 1.18 shows mortality data for breast cancer and lung cancer in women in the United States. Breast cancer mortality rates remained relatively constant over several decades but showed evidence of decline in the early years of the 21st century. However, mortality from lung cancer in women has been increasing steadily, although it may have begun to stabilize, and even decrease slightly, in recent years. Since 1987, more women in the United States have died each year from lung cancer than from breast cancer. Thus we are faced with the tragic picture of a largely preventable form of cancer, lung cancer, which results from a personal habit, smoking, as the current leading cause of cancer death in American women. Furthermore, in 1993, environmental tobacco smoke (secondhand smoke from other people's smoking) was classified as a known human carcinogen by the Environmental Protection Agency, which attributed about 3,000 lung cancer deaths in nonsmoking individuals each year to environmental tobacco smoke.

When the Frequency of a Disease Declines, Who Deserves the Credit?

Over the past hundred or so years, mortality rates from a number of common infectious diseases have declined in the United States. For example, deaths from childhood infections such as diphtheria, pertussis (whooping cough), and scarlet fever (a streptococcal infection) have declined dramatically. In addition, US deaths from tuberculosis have dropped significantly.

It would be tempting to link these declines to improvements in treatments or vaccines that became available for these diseases during this time. However, in 1971 Edward Kass published the graphs shown in Fig. 1.19. These graphs demonstrate that for each of these diseases, the major decline in mortality occurred many years before any effective treatment or vaccine became available. Fig. 1.20 shows a similar presentation of mortality trends over time for rheumatic fever in the 20th century. Clearly, most of the decline in rheumatic fever mortality occurred well before penicillin and other antistreptococcal treatments became available.

What can explain these dramatic declines even before any vaccine or treatment became available? Theoretically, it is possible that when we observe a decline in mortality from an infectious disease, human exposure to the organisms involved may have declined, or the virulence of the organism may have diminished. However, a more likely explanation for the decline in mortality in these and other examples is that it primarily resulted from improvements in social conditions, safer water, and reduced exposures to pollutants, and was not related to any medical intervention. In fact, Kass titled his 1971 paper, in which the graphs in Fig. 1.19 appeared, "Infectious Diseases and Social Change." Although the specific factors that were probably involved are not always clear, improved housing, including sanitation and improved nutrition, in addition to simultaneous lifestyle changes, are major factors that are likely to have contributed significantly to the decline.

We are often eager to attribute temporal declines in mortality to medical interventions. However, the lesson illustrated by the examples in these graphs is that we should be cautious before we conclude that observed declines in mortality are a result of medical intervention. In view of difficulties in deriving inferences about the effectiveness of medical care solely from population-wide declines in mortality, rigorous epidemiologic studies are clearly essential to assess the effectiveness of different medical interventions. Some of the approaches used and the design of such studies for evaluating health services are discussed in Chapter 17.

Integrating Prevention and Treatment

Prevention and therapy all too often are viewed as mutually exclusive activities, as is shown in Fig. 1.21. However, prevention is integral not only to public health but also to clinical practice. The physician's role is to maintain health as well as to treat disease, but even treatment of disease includes a major component of prevention. Whenever we treat illness, we are preventing death, preventing complications in the patient, or preventing the impact on the patient's family. Thus much of the dichotomy between therapy and prevention is an illusion. Therapy involves secondary and tertiary prevention, the latter denoting the prevention of complications such as disability. At times it also involves primary prevention. Thus the entire spectrum of prevention should be viewed as integral to both public health and clinical practice. Two very different decisions in 2012 placed further emphasis on the link between prevention and treatment.
In July 2012 the US Food and Drug Administration (FDA) approved the use of a drug, Truvada (a combination of the antiviral medications tenofovir and emtricitabine; Gilead Sciences, Foster City, CA, United States), for preventing HIV infection in people who are at high risk of acquiring HIV infection (so-called preexposure prophylaxis [PrEP]). Since 2004 the drug had been marketed only for treatment of individuals already infected with HIV, both for those chronically infected and for those exposed through a needle-stick or other traumatic risk (so-called postexposure prophylaxis [PEP]).

The second decision, which was announced in May 2012, was that a 5-year clinical trial for preventing a genetically determined form of Alzheimer disease would be conducted by the National Institutes of Health. Investigators will study 300 people who are cognitively normal but are at very high risk for developing Alzheimer disease. The study was initiated in 2013 and is expected to be completed in 2020. Most of the study participants will be from a large family in Medellin, Colombia, which is at high risk for a genetically determined form of Alzheimer disease, characterized by early onset of cognitive impairment followed by full dementia at approximately age 53. The drug being studied, crenezumab (antibodies against two types of human beta amyloid; Genentech, South San Francisco, CA, United States), is currently being evaluated in two other clinical trials in people who already have mild to moderate dementia, to determine whether amyloid accumulation or cognitive decline can be slowed.

Thus both in the study of HIV discussed in the previous paragraph and in this study of Alzheimer disease, drugs that have been used for patients with clear diagnoses of the diseases in question are now being evaluated as drugs that could prevent these diseases in high-risk patients. Both studies emphasize the need to bridge treatment and prevention in our developing views of other diseases as well.