When I first started to get interested in public health several years ago, I thought of it mostly as dealing with things like vaccines and handwashing. From one of my friends who enrolled in a Master of Public Health program, I learned that it actually covers a whole range of issues that affect the population’s health and quality of life – things like workplace and highway safety and smoking cessation, in addition to control of infectious diseases.
The word “population” is key to understanding public health. Healthcare providers focus on individual patients; public health workers focus on entire populations. Of course, many healthcare providers participate in public health work, and when they do things like administer vaccinations they’re helping promote the health of the population as well as of the individual patient.
To get a sense of public health’s scope and impact, it’s helpful to check out the Centers for Disease Control and Prevention’s Ten Great Public Health Achievements in the 20th Century. The accompanying articles on these achievements are specific to the US, but many of these improvements happened simultaneously in other countries that were able to devote sufficient resources to public health.
Ten Great Public Health Achievements in the 20th Century

1. Vaccination

At the start of the 20th century, people routinely died from infectious diseases like measles, polio, smallpox, and diphtheria. The combination of vaccine development and policies promoting childhood immunization has brought the toll of these diseases and others close to zero. For instance, a 2008 Morbidity & Mortality Weekly Report piece on measles explains:
Before introduction of measles vaccination in 1963, approximately 3 to 4 million persons had measles annually in the United States; approximately 400–500 died, 48,000 were hospitalized, and 1,000 developed chronic disability from measles encephalitis.
Even after elimination of endemic transmission in 2000, imported measles has continued to create a substantial U.S. public health burden; of the 501 measles cases reported during 2000–2007, one in four patients was hospitalized, and one in 250 died. In 2008, CDC reported an increase in measles activity, and noted that many of the cases in children occurred in those whose parents claimed exemption from vaccination requirements.
Each state requires children to be immunized before starting school. Eleven vaccines are now recommended for use in all US children by the Advisory Committee on Immunization Practices, and the Vaccines for Children program provides these vaccines at no cost to children who might not otherwise be vaccinated due to their parents’ inability to pay.
2. Motor-Vehicle Safety
Driving was far less common at the start of the 20th century, but as more vehicles started traveling more miles we saw an increase in deaths and injuries from motor-vehicle crashes. The National Highway Safety Bureau, now the National Highway Traffic Safety Administration, was created in 1966, and its first director, Dr. William Haddon, took a public health approach to reducing deaths and injuries from crashes.
Design changes have helped prevent crashes and make those that do occur less lethal. Changes to roads have included better lighting, clearer line stripes, guardrails, and barriers between lanes of highway traffic moving in different directions. Vehicles now have safety belts, shatter-resistant windshields, and airbags.
The establishment and enforcement of traffic laws, which cover problems from excessive speed to Driving While Intoxicated, has also helped reduce motor-vehicle injuries, and public education campaigns have enhanced their effectiveness. In 1999, CDC reported that crash-related fatalities involving alcohol, which account for nearly 40% of all traffic deaths, had decreased 39% since 1982.
Still, motor-vehicle crashes remain the leading cause of death for persons aged 1-24, and the leading cause of injury-related deaths for the US population as a whole. Distracted driving is on the rise as more people use wireless devices while driving. We’ll probably need even more prevention efforts in the coming decades.
3. Workplace Safety
At the start of the 20th century, it was hard to even know how many workers were dying or becoming ill from workplace hazards. One of the earliest data points is a survey of workplace fatalities in Allegheny County, Pennsylvania from July 1906 – June 1907; that survey reported that 526 workers *in that one county* died from “work accidents” in a single year. The 1907 explosion at a Monongah, West Virginia coal mine killed between 362 and 550 workers, and in 1911 the Triangle Shirtwaist Factory fire killed 146, many of whom leapt to their death from building windows because they had been locked in.
Efforts by advocacy groups, researchers, labor, management, state and federal labor and health agencies, and others all helped improve workplace safety, but the country saw major changes in the 1970s. In 1970, the Occupational Safety and Health Act created the Occupational Safety and Health Administration (OSHA) and National Institute for Occupational Safety and Health (NIOSH). OSHA is responsible for enforcing the OSH Act, which requires employers to provide workplaces free from recognized hazards likely to cause death or serious physical harm, and NIOSH conducts research and disseminates information on preventing workplace injuries and illnesses. Mine safety and health is handled separately by the Mine Safety and Health Administration (MSHA). OSHA and MSHA both set health and safety standards and conduct workplace inspections.
In 2008, there were 5,214 fatal work injuries in the US. In 1999, CDC estimated, “If today’s workforce of approximately 130 million had the same risk as workers in 1933 for dying from injuries, then an additional 40,000 workers would have died in 1997 from preventable events.” That’s a big improvement, but we still have a long way to go. An estimated 49,000 deaths each year are attributed to workplace illnesses, and thousands of workers sustain injuries and illnesses that cause pain and disability. There are still lots of things we can do to make workplaces safer – for instance, increasing the number of workplace inspectors and making it easier for OSHA to issue standards and meaningful fines. (Browse through our Occupational Health & Safety archives for specifics on problems and solutions.)
4. Control of Infectious Diseases
In addition to immunizations, other measures have also dramatically improved the control of infectious diseases. CDC cites improvements in sanitation and hygiene and the discovery of antibiotics as major steps toward infectious-disease control. Penicillin was first produced in substantial quantities for medical use in the 1940s, and it and other antimicrobials have saved many lives. Here’s MMWR’s overview of sanitation and hygiene improvements:
By 1900, 40 of the 45 states had established health departments. The first county health departments were established in 1908 (6). From the 1930s through the 1950s, state and local health departments made substantial progress in disease prevention activities, including sewage disposal, water treatment, food safety, organized solid waste disposal, and public education about hygienic practices (e.g., food handling and handwashing). Chlorination and other treatments of drinking water began in the early 1900s and became widespread public health practices, further decreasing the incidence of waterborne diseases. The incidence of TB also declined as improvements in housing reduced crowding and TB-control programs were initiated. In 1900, 194 of every 100,000 U.S. residents died from TB; most were residents of urban areas. In 1940 (before the introduction of antibiotic therapy), TB remained a leading cause of death, but the crude death rate had decreased to 46 per 100,000 persons (7).
Advancements in detecting and monitoring diseases have also helped; these include serologic testing, tissue cultures, and molecular techniques for determining whether a sample is infected, as well as electronic communications that make it easier to share information about outbreaks and track disease patterns.
The AIDS pandemic shows how far we still have to go in infectious disease control, and antibiotic-resistant bacteria pose new threats.
5. Declines in Deaths from Heart Disease and Stroke
Heart disease has been the leading cause of US deaths since 1921, and stroke has been the third leading cause since 1938 (cancer is currently #2); the good news is that age-adjusted death rates from both diseases have declined. Advances in diagnosis and treatment of these diseases have been important; these include more emergency services for heart attacks and strokes as well as the development of medications for treating high blood pressure and high cholesterol.
There’s also good news on the prevention side. MMWR reports that blood-pressure and blood-cholesterol levels have both decreased, and those drops are likely due (at least in part) to declines in smoking and in the consumption of saturated fat and cholesterol. But a century ago, we didn’t even know these risk factors needed to be controlled. Landmark epidemiologic studies, including the Framingham Heart Study and work by Ancel Keys, identified these and other risk factors. Clinical trials demonstrated the efficacy of antihypertensive and lipid-lowering drugs, and national programs educated providers and the general public about the role of lifestyle factors.
One sobering finding is that heart-disease and stroke mortality have not declined equally among all segments of the population, and some risk factors have stopped improving or even worsened. Approximately 70% of people with hypertension don’t have their condition under adequate control, and the prevalence of obesity is increasing. These issues remain to be addressed in the 21st century.
6. Safer and Healthier Foods
Early in the 20th century, contaminated food and water frequently caused typhoid fever and other foodborne diseases, but improvements in sanitation, refrigeration, and pasteurization made the food supply far safer. In 1900, the incidence of typhoid fever was roughly 100 per 100,000 population; by 1950, it had dropped to just 1.7. Thanks in part to Upton Sinclair’s muckraking novel The Jungle, which exposed horrific conditions in the meatpacking industry, the Pure Food and Drug Act was passed in 1906.
During the 20th century, researchers also discovered that diseases like rickets, scurvy, beriberi, and pellagra were caused by nutritional deficiencies. In the following decades, government agencies issued dietary recommendations, developed food relief and commodity distribution programs (including school feeding), and required food enrichment. Adding vitamin D to milk and niacin to flour has helped make rickets and pellagra virtually unheard of in the US today. More recently, enrichment of grain and cereal products with folic acid has reduced the incidence of neural tube birth defects.
Of course, recent outbreaks of foodborne disease show that we still have work to do on food safety, and an increasing obesity rate demonstrates that many of us still aren’t eating optimally nutritious diets.
7. Healthier Mothers and Babies
MMWR lays out the statistics on death during childbirth and infancy:
At the beginning of the 20th century, for every 1000 live births, six to nine women in the United States died of pregnancy-related complications, and approximately 100 infants died before age 1 year (1,2). From 1915 through 1997, the infant mortality rate declined greater than 90% to 7.2 per 1000 live births, and from 1900 through 1997, the maternal mortality rate declined almost 99% to less than 0.1 reported death per 1000 live births (7.7 deaths per 100,000 live births in 1997) (3).
Poor aseptic practices and excessive operative deliveries contributed to high maternal mortality during the first decades of the 20th century; a shift from home to hospital births, along with medical advances (antibiotics, safe blood transfusions, etc.), helped more mothers survive.
The same improvements in water, sanitation, and medicine that have benefited the population as a whole have helped infant mortality rates fall. An emphasis on prenatal care, advocated by the Children’s Bureau in the early part of the century, has also been important. After Medicaid and other federal programs were implemented, infant mortality declined substantially; today, Medicaid pays for more than 40% of US births. Medical advancements have also improved survival rates among low-birth-weight babies, although low birth weight remains a major concern.
The US still has higher maternal and infant mortality rates than many other industrialized countries, and rates are higher for black women and infants than for their white counterparts.
8. Family Planning
When people can choose how many children to have and when, they tend to have fewer children and allow for longer intervals between births. These changes contribute to better health for infants, children, and women. In 1900, the average US family had 3.5 children; the number dropped to 2.3 children in 1933, then jumped to 3.7 during the baby boom. Since 1972, US average family size has remained at approximately two children.
Public health nurse Margaret Sanger began the modern birth-control movement in 1912, and by the 1930s a few state health departments and public hospitals were providing family planning services. In 1960, the birth control pill and intrauterine device became available; by 1965, the pill was the most popular birth control method, followed by the condom. Congress passed the Family Planning Services and Population Research Act in 1970 and authorized Medicaid funding for family planning services in 1972. The Supreme Court also played a role in family planning, ruling it unconstitutional for states to prohibit married couples’ use of contraception (Griswold v. Connecticut, 1965) or to ban abortions (Roe v. Wade, 1973). Publicly subsidized family planning clinics serve an estimated 6.6 million women and prevent an estimated 1.3 million pregnancies annually.
Unintended pregnancy remains a problem in the US; even today, 49% of pregnancies are unintended, and more than half of these end in abortion. Political battles continue to erupt over federal support for family planning services.
9. Fluoridation of Drinking Water
Dental caries (which most of us refer to as cavities) can lead to incapacitating pain and severe infections, and extensive caries were common throughout most of the US in the early 20th century. Dentist Frederick McKay, who established a practice in Colorado Springs, noticed that many of his patients had stained teeth but seemed less susceptible to caries, and he concluded that something in the public water supply was probably responsible. After aluminum-company chemist HV Churchill identified high concentrations of fluoride in an Arkansas well where another dentist had noted mottled enamel on children’s teeth, McKay sent Churchill a water sample, and it, too, turned out to contain high levels of fluoride. Subsequent research by other investigators confirmed that higher fluoride levels correlated with a lower prevalence of caries and identified an optimum range of fluoride concentration (0.7-1.2 ppm). Fluoridation of public water supplies was rapidly adopted, and dental caries dropped dramatically during the second half of the 20th century: for 12-year-old US residents, the mean number of decayed, missing, or filled teeth fell from 4.0 in 1966-1970 to 1.3 in 1988-1994.
Not all communities have fluoridated water, and the rate at which fluoridation is spreading has slowed markedly. It’s also important to note that while fluoridation has helped reduce the incidence of dental caries population-wide, adequate dental care is still essential for oral health – but not everyone practices sufficient dental hygiene or has access to dental professionals. In 2007, 12-year-old Deamonte Driver of Maryland died from a brain infection that originated in one of his teeth.
10. Tobacco as a Health Hazard
In the US, smoking is the leading preventable cause of disease, disability, and death; it’s responsible for an estimated 443,000 premature deaths each year. After rising for much of the 20th century, smoking has declined since the early 1960s: annual per capita cigarette consumption was 4,345 cigarettes in 1963 and 2,261 in 1998. Between 1964 and 1992, approximately 1.6 million smoking-caused deaths were prevented.
Based on 7,000 articles on the relationship between smoking and disease, the Advisory Committee to the US Surgeon General “concluded that cigarette smoking is a cause of lung and laryngeal cancer in men, a probable cause of lung cancer in women, and the most important cause of chronic bronchitis in both sexes.” Smoking and Health: Report of the Advisory Committee to the Surgeon General was published in 1964, and over the course of the next decade it became common knowledge that smoking harms our health.
In the years since the report was issued, both state and federal governments have adopted a range of policies designed to reduce smoking: requiring warning labels on cigarette packaging, limiting tobacco advertising, taxing tobacco products, prohibiting the sale of tobacco products to children, and restricting smoking in public places. Public education campaigns have also been conducted, some by the American Legacy Foundation, which was established as a result of the 1998 Master Settlement Agreement between tobacco companies and the attorneys general of 46 states.
In 2008, CDC reported that 20.6% of US adults smoke, and smoking prevalence is higher among those who have less education and live below the federal poverty level. Meanwhile, tobacco companies are aggressively marketing their products in countries with fewer limits on sales and advertising, and rates of smoking are on the rise in the developing world.
During the 21st century, we hope more countries will be able to make these kinds of strides in public health, and that countries like the US will be able to preserve and exceed our substantial public-health gains. Ninety years from now, what will we count as the top public-health achievements of this century?