To Understand Anti-vaxxers, Consider Aristotle
Among the many difficulties imposed upon America by the pandemic, the scourge of anti-vaccine sentiment—and the preventable deaths caused as a result—ranks among the most frustrating, especially for infectious-disease doctors like me.
People who are hospitalized with COVID-19 rarely refuse therapy, but acceptance of vaccines to help prevent infection has been considerably more limited. Seventy percent of Americans have received the initial complement of vaccine injections, but far fewer have received the boosters designed to address viral variants and confer additional protection. Why are so many people resistant to this potentially lifesaving intervention?
Some explanations are unique to our era—the awful weaponization of science in a deeply partisan political environment during the age of social media, for instance. But the concept of vaccine hesitancy is not new. Such hesitancy is, in a larger sense, a rejection of science—a phenomenon that far predates the existence of vaccines.
One of the earliest documented controversies in science denialism comes from the field of astronomy. In the third century B.C., the Greek astronomer Aristarchus of Samos proposed a heliocentric model of the universe. The idea that the Earth and planets might revolve around the sun, rather than the other way around, was shocking at the time, and Aristarchus's theory was quickly rejected in favor of geocentric models such as the one Aristotle had put forth a generation earlier and the one Ptolemy would refine centuries later, both of which placed the Earth at the center of the universe. The fact that Aristotle and Ptolemy remain better known today than Aristarchus shows the force of the rejection. It would be some 2,000 years before the notion was seriously reconsidered.
In the 1530s, the Polish astronomer Nicolaus Copernicus developed his own heliocentric model based on astronomical observations. Copernicus is remembered today primarily for this perspective-changing discovery. But it’s worth noting that he delayed publication of his findings until 1543, the year of his death, perhaps for fear of scorn or religious objections.
In the early 17th century, Galileo Galilei, the Italian astronomer known as the "father of modern astronomy," recognized that explaining the changing positions of the stars and the sun over time required that the Earth revolve around the sun. Galileo fully and publicly supported the Copernican theory of a heliocentric universe, and condemnation from the Vatican was swift and harsh. He was tried by the Inquisition and threatened with excommunication if he did not recant. Rather than incur the wrath of the pope, he finally agreed that he was wrong. He spent the remainder of his life under house arrest. It would be another 180 years before the Church admitted that Galileo was right.
Rejections of scientific advances are found throughout the history of medicine. There have been four great advances in medicine over the past 200 years: anesthesia, antisepsis, antibiotics, and immunization. Not every advance was met with resistance. When the benefits of the advance have been obvious, there has tended to be little hesitation. Anesthesia and its cousin, analgesia, for instance, were rapidly accepted; they relieved pain, and the advantages were readily appreciated.
Antisepsis had a stormier path to public acceptance. In the 19th century, English and Irish physicians recognized that puerperal sepsis (a dangerous infection in a mother after delivery of a baby) was likely a contagious condition that was spread from patient to patient either by the medical staff or the local environment. They suggested that improving hygiene would reduce the high rates of mortality that puerperal sepsis caused. In 1843, Oliver Wendell Holmes Sr., a physician (and one of The Atlantic’s founders), presented a paper to the Boston Society for Medical Improvement titled “The Contagiousness of Puerperal Fever.” Holmes suggested that unwashed hands among the medical and nursing staff were responsible for transmitting puerperal fever. This did not sit well with the establishment. A prestigious Philadelphia obstetrician, Charles D. Meigs, declared Holmes’s findings to be nonsense and suggested that an increased number of cases among any physician was just bad luck.
The physician most frequently credited with establishing the contagious nature of this infection is a Hungarian obstetrician, Ignaz Semmelweis. He noted that patients in the Vienna General Hospital who were cared for by physicians had a higher incidence of postpartum sepsis than those who were cared for by midwives. Semmelweis realized that physicians performed autopsies, whereas midwives did not, and that physicians did not wash their hands or change their clothing before moving from an autopsy to a delivery. (It was routine for them to attend deliveries in their bloodstained clothing, having come directly from the autopsy suite.) When he suggested simple hygiene measures such as handwashing, he was derided and eventually run out of town. The medical establishment was unwilling to accept that physicians—rather than bad air or host weaknesses—were responsible for spreading infections and harming patients.
Science denialism can work in the other direction too. When antibiotics, especially penicillin, were first introduced, they were rightly appreciated as miracle drugs. In the pre-antibiotic era, the leading cause of death among children was infectious diseases. The use of antibiotics was astoundingly successful against many, but not all, childhood diseases. The downside of this enthusiasm came when patients demanded antibiotics for conditions—such as viral illnesses—that the drugs cannot treat. Fifty years ago, telling a patient that they had a virus and that penicillin was therefore of no use led to disappointment, disbelief, and even arguments from patients requesting antibiotics for simple colds. Many doctors gave in because it was simpler than spending time fighting with a patient. A consequence of this indiscriminate use of antibiotics—which represents its own mini-genre of science denialism—has been increased bacterial resistance.
But of the four great advances, none has so broadly helped humanity, or suffered more from science denialism, than immunization. Most, but not all, of the vaccines that scientists have developed since the first immunizations in the 18th century have been developed against viruses. Of all viral infections, the most feared may well have been smallpox. Over the course of the 20th century alone, an estimated 300 million people died of smallpox. Smallpox is highly contagious and spares no age group or class. Its common form has an estimated overall mortality of roughly 30 percent, but the mortality of hemorrhagic smallpox—a more severe form of the disease—approaches 100 percent. The disease's contagiousness is most evident when a previously unexposed population encounters it. Smallpox was unknown in the Americas before European explorers brought cases to the New World, and the disease decimated the Indigenous populations of North America and South America as a result.
The early concept of immunization to prevent smallpox may have begun more than 1,000 years ago, in China. The history is contested, but some documents show that children would be made to inhale material from a ground-up, mature smallpox lesion scraped off the body of the infected—a level of exposure that could trigger a person's immune response to smallpox without causing a full-blown infection. A later technique, which involved scratching the skin of an uninfected individual with material from another person's lesion, was observed by the wife of the English ambassador to Istanbul. She was so impressed that she had her own children immunized, and she brought the procedure back to Europe. Subsequently, an experiment was done in which six prisoners in London were immunized. Despite exposure to smallpox, none of them became ill.
Like many advances in medicine, smallpox immunization was met with some resistance, including worry that immunization might inadvertently spread the disease to others. This was an understandable reaction; the live smallpox virus was used, and a small percentage of inoculated individuals did develop full-blown disease and die. In 1721, there was an outbreak of smallpox in Boston. The writer and clergyman Cotton Mather urged widespread immunization but had only moderate success because of resistance from the local population. (History complicates even the views of those who embrace science: Mather was also an ardent defender of the Salem witch trials.) Years later, a well-known case of immunization resistance occurred in Philadelphia. During an outbreak of smallpox in 1736, Benjamin Franklin’s 4-year-old son, Francis, became infected and died. Francis had not been immunized despite an opportunity to do so, and Franklin said he regretted the decision for the rest of his life.
In the generations that followed, scientists built off of these earlier methods and eventually developed a stable and widely available smallpox vaccine. The global eradication of smallpox as a result remains one of the greatest accomplishments in the history of medicine. The last case of naturally occurring smallpox was reported more than 40 years ago.
Even so, vaccine hesitancy has persisted. In America, new vaccines for other diseases have continued to prompt their own waves of skepticism and hostility. And although science denialism is not as pervasive as it was centuries ago, it still rears its ugly head. The arrival of the COVID-19 vaccines brought pernicious anti-vaccine sentiments into the spotlight. The reasons for this vehemence are many. For instance, some people who might accept the efficacy of a vaccine have such a fear of injections that they simply avoid seeking medical care until absolutely necessary. But this represents a minority of those who reject the vaccines.
A more common—and more insidious—force that pushes people away from lifesaving vaccines appears to be swelling distrust in expertise, which is both a political and cultural phenomenon. Vaccine resistance can be peddled by influential people in both liberal and conservative circles, but throughout the pandemic, right-wing anti-government organizations and television personalities in particular have promoted a stew of outrageous conspiracy theories about vaccines. Run-of-the-mill misinformation remains a problem too. Some people continue to believe that the COVID-19 vaccine will infect them with the virus and make them sick; it will not. Finally, of course, there are concerns about known and unknown side effects of vaccination. Like many vaccines, the COVID shots are linked to serious health effects in extremely rare circumstances; for instance, Moderna's and Pfizer's mRNA shots are associated with a very small risk of heart inflammation. It is virtually impossible to prove that some side effect will never occur. But hundreds of millions of people have safely received the COVID vaccine in the United States alone.
Perhaps the greatest disservice to vaccination has been the fraudulent claim that childhood vaccines cause autism. This claim was originally published in an otherwise respected medical journal in the 1990s and has since been fully retracted. (The author lost his medical license.) Nevertheless, many people still believe it and have put their children at risk for serious illness as a result.
Our advances in science over the past two centuries have truly been extraordinary, but our society still suffers from the forces that reject reason and prevent our ability to take full advantage of discoveries that protect us all. And we need to push back against those who endanger others because they see opportunities for fame or profit in spreading dangerous disinformation. Until that happens, our species will continue to understand the world around us in fits and starts—with too many people dying, even when we know how to save them.