March 18, 2022
Mike Magee
The title of today’s lecture is “The History of Epidemics in America.” In planning this presentation, I was acutely aware that the challenge was to avoid an endless listing of repugnant and deadly microbes lined up with dates, mortality figures, and various elements of human mayhem. To do so would not give the topic its due, would be unduly distressing and ultimately of limited interest to most.
This is because each epidemic occurs in a specific time and place, engenders a human response that is both individual and societal, creates tension and drama, uncovers structural vulnerabilities, and often unleashes scapegoating and recrimination. In short, epidemics are social, political, philosophical, medical and above all ecological. They are also narratives – with a beginning, middle and end, and a range of heroes and villains, both human and microbial.
Over the next hour or so, we’ll meet many of these characters, visit their stories, reveal their unique histories, and draw a range of lessons – often unheeded by future generations.
Let’s begin here. In the late 19th century, orange growers in California reached industrial scale. But in 1888, little white cushions began to appear on the branches of their trees. Within them were tiny, sap-sucking insects that multiplied with alarming speed and damaged the trees’ production of fruit. The responsible “scale insect”, it was found, was a foreign invader that had arrived that year on a frigate from Australia.
In Australia, the scale insect fed primarily on the native acacia trees. Orange trees were infested as well, but rarely damaged. This was because the insect’s numbers were naturally controlled by a local orange beetle with black spots called the ladybird beetle. The Nobel Laureate and ecological thinker, Australian Macfarlane Burnet, explained in 1972 that “If the scale insect is particularly plentiful, the ladybird larvae find an abundant food supply, and the beetles in turn become more plentiful. An excessive number of ladybirds will so diminish the population of scale insects that there will be insufficient food for the next generation, and therefore fewer ladybirds.” In Australia, the scale insect and ladybird beetle were in perfect balance most of the time. A truce of ecological sorts had been established.
But in California, there were no ladybird beetles. And so the agricultural leaders in 1889 imported 514 beetles. Prodigious multipliers, they numbered 10,555 by the next year and were distributed to 208 growers. Once the beetles reached adequate numbers in the orchards, the scale insect “was reduced in importance to a relatively trivial pest.”
Simple, right? Well not exactly. As Burnet explained, “The mutual adjustment is an immensely complicated process, for all the food chains concerned are naturally interwoven, and for every species there will be fluctuations in numbers from time to time, but on the whole, in a constant environment a reasonable approach to a stable balance will be maintained.”
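For those who prefer to see the mechanism written down, Burnet’s description corresponds to the textbook Lotka–Volterra predator–prey model – a standard simplification, not his own notation, and only a sketch of the “immensely complicated” reality he cautions about. Letting S stand for the density of scale insects and P for the density of ladybird beetles:

\[
\frac{dS}{dt} = aS - bSP, \qquad \frac{dP}{dt} = c\,bSP - dP
\]

Here a is the prey’s growth rate, b the rate at which beetles consume scale insects, c the efficiency with which consumed prey become new beetles, and d the beetles’ death rate. More prey feeds more predators; more predators suppress the prey; and the two populations cycle around a balance point rather than either exploding or vanishing – the ecological truce just described.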
For predators of any shape or size (and that includes bacteria and viruses), “there is less opportunity for enemies…of restricted prey to thrive at their expense.” In the present case, vaccination, masking, and distancing, in effect, restrict us as potential prey to COVID-19.
Another point. Our ecologists remind us that “Most parasites are restricted to one host species (for their nutrition)…and the main problem that a parasitic species has to solve, if it is to survive, is to manage the transfer of its offspring from one individual host to another.” That often requires intermediate hosts “whose movement or activities will help the transfer to fresh, final hosts…an increased density of the susceptible population will facilitate its spread.”
Some of these hosts appear quite innocent. Consider the black “ship” rat that was domesticated in the 15th century. The fact that it carried the deadly plague-causing bacteria, Yersinia pestis, went unappreciated. The black rats died in droves preceding waves of human catastrophe, but no one at the time made a causal association. Rather, the ship rodents’ deaths were met with casual observation, as documented in Albert Camus’s The Plague:
“When leaving his surgery on the morning of April 16, Dr. Bernard Rieux felt something soft under his foot. It was a dead rat lying in the middle of the landing. On the spur of the moment he kicked it to one side and, without giving it a further thought, continued on his way downstairs.”
As historian Charles Rosenberg observed, “most communities are slow to accept and acknowledge an epidemic. To some extent it is a failure of imagination; perhaps even more it is a threat to interests, to specific economic and institutional interests and, more generally, to the emotional assurance and complacency of ordinary men and women.”
To cite a modern example, a certain percentage of fully vaccinated and boosted individuals can still be infected by the Omicron variant of Covid and become asymptomatic carriers and spreaders, especially if they enter dense gatherings where they mix in crowds with unvaccinated and unmasked persons.
In the eyes of an ecologist, all living organisms are survivalists, and there is little difference (except in size) between a parasitic microorganism and a large predatory carnivore. They all need nourishment. As our experts write, whether the bite comes from inside or out, “It is just another method of obtaining food from the tissues of living animals.”
Our current nemesis, the COVID-19 coronavirus, is an organism that is “smaller and less highly differentiated than its host…and gains its nourishment at the expense of the host’s living substances.”
Until late in the 15th century, the Americas were virgin territory when it came to widespread epidemics. Not so, of course, for Eurasia, which suffered three great waves of plague, carried by fleas embedded in the fur of black ship rats sometimes domesticated as pets. Humans, rats, fleas and plague bacteria traveled together along Mediterranean trade routes. Other bacteria and viruses were chronically embedded in a range of European domestic farm animals including horses, cows, pigs and goats.
We’ll come back to that story in a moment. But for now, let’s focus on a different trade route, famously termed the “Columbian Exchange” by University of Texas historian Alfred Crosby in 1972. European monarchs in the 15th century supported oceanic exploration as an extension of their power bases. New trade routes and territories carried the promise of the “three G’s” – gold, glory and God. The monarchs allied with merchants and explorers, and the Catholic Church willingly opened its coffers, seeing the potential to spread Christianity to new lands.
Technologic advances like the astrolabe, the magnetic compass and seaworthy vessels made these ventures feasible, if still dangerous. The death and destruction of the indigenous people followed close behind the explorers’ arrival. As Alfred Crosby documented, “Indigenous peoples suffered from white brutality, alcoholism, the killing and driving off of game, and the expropriation of farmland, but all these together are insufficient to explain the degree of defeat. The crucial factor was not people, plants or animals, but germs.”
Consider Columbus’s arrival on the island of Hispaniola (now the Dominican Republic and Haiti) in 1492. Documents suggest he was greeted peacefully by the native Taino tribe, which numbered some 60,000. By 1548, the numbers had plummeted, with fewer than 500 of the indigenous tribe surviving. What had happened?
The arrival of Columbus and others, and their subsequent movement back and forth between the Old World and the New World led to an unprecedented exchange of plants, manufactured goods and raw materials, tools and technologies, ideas, and microbes.
In the pursuit of wealth, traders and merchants, with financial inducements from their governments, clear-cut the land and developed large plantation farming of cash crops like sugar, tobacco and wheat for export. These crops demanded huge workforces for planting and harvesting under brutal and dangerous conditions. The plan initially was to enslave the natives they encountered and maintain a system of forced labor. To assist the effort, they also imported large numbers of domesticated animals from Europe including horses, cows, pigs, goats and sheep. At the time, the only large domesticated animals in the Americas were llamas and alpacas.
But the animals carried with them a wide range of infectious diseases including smallpox, chickenpox, measles, mumps and typhus. Over hundreds of years, the Europeans had developed immunities to these diseases. But the native Americans were immunologically naïve. By some estimates, 90% of the indigenous population in South and North America perished. Beyond the human tragedy, their demise created an enormous shortage of labor on the plantations. The solution chosen by the English, Spanish, Portuguese and French conquerors was to begin large scale importation of African slaves.
As journalist Charles Mann outlined in his book “1493: Uncovering the New World Columbus Created,” “The scale of the trade was staggering. Between 1492, when Columbus landed, and the early 1800s, more than 2 out of every 3 people who came to the Americas were enslaved Africans. At the time, this human wing of the Columbian Exchange was the biggest migration in history.” Over 10 million enslaved Africans overall were transported to the Americas, with an additional 1.5 million dying in transit.
Warrior conquests and disease have been frequent traveling companions throughout history. In fact, microbes have repeatedly altered the course of human history. In 1789, a Caribbean colony, Saint-Domingue, on the western side of the island of Hispaniola, was fast becoming an endless source of wealth for its French conquerors. The eastern portion of the island, present-day Dominican Republic, had been awarded to the Spanish. But French-controlled Saint-Domingue (the future Haiti) by then possessed eight thousand plantations producing sugar, coffee, cotton, tobacco, indigo and cacao.
The vast majority of the enslaved natives had already died from progressive waves of diseases arriving on the backs of European farm animals. Replacing them were huge numbers of African slaves. The year 1764 marked the importation of more than 10,000 African men, women and children. By 1787, this had grown to 40,000 a year. By July 14, 1789, when the French Revolution commenced with the fall of the Bastille, the tiny colony was home to nearly 500,000 slaves, compared to 700,000 in the whole of the United States. The brutal conditions ensured that the average lifespan after arrival was approximately 5 years, requiring a constant resupply of forced human labor. This also ensured that the slaves never accepted their new lives, with constant uprisings, and deaths regularly exceeding births in the enslaved population.
The contrasts were stark. Historian Frank Snowden describes it this way: “It was said that an acre of land on a Saint-Domingue plantation yielded more wealth than an acre anywhere else on earth. At the same time, the same area enclosed what many regarded as the highest concentration of human misery…Men and women who had recently arrived in chains did not regard slavery as natural or permanent.” To create the plantations, forests had to be cleared, which in turn destroyed the natural habitat of insect-devouring birds. The deforestation also created marshes, collections of still water and mounting silt, all of which supported explosive growth of insect populations – most notably the Aedes aegypti mosquito.
This mosquito is the vector of choice for the Flavivirus that causes Yellow Fever. A competitor, the Anopheles mosquito, transmits the protozoan Plasmodium that causes malaria. African slaves who had survived these endemic diseases in their homelands often arrived as carriers, with the Flavivirus and Plasmodium in their blood.
By the end of the 18th century, revolution was in full swing in the future Haiti, led by the dynamic Toussaint Louverture, known popularly as “The Black Spartacus.” Out of the ashes of the French Revolution, Napoleon rose to power in a military takeover of the country. While waging wars and skirmishes on his own continent, he watched the colonies closely, both as a source of fabulous wealth and as the anchor of France’s significant holdings in North America, which included 828,000 square miles containing what would become the middle third of the United States. He saw in Louverture a direct competitor, and also a potential massive regional destabilizer. If slavery went down in Saint-Domingue, France’s slave colonies in Guadeloupe, Martinique, Reunion, and Guiana would also likely collapse.
With British and U.S. endorsement, Napoleon took action in 1801, deploying 30,000 soldiers and an armada of 65 ships to the region, led by his brother-in-law, General Charles Leclerc. Their arrival in February was met, at the direction of Louverture, by surprising non-engagement. The “Black Spartacus” was aware that his African former slaves were largely immune to Malaria and Yellow Fever, and that the Europeans were not. He therefore decided to wait for the summer insect infestation to decimate the French troops. The epidemic that engulfed them eventually killed over 50,000 soldiers, 90% of them from Yellow Fever, including Leclerc himself on November 2, 1802.
The bad news didn’t stop there. Napoleon had harbored an interest in establishing a powerful French empire in the Americas. The loss of Saint-Domingue eliminated the forward base for staging additional North American conquests. In his eyes, these faraway lands were now “indefensible liabilities.” In 1803, he sold it all – all 828,000 square miles – for $15 million, or less than three cents an acre. That land, which would eventually be segmented into 15 new states, was largely the property of Native Americans. A study in the Journal of American History calculated the eventual real cost at $2.3 billion. It included “222 Indian cessions within the Louisiana Territory including treaties, agreements, and statutes between 1804 and 1970, …covering 576 million acres, ranging from a Quapaw tract the size of North Carolina sold in 1818 to a parcel smaller than Central Park seized from the Santee Sioux to build a dam in 1958.”
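A quick back-of-the-envelope check of that price, assuming the conventional 640 acres to the square mile:

\[
828{,}000 \times 640 \approx 5.3 \times 10^{8} \text{ acres}, \qquad \frac{\$15{,}000{,}000}{5.3 \times 10^{8} \text{ acres}} \approx \$0.028 \text{ per acre}
\]

– a little under three cents an acre.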
Apparently, Napoleon never fully absorbed the meaning of this loss. In 1812, at the age of 43, he overreached again, with even more disastrous results. In deciding to venture 1500 miles to invade Russia, he was ambushed coming and going by microbes. On his tortured path to Moscow, he lost 1/3 of his army (120,000 men, or 4000 each and every day) to epidemic dysentery caused by drinking water and food soiled with the Shigella bacteria. If that wasn’t bad enough, in his winter-time retreat from Moscow, he lost most of the remaining troops to a different pathogen. This one was transmitted not by human aerosol droplets or fecal contaminants, but by the bites of body lice clinging to the heavy garments that kept the soldiers from freezing to death on their 83-day march home. The body louse, Pediculus humanus, which feeds almost exclusively on human blood, was the primary carrier of the bacteria Rickettsia prowazekii. While the bacteria’s name may not be familiar, the disease it causes is – Typhus. Over half of those infected died.
As historian Frank Snowden summarized, “Just as yellow fever in Saint-Domingue stopped the western expansion of Napoleon’s empire, so dysentery and typhus halted its advance to the east. Indeed, the two diseases played a major role in causing regime change in France. After the Russian fiasco, Napoleon was permanently weakened and never able again to construct an army of comparable power.”
In a remarkably short period, European aggressors managed to topple a wide range of rich governments, cultures and economies throughout the Americas. The destruction of the Aztec civilization began in 1519, only 27 years after the arrival of Columbus. It took 40 years of assault, beginning in 1532, to eliminate the Inca Empire. In both cases, disease was the final determinant.
A wide range of diseases came with the Columbian Exchange. The one best documented by the native peoples, because of the associated disfigurement, was smallpox. As one Aztec wrote, “There spread over the people a great destruction of men. [Pustules] were spread everywhere, on one’s face, on one’s head, on one’s breast. There was indeed perishing, many indeed died of it…and many just died of hunger. There was death from hunger. There was no one to take care of another. There was no one to attend to another.”
The imperial assault on Native Americans followed a pattern similar to that suffered by their neighbors to the south. The difference was the length of the engagement, which lasted several centuries and arguably persists to this day, as evidenced by marked health disparities among Native Americans. The well-documented Trail of Tears, the forced relocation of the Cherokees in 1838, carried with it a 25% mortality, not only from epidemic dysentery but also from a range of communicable diseases like whooping cough, caused by the respiratory bacteria Bordetella pertussis and transmitted by coughing or sneezing.
In North America, European domestic animals once again were effective microbe transporters. Cattle and horses arrived in Virginia in 1620 and in Massachusetts in 1629. By the early 1630’s, smallpox had resulted in the mass extermination of Algonquins in Massachusetts. As early as the 1580’s, there is documentation of Native Americans adjacent to the Virginia colony at Roanoke “beginning to die quickly…the disease so strange that they neither knew what it was, nor how to cure it.” As in South America, the loss of adult caregivers magnified the death toll. As reported from the Plymouth colony, colonists and Native Americans “fell down so generally of this disease as they were in the end not able to help one another, no not to make a fire nor fetch a little water to drink, nor any to bury the dead.”
The pattern continued unabated into the next century as the western expansion of European settlers and their domesticated animals encountered native tribes. In 1728, it was the Cherokees. In 1759, the Catawbas. In 1800, 2/3 of the Omahas perished. In 1838, nearly the entire Mandan tribe and half of the peoples of the High Plains disappeared.
While Native Americans were especially vulnerable due to lack of immunity to microbes arriving with travelers from Europe, Europeans had their own vulnerabilities, including lapses of immunity and knowledge. A prime example was Yellow Fever, caused by a Flavivirus transmitted by the Aedes aegypti mosquito. African slaves, imported to work the Caribbean plantations, were largely immune to this disease, endemic in their homelands. This made them effective asymptomatic carriers when they arrived at the insect-plagued plantation fields. Products from those plantations were carried by ships to the major American cities, and along with the products came swarms of infected mosquitoes.
When Yellow Fever broke out in Philadelphia in 1793, shortly after the arrival of a trading ship from Saint-Domingue, among residents with no immunity, the main response was panic, fear, and mass evacuation from the city. Thousands of citizens fled, including Alexander Hamilton and his wife, and roughly 5,000, a tenth of the population, would die. Experts were at a loss to explain the cause, and were even more confused about how to treat the disease.
The city at the time harbored two branches of medicine – the Heroics and the Homeopaths. Benjamin Rush led the Heroic branch of medicine, believing the problem involved a disturbance of the four humors – blood, phlegm, black bile and yellow bile. His solution was the rather liberal and barbaric use of cathartics and bloodletting. For the most severe cases, Rush set up the first “fever hospital” in the new nation.
His intellectual opponent was the father of Homeopathic Medicine, Samuel Hahnemann. His guiding principle was that “likes are cured by likes.” This translated into delivering primitive pharmaceuticals to suffering patients that would cause a milder version of the symptoms than those they were currently suffering. In the body’s response, he reasoned, would come a kinder and gentler cure than what Rush was proposing.
Neither doctor was especially successful in halting the 1793 catastrophe, let alone explaining its cause. But in expounding their theories, each became famous and ignited a 150-year struggle between the two branches. The Heroics’ descendants ultimately launched the American Medical Association in 1847, while the homeopaths organized as the American Institute of Homeopathy. It would take nearly another century for the AMA to eliminate its competitor.
As for the explanation of Yellow Fever, that would elude scientists until late in the 19th century. In the meantime, outbreaks of the viral disease continued to haunt American cities. 730 died in New York City in 1795. 5000 perished in Boston, New York and Philadelphia in 1798. Baltimore suffered 1,200 deaths in 1800. 2000 were lost in Norfolk in 1855, and 20,000 died in the Mississippi Valley in 1878.
Finally in 1898, when an outbreak left hundreds dead in Cuba, the U.S. Army established the Yellow Fever Board led by Walter Reed. This was a full 12 years after Cuban scientist Carlos Finlay had presented a paper, “The Mosquito Hypothetically Considered as the Transmitting Agent of Yellow Fever,” to the Havana Academy of Science, where it was widely ridiculed. It took two decades, and the work of Walter Reed’s team, to confirm Finlay’s observations, and 25 years before the offending microbe, a virus, was confirmed. As for prevention, the vaccine against Yellow Fever was finally created in 1930 by Harvard virologist Max Theiler, and earned him a Nobel Prize.
When it comes to the history of epidemics, fear and ignorance have always been major obstacles. In the absence of knowledge, leaders often relied on common sense. One of the earliest terms used for epidemics, the “plague,” was nearly synonymous with terror. In many cases, too few people were left to bury the dead, let alone care for vulnerable survivors.
Historians note that these mysterious waves of disease tended to stimulate rational, rather than religious or mystical, responses. As historian Frank Snowden noted, “Healers abandoned incantations, spells and sacrifices…eliminated exorcism, and renounced the appeasement of the gods.” In his book, “Epidemics and Society,” Snowden extensively reviews the human response to Bubonic Plague, caused by the bacteria Yersinia pestis and carried by rat fleas. The bacteria rapidly invade the lymph system, causing swollen nodes or “buboes,” from which the disease’s name is derived.
Occurring in three deadly waves, outbreaks spanned nearly 1500 years. The first major outbreak, the Justinian Plague, named after Emperor Justinian I, began in the Nile Delta, lasted 200 years (541-755), and killed 20 to 50 million people. The second, the “Black Death,” named for the typical bodily discoloration in the dead and dying, lasted 500 years (1330-1830) and traveled along Mediterranean trade routes, with outbreaks in Florence (1348), Milan (1630), Naples (1656), London (1665), the Netherlands (1710), France (1720), and Italy (1743). The third went global and lasted 40 years (1855-1894), striking nodal cities worldwide including Honolulu and San Francisco, and killed 20 million.
Humans took a surprisingly long time to put two and two together. The initial response, of doctors and patients alike, was flight. But cities also organized the first boards of health and exercised quarantines and blockades. The word quarantine derives from the Italian quaranta giorni, meaning forty days. The word was first used in Venice where, without knowing the cause, officials recognized that an outbreak had coincided with the arrival of a trade ship and a “portable something.” Thereafter, ships with sick mates were required to anchor in the harbor for forty days before being cleared to disembark. The quarantines were often reinforced with naval blockades. On dry ground, the sick and those exposed to the sick were confined within boundaries called sanitary cordons and detained and isolated in pesthouses.
As one would expect, cities were always the most vulnerable, not only due to congregate crowding, but also because of general filth and primitive infrastructure. This began to change with the arrival of the “Sanitary Movement,” championed by hospitals in Germany and France in the early 19th century. England translated the movement into policy in 1842 based on the “Report on the Sanitary Condition of the Labouring Population of Great Britain” (known simply as “The Sanitary Report”), which detailed the filthy working and living conditions throughout the nation. The report was written by a Manchester-born barrister named Edwin Chadwick.
Leaders were beginning to connect the dots, tying cause and effect. Thus filth led to poverty, and poverty to lost economic productivity and societal decline. As one expert described, “The social conditions that Chadwick laid bare mapped perfectly onto the geography of epidemic disease…filth caused poverty, and not the reverse…Filthy living conditions and ill health demoralized workers, leading them to seek solace and escape at the alehouse. There they spent their wages, neglected their families, abandoned church, and descended into lives of recklessness, improvidence, and vice. The results were poverty and social tensions…Cleanliness, therefore, exercised a civilizing, even Christianizing, function.”
Chadwick drew on the science of his day. William Harvey had described the human circulatory system two centuries earlier, and Chadwick seized on the metaphor to describe his vision of plumbing infrastructure for England’s urban encampments. As others described the plan, “Chadwick referred to his famous reform as an ‘arteriovenous system.’ The circulatory analogy made the system easy to explain and emphasized the point that it was necessary for life itself. Newly laid water mains – the arteries of the system – would deliver an abundant supply of clean water – the first coefficient of health – to every town and city in Britain. Connector pipes as capillaries would then deliver water to every household, where…’ready accessibility’ would allow cleansing and avoid neglect…running water and flush toilets…safely carrying away waste and allowing the streets to be cleaned up.”
Chadwick was aided by advances in technology and modern consumer marketing. As the industrial age advanced, steam-powered machines carved the pathways for miles of pipes. In homes and tenements, indoor plumbing appeared for the first time. While George Jennings was the first to patent the flush toilet in 1852, it was the master marketer, Thomas Crapper, who gave it its crude name and mass merchandised it to the general public. Until its arrival, regular outbreaks of cholera, a deadly diarrheal disease caused by food and water contaminated with the bacteria Vibrio cholerae, were common.
Cholera was a scourge for more reasons than one. It signified human failure and disgrace. Here’s one description: “Cholera was an irredeemably filthy, foreign and lower-class disease. A cholera epidemic was degrading, vulgar, and stigmatizing, both for the victims and the society that tolerated such squalor and poverty in its midst.”
During this period, medical science was on the march as well. The germ theory was fleshed out as a byproduct of organized communities of learning, growing scientific literacy, and accepted principles for experimentation and scientific validation. Learning was additive, as illustrated by a threesome termed the “famous trio” – Louis Pasteur, Robert Koch, and Joseph Lister.
I. Louis Pasteur (1822-1895) was a chemist (not a biologist or physician) who famously stated that “Where observation is concerned, chance favors only the prepared mind.” In the 1850s, while investigating spoilage of wine and milk, he developed the theory that putrefaction and fermentation were the result of bacterial processes, and that these degrading actions could be altered by heat. Heat destroyed the bacteria and prevented spoilage (i.e., “pasteurization”). This insight launched the field of “microbiology”.
Pasteur also helped define the principles of “nonrecurrence” (later called “acquired immunity”) and “attenuation,” a technique to render disease-causing microbes harmless so they could be introduced as vaccines.
II. Robert Koch (1843-1910) was a physician 20 years younger than Pasteur. He investigated Anthrax in the mid-1870’s, having trained at the University of Göttingen. Luckily the bacterium was very large and visible with the microscopes of the day. He was able to reproduce the disease in virgin animals by injecting blood from infected sheep. But, in addition, he detected and described spores formed by the bacterium, which were left in fields where infected animals grazed. The spores, he declared, were why virgin animals grazing in these fields became infected.
Teaming up with the Carl Zeiss optical company, Koch focused on technology, improving magnification lenses, developing specialized specimen stains, and developing fixed culture media to grow microbes outside of animals. Armed with new tools and stains, he discovered Mycobacterium tuberculosis, proved its presence in infected tissues, and described his findings in “The Etiology of Tuberculosis” in 1882.
“Koch’s Postulates” became the 4 accepted steps constituting scientific proof that a particular microbe causes a particular disease:
1) The microorganism must be found in infected tissue.
2) The organism must be grown in fixed culture.
3) The grown organism must instigate infection in a healthy laboratory animal.
4) The infective organism must be re-isolated from the lab animal, and proven identical to the original microbes.
III. Joseph Lister (1827-1912), the third of our trio, was a professor of surgery at Edinburgh. At the time, the major complications of surgery were pain, blood loss and deadly post-operative infections. In the 1840s, ether and nitrous oxide were introduced, controlling intra-operative pain. As for infection, Lister suggested scrubbing hands before surgery, sterilizing tools, and spraying carbolic acid into the air and onto the patient’s wound. Koch took an alternate approach, advocating sterile operating theaters and surgeons in gowns, gloves, and masks. The opposing techniques merged and became common surgical practice in the 1890s.
For New York City, Chadwick’s sanitary movement and the “famous trio’s” germ theory couldn’t arrive soon enough. The Civil War had been a wake-up call. Of the 620,000 military deaths, 2/3 were from disease. At the top of the list were dysentery and diarrheal disease, followed by malaria, cholera, typhus, smallpox, typhoid and others.
One Columbia University historian noted that “As New York City ascended from a small seaport to an international city in the 1800s, it underwent severe growing pains. Filth, disease, and disorder ravaged the city to a degree that would horrify even the most jaded modern urban developer.”
At the beginning of the 19th century, New York was home to only 30,000 citizens. But thereafter, the population doubled every ten years, so that, at the beginning of the 20th century, 4 million people crowded into Manhattan south of 57th street. In 1880, the infrastructure could only be described as primitive, and the public square unhealthy and congested with man and beast, most notably horses. Some 200,000 horses lived in the city at the time, producing nearly 5 million pounds of manure a day. Countrywide, cities were home to 3½ million horses.
These, by any definition of the words, were primarily “work horses,” pulling all manner of goods, trolleys, and passengers up and down the teeming streets. Supporting the horses in 1880, as long as they lived, were 427 blacksmith shops, 249 carriage companies, 262 wheelwright shops, and 290 vendors selling saddles and harnesses. The feeding, grazing and housing of these animals was spotty at best. This, along with the labor load, helps explain why the average lifespan of a New York City horse in 1880 was 2½ years.
They were literally “worked to death.” 15,000 died on city streets in 1880. But dying didn’t come with a decent burial. Their average weight at death was 1,200 pounds, making their swift removal impossible. As a result, they were left to rot in the street where they fell, until their bodies fell apart and could be carted away, or were eaten by scavenging rodents and animals. Through much of the 1800’s, pigs, sheep, cattle, and dogs wandered freely on the city streets.
Until bridges were built from Manhattan to Queens, the manure in New York gathered knee-deep in downtown streets. After that, some effort was made to collect and cart the waste to Queens, where it piled up in “manure blocks,” that is, entire city blocks reserved for this purpose. Street-sweeping machines and a formal Sanitary Department were already in place, but largely overwhelmed.
Human waste was handled with casual disregard as well. There were no sewers or flush toilets. Garbage was regularly dumped out windows onto the city streets. Chamber pots or basins were used to collect “night soil,” which was supposed to be dumped into an outdoor privy, but a window was more convenient for upper-floor tenement dwellers. Medical authorities were increasingly vocal, blaming “a combination of certain atmospheric conditions and putrefying filth” for the outbreaks of cholera, smallpox, yellow fever, and typhoid.
The noise level was deafening – loud enough to cause the city to pass an ordinance outlawing wagons with iron-shod wheels in 1865. As a new century dawned, motorized vehicles looked better and better. Certainly there was an economic argument made by commentators at the time. One wrote, “It is all a question of dollars and cents, this gasoline or oats proposition. The automobile is no longer classified as a luxury. It is acknowledged to be one of the great time-savers in the world.” And there was an equal argument to be made for public health, where increasingly the horse was viewed as a “public health menace” and the origin point for outbreaks of cholera, typhoid, and dysentery.
By the 1920’s, motor cars and trucks had overtaken the horse, and sewer systems and indoor plumbing were no longer a novelty. But the first two decades of the 20th century were a struggle with infectious diseases for public health officials in many American cities. Infants were especially vulnerable. Deaths in children under 5 represented 40% of all deaths in 1900. The leading causes were pneumonia, dysentery, tuberculosis, and diphtheria. The offending animal this time wasn’t the horse, but rather the cow.
This story begins with a young radicalized German doctor named Abraham Jacobi. Later known as the father of American pediatrics, Jacobi left Berlin for New York after being jailed for two years for high treason during the political turmoil of the early 1850s in his native land. An early advocate of birth control and a socialist who corresponded with Karl Marx, Jacobi followed Louis Pasteur closely and saw great promise in his discovery of the benefits of pasteurization.
On arrival in the U.S., Jacobi fought the notion that raw milk was safe for infants and took up the challenge of convincing a skeptical public that heating milk until bubbles appeared would save lives. At the time, dairy farmers in New England, to save money, were using alcohol distillery refuse or swill as feed for their cows. This caused intestinal and stomach ulcers in the cows and a range of secondary infections.
The thin, translucent swill milk sold in tenements for as little as 6 cents a quart, but it was teeming with microbes. Jacobi quickly recognized milk as the source point for New York City epidemics of diphtheria, tuberculosis, typhus, dysentery and more. He decided to take action.
If Jacobi is often credited with bringing pasteurization to the United States, a close friend and fellow German émigré named Nathan Straus was arguably its biggest booster. In 1892 the New York City businessman and philanthropist opened the Nathan Straus Pasteurized Milk Laboratory and soon introduced the first low-cost pasteurized milk depots for the city’s poor. At the time, infant mortality stood at 240 per 1000 live births.
The pasteurization plant found that heating the milk to 146 degrees Fahrenheit for 30 minutes, followed by immediate cooling to 40 degrees, rendered it free of deadly germs and safe for consumption. In 1909, Chicago became the first major city to outlaw non-pasteurized milk, and New York City, after a deadly typhoid epidemic, followed suit 5 years later. Within seven years, infant mortality declined by 70%.
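Taking those two figures together – and assuming the decline is measured against the 240-per-1,000 baseline mentioned above – the implied arithmetic is:

\[
240 \times (1 - 0.70) = 72 \text{ deaths per 1,000 live births}
\]

a drop from roughly one infant death in four to about one in fourteen.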
During this same period, the science of vaccination grew by leaps and bounds. The disease that sparked the beginnings of vaccination first appeared 10,000 years ago. Marks of the disease, Smallpox, appear on the mummified head of the Egyptian pharaoh Ramses V (died 1156 BC). In AD 108 it was linked to the first stage of the decline of the Roman Empire and reportedly killed nearly 7 million. By the 5th century, the disease had spread throughout Europe.
With the Columbian Exchange, Smallpox rather than the fighting skills of conquistadores led to the destruction of the Aztecs and Incas. The same disease ravaged native American populations, and then circled back to Europe, killing an estimated 400,000 per year with reported case fatality rates (CFR) ranging from 20% to 60%, and rising to 80% in infants. One bright light was that survivors of the disease appeared to be immune to re-infection. In 430 BC, communities took advantage of this fact, recruiting survivors to serve as caregivers to those actively infected.
By the 18th century, beginning in Turkish harems, a practice called variolation, or superficial inoculation by scraping the skin with smallpox soiled needles, came into vogue. 2% to 3% died from the practice, but this was some 20 times less than the mortality rate for natural infections. By introducing minimal exposures, most survived and were rendered immune.
The practice received continent-wide exposure thanks to the English aristocrat Lady Mary Wortley Montagu, whose husband was ambassador to the Ottoman Empire. She herself had contracted the disease, leaving her exquisite face disfigured. The disease had also claimed her brother, after which she had the embassy surgeon inoculate her 4- and 5-year-old children. Upon her return to England, she supported experiments on prisoners and orphaned children which proved the practice’s success. In England’s structured environment, variolation then spread fairly rapidly.
Its popularization in the North American colonies was far slower. An early convert was the Rev. Cotton Mather, who took action after a West Indies ship ignited an epidemic in Boston in 1721. He enlisted the noted physician Dr. Zabdiel Boylston, and together they stirred up so much controversy that Mather’s home was bombed. George Washington was one of the first to see variolation’s utility in times of war. The British-led troops he met in Quebec in 1776 were all protected by prior variolation. Washington’s troops were not, and he was forced to retreat when the disease ravaged his camps.
On February 5, 1777, Washington did an about face, with this explanation:
“Finding the Small pox to be spreading much and fearing that no precaution can prevent it from running through the whole Army, I have determined that the troops shall be inoculated. The Expedient may be attended with some inconveniences and some disadvantages, but yet I trust in its consequences will have the most happy effects.”
Ironically, the “advantage” that Washington sought could be attributed scientifically to a British physician, Edward Jenner, who observed that milkmaids with cowpox (caused by a relative of the smallpox virus) were immune to a crossover of the human version of the disease. He took a sample from the pustulous arm of a milkmaid, Sarah Nelmes, and inoculated an 8-year-old boy, James Phipps. The injection, which he termed “vaccination,” resulted in protection from subsequent deliberate attempts to infect the boy with live smallpox. Better yet, the original inoculation yielded few symptoms.
As one expert wrote, “Jenner’s work represented the first scientific attempt to control an infectious disease by the deliberate use of vaccination. Strictly speaking, he did not discover vaccination but was the first person to confer scientific status on the procedure and to pursue its scientific investigation.”
For another century following the Revolutionary War, mandatory vaccination remained highly controversial in America. In 1905, Massachusetts was one of 11 states that required compulsory vaccination. The Rev. Henning Jacobson, a Lutheran minister, challenged the city of Cambridge, MA, which had passed a local law requiring citizens to undergo smallpox vaccination or pay a $5 fine. Jacobson and his son claimed they had previously had bad reactions to the vaccine and refused to pay the fine, believing the government was denying them their Fourteenth Amendment due process rights.
In deciding against them, Justice John Marshall Harlan wrote, “In every well ordered society charged with the duty of conserving the safety of its members the rights of the individual in respect of his liberty may at times, under the pressure of great dangers, be subjected to such restraint, to be enforced by reasonable regulations, as the safety of the general public may demand … liberty for all could not exist under the operation of a principle which recognizes the right of each individual person to use his own [liberty], whether in respect of his person or his property, regardless of the injury that may be done to others.”
Slam dunk, except for two things:
1. In 1927, the Jacobson decision provided judicial cover for Buck v. Bell. That case centered on a young Virginia woman whom supporters of the Eugenics movement accused of being the product of a long line of “mental deficients.” They asked the courts to impose involuntary sterilization. In his decision, Justice Oliver Wendell Holmes wrote, “The principle that sustains compulsory vaccination is broad enough to cover cutting the Fallopian tubes (Jacobson v Massachusetts, 197 US 11). Three generations of imbeciles are enough.” Based on that decision, 24 states moved forward and sterilized some 60,000 people in the decades that followed.
2. A state’s right to legislate compulsory public health measures does not require it to do so. In fact, as we have seen in Texas and Florida among others, states may decide to do just the opposite – declare life-saving mandates (for masks or vaccines) to be unlawful. At least 14 Republican-controlled states have already passed laws barring employer and school vaccine mandates and imposing penalties.
Back in 1905, when Justice Harlan was siding with mandatory vaccination, New York City public health officials were still 9 years away from requiring all milk to be pasteurized. Unpasteurized raw milk was a carrier for a range of microbes capable of unleashing epidemics. One especially dreadful disease, due to its lethality and ability to target infants, was diphtheria. The disease was named by a French physician, Pierre Bretonneau, in 1821. The French diphtérite derives from the Greek word diphthera, meaning leather. This referred to the disease’s “signature physical feature, a thick, leathery buildup of dead tissue in a patient’s throat.”
As described in Paul de Kruif’s classic 1926 book, Microbe Hunters: “The wards of the hospitals for sick children were melancholy with a forlorn wailing; there were gurgling coughs foretelling suffocation; on the sad rows of narrow beds were white pillows framing small faces blue with the strangling grip of an unknown hand.”
In the 1880’s in New York, Abraham Jacobi and his physician wife, Mary Putnam Jacobi, were focused on diphtheria, which would ultimately claim their 7-year-old son, Ernst. The same year as his death, 1883, the bacteria Corynebacterium diphtheriae was identified by Prussian pathologist Edwin Klebs, and five years later the deadly toxin produced by the microbe was isolated.
A German scientist, Emil von Behring, had been experimenting with tetanus toxin, repeatedly injecting it into rabbits and mice and then harvesting it from the experimental animals. He established that weakened or “attenuated” toxins could stimulate an effective immune response in humans. At the time diphtheria was killing 60,000 children a year in the German empire. Using a similar technique with the diphtheria toxin, von Behring developed a commercial anti-toxin in 1894 for which he was awarded the first ever Nobel Prize in Medicine in 1901. Within a year of its purification, New York’s Board of Health had procured 25,000 doses.
New York City took a unified approach to epidemic control at the time. As we’ve seen, the city attacked filth in streets and tenements, and aggressively pursued modern infrastructure and plumbing. It embraced scientific advances, deliberately addressing both cause, through actions such as mandatory pasteurization in 1914, and solution, through procurement of both tetanus and diphtheria anti-toxin with support from the American Red Cross and the Metropolitan Life Insurance Company. And it relied on traditional methods of quarantine, placing isolation placards on tenements in response to outbreaks of measles, scarlet fever and diphtheria.
The much-maligned city horse took on hero status as an experimental reservoir for the natural production of anti-toxin against both tetanus and diphtheria. The drug was produced by injecting horses with progressive doses of toxin, then harvesting the horses’ antibody-rich serum, purifying it, and injecting it into humans to produce immunity. Though the horse was largely successful as a source animal, significant missteps were heavily publicized and led to groundbreaking legislation. Selling toxic drugs had never been a good business model, but in fact the practice was not actually prohibited until Congress passed the Biologics Control Act of 1902, and even then the death of a five-year-old St. Louis girl was needed to spur action. The little girl had come down with tetanus a few days after receiving a diphtheria antitoxin. Shortly after the girl’s death, her two siblings met a similar fate, and an investigation revealed that a donor horse had tetanus.
These deaths caused an uproar, and the ensuing public pressure led to the 1902 law that regulated the sale of serums and toxins involved in interstate or foreign commerce. An ill-prepared US Public Health Service was put in charge of licensing all producers of biological products applicable to human disease. While the horse played an historic role in combating childhood diphtheria, another animal, the dog, received top billing thanks to imaginative storytelling, popular print journalism, and statuary flourish.
In the closing days of 1924, Alaskan physician Curtis Welch detected a growing epidemic of diphtheria among Native Americans in Nome. He quickly ordered a shipment of antitoxin, but the harbor was already frozen over. As the epidemic gained steam, Welch pressed for some way to ship the 300,000 waiting units from Anchorage, 674 miles away, where they sat in storage. In response, Governor Scott Bone organized a relay of 20 volunteer dog-sled teams. Newspapers picked up and dramatized the story, as did Disney in the film Togo. But the highest honor went to the lead dog on the final leg, Balto, whose bronze statue stands proudly to this day in New York’s Central Park. The effort is also remembered each year (since the late 1960’s) in the epic sled dog race, the Iditarod, which retraces the route taken to Nome in 1925.
Another Alaskan village weighs heavily on the history of epidemics in America – Brevig Mission. This small ocean-side village was home to several hundred Inuit Natives in 1918. In a 5-day period, between November 15 and 20, 72 of the 80 adult inhabitants perished. The tragedy was part of a larger disaster, the 1918 Flu Pandemic, which claimed 675,000 lives in the U.S. and an estimated 50 million worldwide while infecting one third of the world’s population. An H1N1 virus, like the one behind the swine flu pandemic nearly a century later in 2009, it was especially lethal in young adults aged 15 to 34. In the absence of a vaccine, and without access to antibiotics to treat opportunistic bacterial infections, with a war raging and large military encampments and crowded troop ships promoting transmission, this singular event overnight dropped life expectancy in America by 12 years.
Ultimately, surviving Americans recovered after two years of masking, quarantines, and restricted movements. There followed the creation of additional vaccines and a universal program of infant vaccination in the United States in the 1920’s and 1930’s. By 1940, efficient delivery and defined schedules were mandated. One injection, DPT, combined diphtheria (D) and tetanus (T) toxoids with proteins derived from the pertussis (P) bacteria that cause whooping cough.
By the 1950’s, led by the FDR-inspired March of Dimes, the Salk and Sabin vaccines came into play, finally addressing the scourge of Polio, caused by a virus that targets motor nerves and that affected 52,000 Americans, mostly children, in 1952.
As for that Alaskan village, Brevig Mission, its mass gravesite, preserved in permafrost, remained untouched until 1951, when Johan Hultin, a 25-year-old Swedish microbiologist and Ph.D. student at the University of Iowa, was granted permission to excavate the site in order to obtain lung tissue from one of the victims. Hultin was unsuccessful in his attempts to grow the virus once he thawed the frozen tissue. But forty-six years later, in 1997, the genomic structure of the virus’s RNA began to be unraveled by scientist Jeffrey Taubenberger. His specimen came from a 21-year-old South Carolina serviceman who died of the disease on September 20, 1918. On reading a review of the work, Hultin, now 72, contacted Taubenberger, and under the auspices of the Armed Forces Institute of Pathology, they returned together to Brevig Mission and were able to obtain frozen lung specimens that definitively identified the killer microbe as the avian-origin 1918 H1N1 influenza virus.
By now, America’s scientists were beginning to declare victory over infectious diseases. General George Marshall had gotten the ball rolling when he declared in 1948 that we now had the means to eradicate infectious disease. Seven years later, Rockefeller Foundation scientist Paul Russell, who along with Fred Soper had championed the use of DDT to eradicate the mosquitoes that carried the microbes causing Yellow Fever and Malaria, published “Man’s Mastery of Malaria,” recommending a global spraying campaign – which Rachel Carson successfully opposed.
Eight years after that, in 1963, Johns Hopkins scientist Aidan Cockburn published his seminal piece, “The Evolution and Eradication of Infectious Diseases,” in which he memorably declared, “With science progressing so rapidly, such an endpoint (of infectious diseases) is almost inevitable.” And finally, in 1969, Surgeon General William H. Stewart declared with complete confidence that it was time to “close the book on infectious diseases.”
Yale historian Frank M. Snowden explained in his book, Epidemics and Society, that the two decades following the end of WW II were years of “social uplift.” This was a period that marked progress (for the fortunate) in housing, wages, diet, and education. In infrastructure as well – from roads, to sewers, to water treatment plants, and safer manufacturing equipment – there was some justification for the self-congratulatory mood in the air.
The infectious diseases themselves seemed stalled, static, relatively benign and historic. Plague had yielded to sanitary cordons, isolation, and quarantine. Water and sewer management had neutralized the threat of cholera in most locations. DDT, paired with quinine, had defanged malaria. And vaccines for just about every nasty childhood disease were now required for school entry. As Snowden describes, we “fell victim to historical amnesia.”
When HIV arrived in the early 1980’s, it proved every stereotype about the manageability of infectious diseases false. Here was a brand new infection, impacting both the developed and developing world, which spread rapidly far and wide, carried a devastating and torturous kill rate, and ignited a wide range of associated opportunistic infections.
On June 5, 1981, the CDC issued its first report in the Morbidity and Mortality Weekly Report but posted it on page 2, with no mention of homosexuality in the title. The header read simply: “Pneumocystis pneumonia—Los Angeles.” But the editorial note at the bottom of the entry issued a clear warning: “Pneumocystis pneumonia in the United States is almost exclusively limited to severely immuno-suppressed patients. The occurrence of Pneumocystosis in these 5 previously healthy individuals without a clinically apparent underlying immunodeficiency is unusual. The fact that these patients were all homosexuals suggests an association between some aspect of a homosexual lifestyle or disease acquired through sexual contact and Pneumocystis pneumonia in this population.”
On April 13, 1982, three months into the tenure of the new Surgeon General, C. Everett Koop, Congressman Henry Waxman held the first congressional hearings on the implications of this new disease. The CDC testified that very likely tens of thousands were already infected. On September 24, 1982, the condition was for the first time identified as acquired immune deficiency syndrome, or AIDS.
Koop had been successful in his first months on the job in raising the esprit de corps of the 5,600-person Public Health Corps, who were now required to wear uniforms to enhance their visibility and, ideally, contribute to the long-term funding viability of the institution. But in the most pressing public health challenge of the day, HIV/AIDS, the Public Health Corps was ordered by the President to stand down.
President Reagan never uttered the terms “HIV” or “AIDS” in public for six more years. As more and more people died—not only gays but recipients of blood transfusions, intravenous drug users, newborns of infected mothers, and a surprising number of Haitians—Reagan’s silence became deafening.
Inside his administration, Reagan empowered people like Secretary of Education Bill Bennett, who discouraged providing AIDS information in schools, and Christian evangelical leader Gary Bauer, who served as his Domestic Policy Adviser, and whom Koop said “believed that anybody who had AIDS ought to die with it. That was God’s punishment for them.” Even William F. Buckley, the voice of a more genteel, old-school conservatism, suggested in a New York Times article that HIV-positive gay men have their disease status forcibly tattooed on their buttocks.
By October 1986, when Reagan first uttered the term “AIDS,” more than 16,000 Americans had died. Jerry Falwell had declared the disease “the wrath of God upon homosexuals.” Former Nixon speechwriter and conservative firebrand Pat Buchanan cruelly labeled the disease “nature’s revenge on gay men.”
On April 10, 1987, with a federal budget calling for an 11 percent cut in AIDS spending from the prior year, actress Elizabeth Taylor had had enough. She went beyond Washington game-playing and gained access to political power based on Hollywood star power and a long-standing friendship. She invited Ronald and Nancy Reagan to a dinner given by the American Foundation for AIDS Research. The Reagans accepted, and on the evening of May 31, 1987, with 21,000 Americans dead and 36,000 more living with a diagnosis of HIV/AIDS, Reagan delivered his first major address on the topic—six years late.
As the numbers of dead and infected rapidly rose, the public was becoming more and more fearful. Since a test had been developed to detect the virus in blood, politicians, and some hospitals and physicians, were calling for testing to become mandatory in the interest of protecting health care workers. This followed heavily publicized cases of death from HIV-tainted blood and fears that the entire US blood supply might be at risk.
On December 17, 1984, a 14-year-old hemophiliac from Kokomo, Indiana, named Ryan White had undergone a partial lung removal for severe consolidated pneumonia, after which he was diagnosed with HIV/AIDS. He had been infected while receiving an infusion of a blood derivative, factor VIII. When he was cleared to return to school, 50 teachers and more than a third of the parents of students from his school signed a petition asking that his attendance be barred. After the state’s health commissioner and the New England Journal of Medicine reinforced that Ryan White’s disease could not be spread by casual contact, he was readmitted in April 1985.
Koop clearly understood that continued inaction on his part would be unacceptable. He went far and wide collecting data without exposing his own bias. He interviewed AIDS activists, representatives from medical and hospital associations, Christian fundamentalists, and politicians from both sides of the aisle, but he kept his cards close to his chest, and few knew exactly what he thought or planned.
On October 22, 1986, Koop’s Surgeon General’s Report on AIDS was officially released, and it was, at least to conservative Christian tastes, shockingly explicit. The surgeon general challenged parents and schools to discuss AIDS, to offer sex education in the public schools, and to promote the use of condoms for prevention. The report drew immediate criticism from conservatives, but nothing compared with the furor that arose 19 months later. Koop had hired the public relations firm Ogilvy and Mather to make certain he had the messaging, language, and imaging right. He then procured funding from private sources and from various branches of government to support the mass mailing (107 million copies, enough to fill 38 boxcars) of an eight-page pamphlet to every household in America. The huge print run required government printing presses to operate 24 hours a day for several weeks.
“Understanding AIDS” was frank and factual. The pamphlet promoted sex education beginning in elementary school and challenged the current messaging of the televangelists with this comment: “Who you are has nothing to do with whether you are in danger of being infected with the AIDS virus. What matters is what you do.”
The medical community applauded loudly, as did the press and the majority of the public. When Koop’s original religious patrons and their captive senators went after him, he responded, “I’m the nation’s doctor, not the nation’s chaplain.”
Absent a vaccine, nearly 40,000 Americans a year continue to contract HIV/AIDS, and roughly 16,000 die each year. Nearly 1.2 million U.S. citizens live with HIV and survive thanks to preventive efforts and effective antiviral therapy. Overall, the disease has killed roughly 800,000 Americans.
Science has a way of punishing humans for their arrogance. In 1996, as treatment for HIV/AIDS was turning the corner, Dr. Michael Osterholm found himself rather lonely and isolated in medical research circles. This was the adrenaline-infused decade of blockbuster pharmaceuticals focused squarely on chronic, debilitating diseases of aging rather than pandemic infectious diseases.
And yet, there was Osterholm, in Congressional testimony delivering this message: “I am here to bring you the sobering and unfortunate news that our ability to detect and monitor infectious disease threats to health in this country is in serious jeopardy…For 12 of the States or territories, there is no one who is responsible for food or water-borne surveillance. You could sink the Titanic in their back yard and they would not know they had water.”
Osterholm’s choice of metaphor perhaps reflected his own frustration and inability to alter the course of the medical-industrial complex despite microbial icebergs directly ahead. As we’ve seen, for nearly a half-century, America’s scientists had been declaring victory over infectious diseases. From General George Marshall in 1948 to Surgeon General William H. Stewart, complete confidence was declared that it was time to “close the book on infectious diseases.”
In the wake of HIV/AIDS, the scientific community began to reverse course. In 1992, the Institute of Medicine (IOM) served notice with the publication of “Emerging Infections: Microbial Threats to Health in the United States.” Two years later, in 1994, the CDC declared, “The public health infrastructure of this country is poorly prepared for the emerging disease problems of a rapidly changing world.”
In 1998, the Department of Defense weighed in, saying, “Historians in the next millennium may find that the 20th century’s greatest fallacy was the belief that infectious diseases were nearing elimination. The resultant complacency has actually increased the threat.”
They personified these organisms as the enemy of mankind, explaining that there were “powerful evolutionary pressures on these micro-parasites.” Their analysis pointed to intense mixing of microbial gene pools, highly crowded and impoverished non-immune urbanized populations, growing high-speed travel (including almost 2 billion air passengers worldwide that year), populations displaced and made vulnerable by warfare, the absence of health care services in many areas, and growing environmental degradation. And in the middle of this human mess were tens of thousands of different viruses and some 300,000 different bacterial species capable of attacking humans.
In a JAMA article in 1996, Nobel Laureate Joshua Lederberg alerted the public that our fight with microbes was far from over, and that the odds were severely tipped in the microbes’ favor. The 1992 IOM report he had helped author noted that microbes outnumber us a billionfold and mutate a billion times more quickly than we do. “Pitted against microbial genes,” Lederberg wrote, “we have mainly our wits.” He coined the term “emerging and reemerging diseases” to encompass historic infectious diseases as well as newcomers like HIV/AIDS.
Eradication of infectious diseases was now a dream of the past. We had been warned and re-warned. But as Ebola and SARS arrived in the early days of the new millennium, the scientific communities in the U.S. and around the world were anything but sure-footed. Slowly, policy leaders were awakening to the global nature of the threat. The George W. Bush administration created the President’s Emergency Plan for AIDS Relief (PEPFAR) in 2003 and the President’s Malaria Initiative (PMI) in 2005.
The WHO promoted early detection and notification obligations after China delayed notifying the world of its 2002 detection of SARS, dragging its feet for almost four months. In 2004, British Columbia, Canada, ordered the destruction of 19 million birds on 20 poultry farms to halt an H7N3 strain of bird flu. A cousin strain of the virus had already jumped to humans in Asia and caused 23 deaths.
This brings us full circle back to 1997 and the elucidation of the RNA genomic structure of the 1918 virus by the scientist Jeffrey Taubenberger. Johan Hultin, who as a 25-year-old Swedish microbiologist and Ph.D. student at the University of Iowa had first attempted to recover the virus from victims buried in Alaskan permafrost in 1951, was now 72; he organized a return to Alaska to collect additional remains. The new samples definitively proved that both Taubenberger’s tissue from South Carolina and Hultin’s from Alaska were infected by the same H1N1 virus.
The next, highly controversial step was to attempt to reverse engineer the virus back to life. The microbiologist assigned to the task was Terrence Tumpey, an investigator who had come to the CDC from the Department of Agriculture. He successfully brought the virus back to life in 2005, working in a CDC BSL-3 lab (the second-highest biosafety level). A BSL-3 laboratory contains primary and secondary barriers. Investigators wear a powered air-purifying respirator (PAPR), double gloves, scrubs, shoe covers, and a surgical gown, and shower before exiting the laboratory. The work is conducted within a certified Class II biosafety cabinet (BSC), which prevents any airflow escape into the general circulation.
Why did they decide to take the risk of recreating such a deadly germ? The answer goes back to Dr. Lederberg’s warning that, in the battle between microbes and humans, “we have mainly our wits.” These scientists made the judgment that they had to stay one step ahead of the germs. By recreating the virus, they could study it and probe for weaknesses that might be exploited in future viral outbreaks.
What did they learn from these experiments? The recreated virus multiplied so rapidly that it produced a viral load 50 times as great as that of everyday respiratory viruses, and it specifically and almost exclusively targeted lung tissue. These deadly attributes were the result of eight separate but contributory mutations in its genetic structure.
But the 2019 CDC report also predicted trouble ahead. It specifically raised the question of whether a 1918-level viral pandemic could run amok in modern times. As the report stated, “Many experts think so. One virus in particular has garnered international attention and concern: the avian influenza virus from China.” Even while elaborating on the wide range of progress in surveillance, testing, vaccines, and treatment, the authors admitted vulnerability.
From the report:
“While all of these plans, resources, products and improvements show that significant progress has been made since 1918, gaps remain, and a severe pandemic could still be devastating to populations globally. In 1918, the world population was 1.8 billion people. One hundred years later, the world population has grown to 7.6 billion people in 2018.
“As human populations have risen, so have swine and poultry populations as a means to feed them. This expanded number of hosts provides increased opportunities for novel influenza viruses from birds and pigs to spread, evolve and infect people. Global movement of people and goods also has increased, allowing the latest disease threat to be an international plane flight away.
“Due to the mobility and expansion of human populations, even once exotic pathogens, like Ebola, which previously affected only people living in remote villages of the African jungle, now have managed to find their way into urban areas, causing large outbreaks.
“If a severe pandemic, such as occurred in 1918, happened today, it would still likely overwhelm health care infrastructure, both in the United States and across the world. Hospitals and doctors’ offices would struggle to meet demand from the number of patients requiring care. Such an event would require significant increases in the manufacture, distribution and supply of medications, products and life-saving medical equipment, such as mechanical ventilators. Businesses and schools would struggle to function, and even basic services like trash pickup and waste removal could be impacted.”
That report turned out to be prophetic. In October 2019, three microbiologists working at the high-security viral labs in Wuhan, China, fell ill. The lab was directed by Shi Zhengli, an expert virologist who had trained in viral recombinant research with lead scientist Ralph Baric at the University of North Carolina.
Shi and Baric had teamed up in November 2015 to manipulate the crucial spike protein of the SARS virus, creating a “chimera” possessing genetic material from two different viral strains. At the time, other scientists were sounding alarms, including the Pasteur Institute’s Simon Wain-Hobson, who wrote, “If the virus escaped, nobody could predict the trajectory.”
The risky experiments, termed “gain-of-function” studies, were justified as super-secure, safe, predictive, and preventive. Shi Zhengli returned to her labs in 2018 and 2019 with grant funding from the National Institute of Allergy and Infectious Diseases. Funding such high-risk research was so controversial that the money was passed through an intermediary.
Their coordinator-in-chief was one Peter Daszak, a chartered power broker within the U.S. medical-industrial complex and president of the New York-based EcoHealth Alliance, which was a major funder of Shi Zhengli’s work in Wuhan.
Daszak is known for adopting militarized terms in the battle against global infectious diseases. In 2020 he wrote in the New York Times, “Pandemics are like terrorist attacks: We know roughly where they originate and what’s responsible for them, but we don’t know exactly when the next one will happen. They need to be handled the same way — by identifying all possible sources and dismantling those before the next pandemic strikes.”
Daszak argued that the risks involved in Shi Zhengli’s Wuhan bat virus research were justified because the work was defensive and preventive. The argument was convincing enough to the NIH and the Department of Defense that his EcoHealth Alliance received contracts, grants, and subgrants totaling well over $100 million between 2013 and 2020: $39 million from Pentagon/DOD funds, $65 million from USAID/State Department, and $20 million from HHS/NIH/CDC.
As veteran science reporter Nicholas Wade deciphered in a widely read article carried by The Wire Science, “For 20 years, mostly beneath the public’s attention, they [the virologists] had been playing a dangerous game. In their laboratories they routinely created viruses more dangerous than those that exist in nature. They argued they could do so safely, and that by getting ahead of nature they could predict and prevent natural ‘spillovers,’ the cross-over of viruses from an animal host to people.”
The experiments Shi directed were initially conducted in BSL-2 biosecurity labs, one level less rigorous than the BSL-3 facility that recreated the 1918 virus. Wade’s concern was that Shi was conducting experiments that might create and inadvertently release viruses with “the best combination of coronavirus backbone and spike protein for infecting human cells.”
In 2014, a group of concerned scientists calling themselves the Cambridge Working Group had issued this statement: “Accident risks with newly created ‘potential pandemic pathogens’ raise grave new concerns. Laboratory creation of highly transmissible, novel strains of dangerous viruses, especially but not limited to influenza, poses substantially increased risks. An accidental infection in such a setting could trigger outbreaks that would be difficult or impossible to control.”
We may never know for certain whether the virus escaped from Shi’s lab or emerged from Wuhan’s open-air wildlife markets in the same vicinity. But as Wade sees it, “The US government shares a strange common interest with the Chinese authorities: neither is keen on drawing attention to the fact that Dr. Shi’s coronavirus work was funded by the US National Institutes of Health.”
In America, now entering the third year of the Covid-19 pandemic, deaths are approaching 1 million, out of nearly 6 million worldwide. It is estimated that the infection has struck more than two-thirds of the human population. President Trump’s erratic response, his promotion of quack cures, and his discrediting of public health leadership hampered us from the start. As predicted, our health care system buckled under the demands imposed by the disease and, without adequate planning, was unable to efficiently share personnel or supplies in the early phases. Science has over-performed, delivering effective vaccines in less time than thought possible and manufacturing and distributing home tests nationwide. But the use of these lifesaving vaccines has become highly politicized, reawakening the same types of debates that occurred more than a century ago in Massachusetts during its smallpox epidemic.
In the middle of this pandemic, in 2021, Professor Joseph Wu and his colleagues at the WHO Collaborating Centre for Infectious Disease Epidemiology and Control published a paper in Nature Medicine titled “Nowcasting epidemics of novel pathogens: lessons from COVID-19.” As they explain, the term “nowcasting” broadly refers to “assessing the current state by understanding key pathogenic, epidemiologic, clinical and socio-behavioral characteristics of an ongoing outbreak.” At its most elemental level, the paper is a call for better preparedness. What are our vulnerabilities in 2022? They include high-speed travel, global warming, inadequate independent scientific oversight, worldwide conflict-driven migrations, poverty and inequity, and inadequate and inefficient health care systems. In summary, there remains a great deal to be done.
So what have we learned over the past hour by reviewing the epidemic history of America? Here is a short list:
- Epidemics, as historians have emphasized, are “social, political, philosophical, medical, and above all ecological” events.
- Competing and complementary species cycles in pursuit of nutrition and reproduction maintain, or distort, ecological balance.
- Populations initially respond to epidemics with fear and flight. Scapegoating and societal turmoil are common features. Disease disadvantages the poor, the weak, and those without immunity or prior exposure.
- Epidemics often travel side by side with warfare, which carries and transmits microbes and exposes vulnerable populations. Historically, epidemics have repeatedly played a role in determining the ultimate outcomes of warfare and conflict.
- Throughout history, scientific advances have both enabled epidemics (through travel, congregation, and entry into virgin territory) and provided the knowledge and tools to combat them.
- Domestication and sharing of animals have facilitated the introduction of microbes into populations vulnerable to epidemic disease.
- Disease, rather than aggression, has been the major factor in undermining native cultures and decimating native populations in the Americas.
- Slavery was largely a response to workforce demands created when epidemics eradicated the native populations who had been intended to serve as indentured labor on large agricultural plantations raising highly lucrative products for export to Old World markets.
- Epidemics often result in unintended consequences. For example, Yellow Fever and the defeat of the French in Saint-Domingue led to Napoleon’s divestment of the Louisiana Territory. Struggles to control and explain the Yellow Fever outbreak in Philadelphia in 1793 helped define the emergence of two very different branches of American Medicine over the next century.
- Scientists defining “germ theory” and social engineers leading the “sanitary movement” reinforced each other’s efforts to lessen urban centers’ vulnerability to epidemics.
- Immunization has a long history and has at times been controversial. As enlightened public policy, it has saved many lives. It can, as illustrated by the Eugenics Movement, create uncomfortable legal precedents and unintended consequences.
- The U.S. scientific community prematurely declared victory over communicable diseases.
- In the wake of HIV/AIDS, some scientific leaders actively warned of ongoing population-wide vulnerabilities beginning in 1992.
- Genetic reverse-engineering technologies enabling “gain-of-function” research led to consensus statements in 2014 warning of potentially disastrous consequences and epidemics that would be difficult to control.
- The U.S. health care system failed on a large scale in leadership, strategic operation, mitigation, and delivery of acute services when confronted with the Covid-19 pandemic.