Mike Magee MD
As we discussed in Session I of this three-part course, it is an impossible task to rank the top ten medical discoveries that have improved human health and welfare over the centuries. Taking on the challenge nonetheless means a journey filled with twists and turns.
In sharing this story of human development, we have already encountered artisans, dreamers, and determined spirits. Their discoveries have had few geographic bounds, crossing continents on their own schedules and traveling well, though not without bumps in the road. In most cases, the discoveries have served as their own reward, quite often unaccompanied by fame or fortune – at least not immediately, or even within a single lifetime.
Session I focused on Basic Science. In Session II, we explore the emergence of Public Health as a discipline and witness the remarkable gains in quality and quantity of life, spread unequally across the globe. In our third and final session, we’ll focus on Molecular Medicine – largely the realm of medical technology and its remarkable impact on discovery, innovation, and progress, which continues up to our present time. What all three sessions share is a persistent human struggle to see, to feel, to measure, and ultimately to understand.
Germs were a source of concern to humans long before the invention of tissue culture or the discovery of penicillin. Smallpox, for example, existed in antiquity, prominently immortalized in the scarred, mummified face of the Ancient Egyptian Pharaoh Ramses V. During major epidemics, the highly contagious affliction is thought to have claimed up to 10% of the existing human population. Its threat was significant enough to draw the attention of Lady Mary Wortley Montagu, wife of the British Ambassador to Constantinople (Istanbul). In 1716 she witnessed local Turkish sects inoculating groups of congregants with smallpox. She had her small son inoculated prior to their return to Britain. Once there, she arranged for her daughter to be inoculated as well, this time before the King’s private physician. Both children were safely protected from future contagion. And yet the practice met strong resistance.
The same was the case in the American colonies, where early colonists had blamed the scourge on Native Americans after 20 Mayflower settlers died of the infection. By 1721, the Massachusetts religious leader Cotton Mather was promoting what he called variolation, using smallpox scrapings from Onesimus, a man he enslaved. The epidemic then underway had already caused 800 deaths that year, with a 14% mortality rate; Mather’s experiment demonstrated the procedure’s positive effect, drawing mortality down to 2.5%. But it so incensed the locals that they stoned his house and posted the message “A Pox to You!” George Washington, by contrast, embraced the practice; his mandatory inoculation of the Continental Army in 1777 is viewed as contributing to his military victory over the non-inoculated British soldiers. Benjamin Franklin long regretted declining inoculation for his own 4-year-old son, who died of the infection in 1736.
A few years later, in 1796, the British physician Edward Jenner went one step further. He had noticed that the milkmaids on his farm developed cowpox pustules on their hands from contact with infected cow udders, yet their faces and bodies showed no lesions. He surmised that cowpox and smallpox were related, but that the former was less virulent in humans – and that exposure to cowpox might therefore build a natural resistance to the worse infection. He convinced his gardener to offer up his 8-year-old son for an experiment. First Jenner inoculated the boy with cowpox, to which he had a mild reaction. Later he inoculated him with smallpox, and the boy did not contract a virulent infection. This, and his willingness to inoculate his own 11-month-old child, earned Jenner the title “Father of Immunology.”
But progress was slow. When yellow fever broke out in Philadelphia in 1793, shortly after the arrival of a trading ship from Saint-Domingue, among colonists with no immunity the main response was panic, fear, and mass evacuation of the city. Roughly 5,000 citizens – some 10% of the population – died, while thousands more, including Alexander Hamilton and his wife, fled. Experts were at a loss to explain the cause, and even more confused about how to treat the disease.
Benjamin Rush was the primary leader of medicine in the young nation, where physicians were scarce and nearly all relied on Europe for their advanced education. In confronting the epidemic, Rush believed the problem involved a disturbance of the four humors – blood, phlegm, black bile, and yellow bile. His solution was the liberal, and barbaric, use of cathartics and bloodletting. For the most severe cases, Rush warehoused dying victims in the new nation’s first “fever hospital.”
But a century later, as historian Frank Snowden wrote, “Humoralism was in retreat as doctors absorbed ideas about the circulatory and nervous systems, and as the chemical revolution and the periodic table undermined Aristotelian notions of the four elements composing the cosmos…(ushering in) more change in the decades since the French Revolution than in all the centuries between the birth of Socrates and the seizure of the Bastille combined.”
During the 19th century, and especially in its second half, scientific progress was self-propagating. Much of the credit for this enlightened progress goes to the rapid evolution of “the germ theory.” No single genius was responsible. Rather, knowledge built stepwise through a collective (if not fully cooperative) effort by multiple scientists.
In 1847, a remarkable bit of insight came thanks to the dogged experimentation of a Hungarian obstetrician who put two and two together. Maternal mortality from “puerperal fever” was commonplace at Vienna General Hospital at the time. But Ignaz Philipp Semmelweis, a physician on the obstetrical service, noticed that the mortality rates of the hospital’s two services – one run by physicians, the other by midwives – differed strikingly (20% vs. 2%). Delivery practices were roughly the same, but one difference stuck out: the physicians were also in charge of mandatory autopsies, and often shuttled between the delivery suite and the morgue without a change of clothing or any cleansing whatsoever. Moreover, one of Semmelweis’s colleagues who cut himself during an autopsy died of an identical disease. So Semmelweis required that members of both teams wash their hands in a chlorine solution before entering the maternity area. Mortality rapidly dropped to 1.3% overall.
Others were making similar connections as well. Few stories are as well known as that of John Snow, the London public health pioneer who, in 1854, traced the deaths of some 500 victims of a cholera epidemic to a contaminated Broad Street water pump. Removing the pump handle ended the outbreak. As important as the investigative findings was his publication the following year, “On the Mode of Communication of Cholera,” because scientific progress relies heavily on the transfer of knowledge and sequential collaboration. That same year, an Italian physician, Filippo Pacini, first visualized the causative organism, Vibrio cholerae. But three more decades would pass before the German physician Robert Koch isolated the bacterium in pure culture.
In Session I, we were introduced to “The Famous Trio” credited by historian Frank Snowden with creating “a wholesale revolution in medical philosophy.” They were:
I. Louis Pasteur (1822-1895) was a French chemist (not a biologist or physician) who famously stated that “where observation is concerned, chance favors only the prepared mind.” In the 1850s, while investigating the spoilage of wine and milk, he developed the theory that putrefaction and fermentation were not “spontaneous” but the result of bacterial processes, and that these degrading actions could be halted by heat. Heat destroyed the bacteria and prevented spoilage (i.e., pasteurization). This insight launched the field of microbiology. Pasteur also helped define the principles of “nonrecurrence” (later called “acquired immunity”) and “attenuation,” a technique for rendering disease-causing microbes harmless so they could be introduced as vaccines.
II. Robert Koch (1843-1910) was a German physician 20 years younger than Pasteur. He investigated anthrax in the mid-1870s while serving as a district physician in Wollstein. Luckily the bacterium was very large and visible with the microscopes of the day. He was able to reproduce the disease in virgin animals by injecting blood from infected sheep. In addition, he detected and described spores formed by the bacterium, which were left behind in fields where infected animals had grazed. The spores, he declared, were why virgin animals grazing in those fields became infected.
Teaming up with the Carl Zeiss optical company, Koch focused on technology: improving magnification lenses, developing specialized specimen stains, and creating fixed culture media to grow microbes outside of animals. Armed with the new tools and stains, he discovered Mycobacterium tuberculosis, proved its presence in infected tissues, and described his findings in “The Etiology of Tuberculosis” in 1882.
“Koch’s Postulates” became the four accepted steps constituting scientific proof that a particular microbe causes a particular disease:
1) The microorganism must be found in infected tissue.
2) The organism must be grown in fixed culture.
3) The grown organism must induce infection when introduced into a healthy laboratory animal.
4) The infective organism must be re-isolated from the lab animal, and proven identical to the original microbes.
III. Joseph Lister (1827-1912), the third of our trio, was a professor of surgery at Edinburgh. At the time, the major complications of surgery were pain, blood loss, and deadly post-operative infections. In the 1840s, ether and nitrous oxide were introduced, controlling intra-operative pain. As for infection, Lister suggested scrubbing hands before surgery, sterilizing tools, and spraying carbolic acid into the air and onto the patient’s wound. Koch took an alternate approach, advocating sterile operating theaters and surgeons in gowns, gloves, and masks. The opposing techniques merged into common surgical practice in the 1890s.
Improved surgical instruments and techniques also began to show progress in controlling hemorrhage during surgery, which translated into better survival rates and outcomes. But none of this would have been possible without the discovery of anesthesia. Multiple experimenters were converging on the problem as the century approached its midpoint, but Georgia physician Crawford W. Long, who first used diethyl ether on March 30, 1842, is the name most choose to remember. Two years later nitrous oxide appeared on the scene, followed by chloroform in 1847. The word anesthesia comes from the Greek ἀναισθησία (anaisthēsíā), meaning “without sensation.”
Hand in hand with the evolution of germ theory, as we’ve seen, came a growing appreciation for cleanliness and enlightened social engineering. At the turn of the century, these movements couldn’t come soon enough. America was still recovering from the Civil War, in which two-thirds of the 620,000 military deaths were from disease – dysentery and diarrheal disease at the top of the list, followed by malaria, cholera, typhus, smallpox, typhoid, and others.
Filth, disease, and disorder ruled the day, especially in large urban centers like Chicago and New York. But opportunity waited in the wings. As historians described the challenge of those days: “New York City experienced a pivotal moment in its development following the historic 1898 consolidation, which united Manhattan, Brooklyn, Queens, The Bronx, and Staten Island into one comprehensive entity. By the year 1900, the city’s population had surged to 3,437,202 according to the U.S. Census.” Keeping those millions healthy would be a public health challenge of the first order.
Florence Nightingale had helped ignite the Sanitation movement a half century earlier, but there was still a great deal of work to be done. Order had always been part of her life.
In 1854, she and the 38 nurses in her charge arrived in the middle of the Crimean War at the Barrack Hospital at Scutari, in modern-day Istanbul, ill prepared for the disaster that awaited them. Cholera, dysentery, and frostbite – rather than battle wounds – were rampant in the cold, damp, and filthy halls.
During that first winter, 42% of her patients perished, leaving over 4,000 dead, the vast majority absent any battle wounds. Florence later described her work settings as “slaughter houses.” The enemy wasn’t bullets or bayonets but disease: typhus, cholera, and typhoid fever. Over 16,000 British soldiers died, 13,000 of them from disease.
Nightingale’s initial assessment was that warmer clothing and food would stem the tide. Going over the heads of the medical leadership, she did what she could, and leaked details of what she was observing to journalists who, for the first time, were stationed within a war zone. In the process she became a celebrity in her own right. As spring of 1855 approached, a first-ever Sanitary Commission was sent to the war zone, and Victoria and Albert themselves, with two of their children, visited the area.
The famed artist Jerry Barrett made hasty sketches of what would become The Mission of Mercy: Florence Nightingale, which hangs to this day in the National Portrait Gallery in London. During this same period, the first lines of Henry Wadsworth Longfellow’s poem Santa Filomena would take shape: “Lo! in that house of misery, A lady with a lamp I see, Pass through the glimmering gloom, And flit from room to room.” And the legend of Nightingale, the actual “Lady of the Lamp,” appeared on the front page of the Illustrated London News, complete with etched images. It read: “She is a ministering angel without any exaggeration in these hospitals, and as her slender form glides gently along each corridor, every poor fellow’s face softens with gratitude at the sight of her.”
What became clear to Florence and others was that infection and lack of sanitation were the culprits; corrective actions on the facilities themselves, along with the sanitary practices Nightingale led, dropped subsequent mortality rates to 2%. By the time the war drew to a halt in February 1856, 900,000 men had died. Florence Nightingale remained for four more months, arriving home without fanfare on July 15, 1856. Thanks to the first-ever war correspondents, she was now the second most famous woman in Britain, after Queen Victoria.
While she was away, her patrons had raised money from rich friends, and the Nightingale Fund now had £44,000 in reserve. These funds would help support a hospital training school and, in future years, her famous book, “Notes on Nursing.” Though her re-entry was quiet and reserved, she had plenty to say, and committed most of it to writing. While in Crimea, she had written to Britain’s top statistician, Dr. William Farr. He had replied, “Dear Miss Nightingale. I have read with much profit your admirable observations. It is like light shining in a dark place. You must – when you have completed your task – give some preliminary explanation for the sake of the ignorant reader.”
Shortly after her return, they met. Working closely with Farr, she documented in dramatic form the deadly toll in Crimea and tied it to disease and lack of sanitation in “Notes on Matters Affecting the Health, Efficiency, and Hospital Administration of the British Army,” which she self-published and aggressively distributed. Illustrated with polar-area “rose” diagrams divided into 12 sectors, one for each month, the report graphically tied improved sanitation to plummeting death rates. Understanding their long-term value, she carefully approved the paper, ink, and printing process that have allowed these images to remain vibrant a century and a half later. As she said later with some cynicism, they were designed “to affect thro’ the Eyes what we may fail to convey to the brains of the public through their word-proof ears.” In 1858, she became the first woman to be made a fellow of the Royal Statistical Society.
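For readers curious about how such a chart works, here is a minimal sketch of a Nightingale-style polar-area (“rose”) diagram in Python with matplotlib. The monthly figures are invented placeholders, not Nightingale’s actual Crimean data:

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder values: deaths per 1,000 troops for each of 12 months
# (invented for illustration -- NOT Nightingale's actual data).
months = ["Apr", "May", "Jun", "Jul", "Aug", "Sep",
          "Oct", "Nov", "Dec", "Jan", "Feb", "Mar"]
deaths = np.array([12, 18, 25, 60, 82, 95, 70, 55, 40, 30, 20, 10])

# One wedge per month, arranged around the circle.
theta = np.linspace(0.0, 2 * np.pi, len(months), endpoint=False)
width = 2 * np.pi / len(months)

ax = plt.subplot(projection="polar")
# Nightingale scaled each wedge's AREA to its value, so the radius
# is drawn proportional to the square root of the death rate.
ax.bar(theta, np.sqrt(deaths), width=width, bottom=0.0,
       color="steelblue", alpha=0.7, edgecolor="black")
ax.set_xticks(theta)
ax.set_xticklabels(months)
ax.set_yticklabels([])  # radial tick values are not meaningful here
ax.set_title("Polar-area 'rose' diagram, after Nightingale")
plt.show()
```

Scaling area rather than radius is the detail that made her wedges honest: a month with twice the deaths looks twice as large, not four times.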
In that first year after her return, she was described as “a one-woman pressure group and think tank” for whom “using statistics to understand how the world worked was to understand the mind of God.” In 1860, she published her “Notes on Nursing,” selling 15,000 copies in its first two months. It purposefully championed sanitation (“the proper use of fresh air, light, warmth, cleanliness, quiet, and the proper selection and administration of diet”) and promoted cleanliness as a path to godliness. It targeted “everywoman” while launching professional nursing.
But Florence Nightingale was not the originator of the Sanitary Movement in “Dirty Old London.” That honor goes to Edwin Chadwick, a barrister who, in 1842, published his “Report on the Sanitary Condition of the Labouring Population of Great Britain.” As historians and commentators have noted, there was a good case to be made for cleanliness. Here are a few of their remarks:
“The social conditions that Chadwick laid bare mapped perfectly onto the geography of epidemic disease…filth caused poverty, and not the reverse.”
“Filthy living conditions and ill health demoralized workers, leading them to seek solace and escape at the alehouse. There they spent their wages, neglected their families, abandoned church, and descended into lives of recklessness, improvidence, and vice.”
“The results were poverty and social tensions…Cleanliness, therefore, exercised a civilizing, even Christianizing, function.”
Chadwick was born on January 24, 1800. His mother died when he was an infant; his father was a liberal politician and educator, and his grandfather was a close confidant of the Methodist theologian John Wesley. Early in life, young Chadwick pursued a career in law and social reform. A skilled writer, he published one of his early essays in the Westminster Review under the title “Applied Science and its Place in Democracy.” By the time he was 32, he had focused all of his expertise on social engineering – especially public health, with an emphasis on sanitation.
But the Sanitary Movement required population-wide participation, structural change, new technology, and effective storytelling. By then, William Harvey’s 1628 description of the human circulatory system – complete with pump, outflow channels, and venous return channels – was widely understood. Seizing on the analogy, Chadwick, along with the city engineers of the day, imagined an underground highway of pipes running to and from every building and home, its branches connected to new sanitary devices.
One of those engineers was George Jennings, the owner of South Western Pottery in Dorset, a maker of water closets and sanitary pipes. He was something of a legend in his own time, and the recipient of the 1847 Medal of the Society of Arts, presented by none other than Prince Albert himself.
When Jennings patented the first flush toilet – a replacement for soil pots, hand-carried each morning and emptied into an outhouse if you were lucky, or into the streets if not – its future was anything but assured. But by good luck, the Great Exhibition, the premier display of futuristic visionaries, was scheduled for London in 1851. Its Crystal Palace was the showstopper, attracting a wide range of imagineers.
Jennings’s “Monkey Closet” was the hit of the show. His patent application followed the next year: “Patent dated 23 August 1852. JOSIAH GEORGE JENNINGS, of Great Charlotte-street, Blackfriars-road, brass founder. For improvements in water-closets, in traps and valves, and in pumps.”
Modernity had arrived. And none was more enthusiastic than Thomas Crapper, then a 15-year-old dreamer. Within three decades he held nine toilet patents, including one for the U-bend (an improvement on the S-bend), and in 1880 a Royal commission was granted to Thomas Crapper & Co. to install thirty toilets – enclosed and outfitted with cedar wood seats – in Sandringham House, the royal family’s newly purchased Norfolk country seat. His reputation lives on thanks to a diminutive term used with the greatest guttural emphasis by the Scots: CRAP. The company is also still in existence, now selling luxury models of the original designs.
Sanitary engineering, combined with Nightingale’s emphasis on fastidious housekeeping and hospital cleanliness enforced by nurses with religious zeal, would change the world. And it did. But those same changes would take decades to reach crowded immigrant entry points in locations like New York City.
One historian described it this way: “As the City ascended from a small seaport to an international city in the 1890s, it underwent severe growing pains. Filth, disease, and disorder ravaged the city to a degree that would horrify even the most jaded modern urban developer.”
One of the prime offenders was the noble work horse. By 1900, on the eve of the wholesale arrival of motor cars, there were roughly 200,000 horses in New York City, carrying and transporting humans and products of every size and shape, day and night, along the worn-down, narrow cobblestone roads and alleyways.
It was a hard life for the horses, whose lifespan averaged only 2½ years; they were literally “worked to death.” In the 1800s, as many as 15,000 dead horses were carted away in a single year. Often they were left to rot in the street because they were too heavy to transport. When they weren’t dying, the horses were producing manure – a startling 5 million pounds dumped on city streets each day.
As for human waste, sewer construction didn’t begin in New York until 1849, in response to a major cholera outbreak. Clean water had arrived seven years earlier with the Croton Aqueduct, which carried water south from Westchester County; it was augmented with rooftop water tanks beginning in 1880. By 1902, most of the city had sewer service, including the majority of the tenement houses. The Tenement House Act of 1901 required that each unit have at least one “water closet.”
As for the horses, the arrival of automobiles eliminated the “horse problem” almost overnight. Not so for cows – or, more specifically, the disease-laden “swill milk” cows. Suppliers north of the city struggled to keep up with demand in the late 1800s. To lower production costs, they fed their cows the cast-off “swill” of local alcohol distilleries. This led to infections and a range of diseases in the bargain-basement beverage, sold primarily to at-risk parents and consumed by children.
Swill milk was a chief culprit in soaring infant mortality in New York City between 1880 and 1900. Annually there were some 150,000 cases of diphtheria, resulting in 15,000 deaths a year. The German-Swiss scientist Edwin Klebs identified the causative bacterium, Corynebacterium diphtheriae, in 1883. A decade later, the German scientist Emil von Behring, dubbed the “Saviour of Children,” developed a diphtheria anti-toxin, for which he was awarded the first Nobel Prize in Medicine in 1901.
The casualties were primarily infants, whose mortality rate in NYC at the time was 240 deaths per 1,000 live births. Many of these deaths would be traced back to milk infected with TB, typhoid, Strep-induced scarlet fever, and diphtheria.
The process of heating liquid to purify it – pasteurization – was discovered by Louis Pasteur in 1856, not with milk but with wine. Its broad-scale use to purify milk first gained serious traction in 1878 after Harper’s Weekly published an exposé on “swill milk.” But major producers and distributors resisted regulation until 1913, when a massive typhoid epidemic from infected milk killed thousands of New York infants. Even so, diphtheria remained the most feared killer of infants.
As Paul de Kruif wrote in his 1926 book, Microbe Hunters: “The wards of the hospitals for sick children were melancholy with a forlorn wailing; there were gurgling coughs foretelling suffocation; on the sad rows of narrow beds were white pillows framing small faces blue with the strangling grip of an unknown hand.”
One such victim was the only child of two physicians, Abraham and Mary Putnam Jacobi, whose 7-year-old son, Ernst, was claimed by the disease in 1883. Working with the philanthropist Nathan Straus, the Jacobis established pasteurized milk stations in the city, which coincided with a 70% decline in infant mortality from diphtheria, tuberculosis, and a range of other infectious diseases. Heating milk to 146 degrees Fahrenheit for 30 minutes now became the rule, and vaccination of school children became law in most communities.
By 1902, the horse had reclaimed hero status as the source of diphtheria and tetanus anti-toxins. The bacteria were injected into horses and, after a number of passes, serum collected from the horse was laden with protective anti-toxins, relatively safe for human use. In 1901 alone, New York City purchased and delivered 25,000 doses of anti-toxin, funded by the Red Cross and the Metropolitan Life Insurance Company.
But progress didn’t come without sacrifice. As historians have noted, in 1901 thirteen children died of tetanus because their diphtheria anti-toxin had been contaminated by tetanus from the horses. The public outrage resulted in passage of the 1902 Biologics Control Act, which gave the government its first regulatory authority over vaccine and anti-toxin production. Of course, this came nearly two decades too late for Ernst Jacobi and his grieving parents.
Corporations and industrialization were front and center during this period. Chief among them was Bell Labs, situated at 24th Street on the West Side Highway overlooking the Hudson River. It would grow into the world’s largest industrial research laboratory, with over 300 highly skilled employees. This was the opportunity scientific engineers and entrepreneurs faced – enormous risk and enormous promise.
Of course, the “Roaring 20s” didn’t end well, with national and worldwide financial collapse arriving in 1929. And yet this set the stage for, arguably, the greatest outpouring of public health and welfare initiatives the US would ever support – FDR’s “New Deal.”
As scholars recount it, “The New Deal funded the construction of hospitals and improved water and sewer systems, while the Social Security Act allocated money for state and local health services, such as tuberculosis and venereal disease control, and maternal and child welfare. [These programs] professionalized public health and [set] federal standards, influencing the long-term structure of U.S. public health.”
The sheer range of efforts from 1933 to 1939 was hard to comprehend. They included the Works Progress Administration; the Social Security Act, signed on August 14, 1935; encouraging “Fireside Chats”; the Glass-Steagall Act of 1933 to regulate banks; infrastructure efforts crowned by the Tennessee Valley Authority; and food relief, housing, and a range of health services that began to focus the nation’s attention on holistic wellness.
The progress included the creation and expansion of blood banks. Three decades had passed since Karl Landsteiner, an Austrian emigrant, had discovered blood typing and its role in triggering deadly reactions to mismatched blood transfusions. In 1939, as a naturalized American, he added the Rh factor to blood typing. This knowledge was first put into practice at Cook County Hospital in Chicago by Dr. Bernard Fantus, another immigrant, from Budapest, who tackled the challenges of collection, matching, clotting, and storage of blood. When he described his plan to collect, save, and share blood from one donor to a future recipient, his daughter Ruth remarked, “It’s like a bank.” And the term “blood bank” was born.
This came just in time for World War II – but not without controversy. The first head of the Army’s blood bank program was Dr. Charles Drew, an African American who had attended medical school in Canada. A laboratory medicine expert, he was approached by the British in 1940 with an appeal to supply blood for their troops. He refined the technique for separating and preserving blood plasma and donated 5,500 units of plasma to British soldiers. The effort was so successful that the US Army made him its first blood bank director in 1941. But when the Army realized he was collecting blood from Black and White citizens alike, it outlawed the practice, fearing that racist soldiers would not tolerate receiving “black blood.”
Ultimately, the Army agreed to collect blood from Black donors but continued to segregate its collection, storage, and use until 1948. Blood segregation continued in Arkansas until 1969, and in Louisiana until 1972. Charles Drew resigned his post in protest in 1942. A year later, the New York City Red Cross Blood Bank was created. By then blood fractionation, with separation of albumin and plasma, was standard practice. In 1942, blood was still stored and transported in 400cc glass bottles, but in 1943 freeze-dried plasma became standard. Two decades later, blood was routinely warmed before transfusion, and a decade after that, preservation was extended to 42 days.
FDR’s impact on human health was enormous. Many historians feel his most important appointment in the public health arena was not a medical doctor but an engineer, Vannevar Bush, whose contributions earned him the cover of TIME magazine with the cover line, “Meet the man who may win or lose the war.” Overseeing thousands of scientists throughout the war, he stood up not only the Blood Bank and its collection of 13 million pints of blood, but also a range of advances from penicillin to standardized mass vaccination programs. As for physics, his team is credited with the first mainframe computer (critical to decoding German messages), new radar systems, and ultimately the A-bomb that ended the war.
The war did bring major advances in public health, but it also led to enormous increases in chronic disease and forced the expansion of hospitals and the health professions to meet demand. On average, veterans had a lifespan 11 years shorter than non-veterans, were 5% more likely to be disabled, had twice the rate of lung cancer, were far more likely to suffer from alcohol and drug abuse, and collectively included over 1 million psychiatric casualties, many living permanently in overflowing psychiatric hospitals.
The country responded to meet the acute needs. This included a massive hospital building program with a target of 4.5 hospital beds per 1,000 citizens, and the expansion of employer-based health insurance to 70% of the population. Medical schools expanded their enrollments by 500%. The National Institutes of Health was launched, ultimately growing to 27 institutes and centers backed by massive research funding. And the pharmaceutical industry expanded its research funding six-fold.
During this period, citizens took on a new role as active consumers. Despite growing post-war evidence that smoking played a role in lung cancer and heart disease, many physicians continued to smoke cigarettes, as 80% of Army doctors had during the war. Ultimately it was an investigative reporter, Roy Norr, not doctors, who turned the tide with his 1952 Reader’s Digest exposé provocatively titled “Cancer by the Carton.” A decade later, fighting an organized Tobacco Industry push-back, Surgeon General Luther Terry announced that “cigarettes peril health” and promised remedial action. But real success awaited the appearance of another Surgeon General, C. Everett Koop, who began wrestling the industry to the ground in 1983.
It was during these years of the Reagan Administration that it became crystal clear that much of the American disease burden was a derivative of our own culture – one that tolerated prejudice, violence, guns, and poverty.
On June 5, 1981, the CDC’s signature weekly publication, the Morbidity and Mortality Weekly Report, tried not to alarm the public with this simple page 2 report:
“Pneumocystis pneumonia in the United States is almost exclusively limited to severely immuno-suppressed patients. The occurrence of Pneumocystosis in these 5 previously healthy individuals without a clinically apparent underlying immunodeficiency is unusual. The fact that these patients were all homosexuals suggests an association between some aspect of a homosexual lifestyle or disease acquired through sexual contact and Pneumocystis pneumonia in this population.”
As health experts would later recount, “When HIV arrived in the early 1980s, it proved every stereotype about the manageability of infectious diseases false.” Bias was everywhere in President Reagan’s Administration once the scale of the epidemic became obvious. The evangelical Rev. Jerry Falwell termed HIV/AIDS “the wrath of God upon homosexuals.” Domestic Policy Advisor Gary Bauer declared, “Anybody who has AIDS ought to die with it.” And with William F. Buckley and Pat Buchanan competing for air time, it got worse from there.
By May 31, 1987, an estimated 1.2 million Americans were living with the then uniformly fatal infection, and more than 20,000 US citizens had already died. That was the day Ronald Reagan finally lent his support to a public health campaign against HIV/AIDS, stepping forward at the behest of his wife Nancy, who had been actively lobbied by their famous friend, Elizabeth Taylor.
At the time, a much younger celebrity, 16-year-old Ryan White, had become a poster child for tolerance. Three years earlier, he had contracted HIV from a transfusion treatment for his hemophilia. In response, a group of parents, children, and teachers had lobbied the school board to bar Ryan’s attendance at school. Decency and science ultimately prevailed, and Ryan returned to classes. One of his vocal supporters was the devout Christian Surgeon General C. Everett Koop. With Koop’s active support, Congress passed the Ryan White Program, which provided funding and care for children with HIV/AIDS.
Dr. Koop was determined to expand basic knowledge of strategies to aid the prevention of HIV/AIDS, which required discussing homosexuality and sexual behavior – something strongly opposed within the Reagan Administration’s conservative leadership. Conspiring with Tony Fauci (who would have his own day in the spotlight three decades later, attempting to steer Donald Trump toward a thoughtful public health response to Covid), Koop went over his superiors’ heads, secretly printed 107 million pamphlets, had them delivered to a national mailhouse in 38 boxcars, and mailed them in bulk on May 26, 1988. The ultra-conservative Senator Jesse Helms (R-NC), who had been Koop’s main supporter in a difficult confirmation battle, couldn’t believe his eyes when the pamphlet arrived in his home mailbox. Titled “Understanding AIDS,” it didn’t mince words. In the fallout, Koop appeared on Pat Robertson’s 700 Club Christian television network to respond to his critics. To their faces, he said, “I’m the nation’s doctor, not the nation’s chaplain.”
Two years later, 18-year-old Ryan White died of his disease. Koop, Fauci, and leaders of the HIV/AIDS community then joined hands to expedite the trials and release of a wide range of new therapies. These did not cure the disease, but for most patients they held it in check. Yet even as progress in infectious disease therapies was exploding, the nation’s foremost scientists at the Institute of Medicine (IOM) were warning that infectious organisms “outnumber us by a billion fold, and mutate a billion times more quickly.”
On May 31, 1996, the epidemiology expert Dr. Michael Osterholm put the lack of public health preparedness this way: “I am here to bring you the sobering and unfortunate news that our ability to detect and monitor infectious disease threats to health in this country is in serious jeopardy…For 12 of the States or territories, there is no one who is responsible for food or water-borne surveillance. You could sink the Titanic in their back yard and they would not know they had water.”
Molecular Medicine experimenters soon went head-to-head with public health professionals who saw great danger in risky live experiments. In 1997, a scientific expedition to Brevig Mission, Alaska, successfully collected permafrost-preserved tissue from victims of the 1918 flu epidemic. Researchers were ultimately able not only to describe the virus’s genome, but to reawaken a live 1918 flu virus.
A group of scientific ethicists, known as the 2014 Cambridge Working Group, reported on this type of live “Gain-of-Function” research at the time: “Accident risks with newly created ‘potential pandemic pathogens’ raise grave new concerns. Laboratory creation of highly transmissible, novel strains of dangerous viruses…poses substantially increased risks. An accidental infection in such a setting could trigger outbreaks that would be difficult or impossible to control.”
One year later, in Dr. Ralph Baric’s virology lab at the University of North Carolina, his young fellow, Shi Zhengli, was close to mastering the technique for creating “chimeric viruses.” She would soon return to her own lab in Wuhan, China. She was famous in her own right as a worldwide expert in bat viruses, which she had collected and grown during field tours of major caves in China. US funding for this type of “gain-of-function” research had been outlawed for the time being. But the Pentagon and Department of Defense favored the creation of brand new, super-virulent viruses to allow research on how to control such organisms if they were ever released by enemies as biologic weapons. The solution was a work-around non-profit called the EcoHealth Alliance, headed by an aggressive biomedical entrepreneur, Peter Daszak. In 2018, Shi headed back to her lab with $100 million in US funding and assurances that her new lab buildings in Wuhan were extra safe and secure.
The rest is history. Though not conclusively proven or universally accepted, the growing consensus is that the virulent Covid virus was constructed in, and accidentally released from, Shi Zhengli’s Wuhan lab in late 2019. Over the next two to three years, nearly 6 million people died worldwide, including over a million Americans. At the same time, President Trump sowed confusion everywhere he went, undermining his own public health leaders with vast quantities of disinformation.
As a result of the Covid pandemic, the public was introduced to revolutionary vaccines created using messenger RNA (mRNA) technology. We will hear more about that in Session III. The public was also schooled in the behavior of infectious diseases – why they spread and what can be done to control them. Emphasizing the importance of measurement (as we witnessed in Session I), the numbers we watch reflect three functions:
1) Receptivity – a function of immunity status, population density, and frailty.
2) Transmissibility – defined by R0, the number of people a single infected person will, on average, go on to infect.
3) Virulence – the confirmed fatality rate, the percentage of confirmed cases that prove fatal.
If we compare standard flu, Covid-19, and measles, their respective R0 values are 1.5, 2.5, and 16, while their confirmed fatality rates (CFRs) are 0.1%, 2.1%, and 1.3%. Receptivity matters as well: in the Texas measles epidemic, centered on unvaccinated and frail individuals, the CFR ran as high as 5% in a frail, hospitalized population. Similarly, Covid-19 proved especially fatal in nursing home populations and among those with preexisting respiratory disease. The controversial use of masking and population isolation aimed at reducing transmissibility and shielding the most vulnerable.
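To make those numbers concrete, here is a minimal sketch in Python using the R0 and CFR figures quoted above. The generation-by-generation growth model (each case infects R0 others, with no immunity, interventions, or overlap) is a deliberately naive illustrative assumption, not a real epidemic model:

```python
# R0 and CFR figures as quoted in the text above.
diseases = {
    # name: (R0, CFR expressed as a fraction)
    "Seasonal flu": (1.5, 0.001),
    "Covid-19":     (2.5, 0.021),
    "Measles":      (16.0, 0.013),
}

GENERATIONS = 5  # rounds of person-to-person spread to simulate

for name, (r0, cfr) in diseases.items():
    # Naive total after N generations, starting from one case:
    # 1 + R0 + R0^2 + ... + R0^N (a geometric series).
    cases = sum(r0 ** g for g in range(GENERATIONS + 1))
    deaths = cases * cfr
    print(f"{name:12s}  R0={r0:5.1f}  CFR={cfr:.1%}  "
          f"~{cases:>12,.0f} cases  ~{deaths:>10,.1f} deaths")
```

Run with these inputs, the sketch shows why R0 dominates: after five generations the flu chain yields a few dozen cases, Covid-19 a few hundred, and measles over a million, because transmissibility compounds exponentially while virulence scales the toll only linearly.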
The mobility of human populations in a modern world proved challenging. High-speed travel, global-warming-driven disasters and migration, conflict, weak scientific oversight and public health infrastructure, and poor access to health care all contributed to lives lost. In the battle between human and microbe, social, political, philosophical, medical, and ecological forces all played a role. What we have learned is that wise governance is essential for the health and wellness of populations.
In our final session, we will explore the interface of science, technology and human progress – a complex story of scientific progress, unexpected twists and turns, ethical dilemmas, and serious concerns over whether humans can maintain control over the technologic miracles they create.