Mike Magee MD

It is an impossible task to rank the top ten medical discoveries that have improved human health and welfare over the centuries. Nonetheless, I will provide a countdown list at the end of this presentation.

Taking on this challenge has been filled with twists and turns. As always, I choose to begin by providing some context for what lies ahead. In sharing this story of human development, you will meet a range of discoverers. They are artisans and dreamers and determined spirits. The discoveries they have made have no geographic bounds, crossing continents on their own schedules and traveling well, but not without bumps in the road. In most cases, the discoveries have served as their own reward, and quite often have not been accompanied by fame and fortune, at least not immediately, or even within a single lifetime. Most of the discoveries have built sequentially, one on another, with refinements and new insights providing course corrections. Those that have survived are timeless truths, unless overtaken by future insights and discoveries.

One of the original challenges has been how exactly to organize the material. In general, these medical discoveries – many more than ten – fall into one of three categories: Basic Science, Public Health, or Medical Technology. What all share in common is a persistent human struggle to see, to feel, to measure, and ultimately to understand. This urge to make sense of the world at times crosses over into the realm of science fiction – imaginary journeys into the land of “What if?”

In the world of comics, science and science fiction confidently blend and blur. In ranking the most powerful superheroes of all time, comic book aficionado Maverick Heart (aka aeromaxxx 777) listed “Flash” as the clear winner, stating “Not only does he have super-speed, but once he reaches terminal velocity, he has shown other incredible powers. During an attempt to measure his top speed, he strained every muscle in his body to run at about 10 Roemers, which is 10 times the speed of light.”

“Roemers?” That’s a reference to Ole Roemer (1644-1710), a Danish uber-scientist, whose seminal discovery of the “speed of light” was celebrated on its 340th anniversary in 2016 with a Google doodle. Details aside (the timing of the Earth’s orbit around our sun was measured against the orbit of Jupiter’s moon Io around that distant planet), Roemer detected a discrepancy in the measurements of the eclipses which amounted to 11 minutes. He attributed the lag to a speed of light he calculated to be 140,000 miles per second, not infinite as was commonly thought at the time.

Roemer is a reasonable starting point for our story because he was a mentor of choice for scientists of his day, and one of them was a 22-year-old dreamer on the run named Daniel Gabriel Fahrenheit (1686-1736). The road that led to that meeting, however, was bumpy. At age 15, Fahrenheit lost both his parents to an accidental mushroom poisoning. His guardian then arranged a four-year merchant trade apprenticeship in Amsterdam. But when he completed the program at the age of 20 in 1706, he escaped an agreed-upon further commitment to the Dutch East India Company, and his guardian sought an arrest warrant.

When he arrived at the doorstep of Ole Roemer, by then the mayor of Copenhagen, two years later, he was seeking business guidance and a pardon from further legal action. He achieved both. Roemer explained that there was currently intense interest in high-quality instruments that could measure temperature. For nearly a century, everyone from Galileo to Huygens to Halley had been working on it. He himself had invented one in 1676 while convalescing from a broken leg, but there was great room for technical improvement.

The challenges were threefold – physical construction, the creation of a standard measurement scale, and reproducible accuracy. Fahrenheit took up the challenge, spending the next four years refining his glassblowing skills, discovering that mercury was a more reliable reference liquid than alcohol, and realizing he could improve on Roemer’s scale – which he did, renaming it the Fahrenheit scale in a 1717 publication in the Acta Eruditorum.

The famous scale was pegged to three reference points. The first was the point at which a mixture of ice, water, and salt reached equilibrium, which he identified as 0 degrees. The second was the temperature at which ice was just beginning to form on still water. This would be 32 degrees. And the final measure was the temperature when the thermometer was placed under the arm or in the mouth. This became 96 degrees. The span between 0 and 96 allowed Fahrenheit to create a dozen divisions, each subdivided into 8 parts (12 x 8 = 96).

A quarter-century later, in 1742, another scientist named Anders Celsius arrived on the scene with a new scale. It would be a slow burn, taking approximately two centuries to officially supersede the Fahrenheit scale everywhere in the world except (not surprisingly) the United States. Built on a scale of 0 to 100, the Celsius scale is also called the centigrade scale.

American scientific hubris has forced America’s math students to memorize conversion formulas and engage in what management guru Tom Peters would call unnecessary “non-real work.”

A down and dirty version: (F - 30) / 2 ≈ C.

Or, more accurately: (F - 32) / 1.8 = C.
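
For readers who want to see the arithmetic spelled out, here is a minimal sketch in Python (the function names are mine, purely for illustration) comparing the exact conversion with the mental-math shortcut:

    def f_to_c_exact(f):
        """Exact Fahrenheit-to-Celsius conversion."""
        return (f - 32) / 1.8

    def f_to_c_quick(f):
        """Mental-math shortcut: subtract 30, then halve."""
        return (f - 30) / 2

    # Normal body temperature, 98.6 F, converts to exactly 37 C;
    # the shortcut lands nearby at 34.3 C.
    print(f_to_c_exact(98.6), f_to_c_quick(98.6))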

What this opening tale illustrates, however, is that scientific discoverers have determined spirits, are often multi-talented, tactile problem solvers, and dream big. Next case in point: William Harvey (1578-1657). Without modern tools, he deduced by inference rather than direct observation that blood was pumped by a four-chambered heart through a “double circulation system,” directed first to the lungs and back via a “closed system” and then out again to the brain and bodily organs. In 1628, he published all of the above in an epic volume, De Motu Cordis.

Far fewer know much about Stephen Hales, who in 1733, at the age of 56, is credited with discovering the concept of “blood pressure.” A century later, the German physiologist Johannes Müller boldly proclaimed that Hales’ “discovery of the blood pressure was more important than the (Harvey) discovery of blood.”

Modern-day cardiologists seem to agree. Back in 2014, the Joint National Committee on Prevention, Detection, Evaluation, and Treatment of High Blood Pressure reported that “With every 20 mm Hg increase in systolic or 10 mm Hg increase in diastolic blood pressure, there is a doubling risk of mortality from both ischemic heart disease and stroke.”

But comparisons are toxic. There is no need to diminish Harvey, who correctly estimated human blood volume (10 pints, or roughly 5 liters), the number of heart contractions, the amount of blood ejected with each beat, and the fact that blood was continuously recirculated – and did all of this 400 years ago. But how to measure that function, and connect those measurements to a clinically significant condition like hypertension, is a remarkable tale that spanned two centuries and required international scientific cooperation.

Harvey was born in 1578 and died in 1657, twenty years before the birth of his fellow Englishman, Stephen Hales. Hales was a clergyman whose obsessive and intrusive fascination with probing the natural sciences drew sarcasm and criticism from the likes of the classical scholar and sometime friend Thomas Twining, who penned a memorable, insult-laced poem in Hales’ honor, set in the verdant English village of Teddington and titled “The Boat of Hales.”

“Green Teddington’s serene retreat
For Philosophic studies meet,
Where the good Pastor Stephen Hales
Weighed moisture in a pair of scales,
To lingering death put Mares and Dogs,
And stripped the Skins from living Frogs,
Nature, he loved, her Works intent
To search or sometimes to torment.”

The torment line was well justified in light of Hales’ own 1733 account of the historic first-ever measurement of arterial blood pressure, self-described here:

“In December I caused a mare to be tied down alive on her back; she was fourteen hands high, and about fourteen years of age; had a fistula of her withers, was neither very lean nor yet lusty; having laid open the left crural artery about three inches from her belly, I inserted into it a brass pipe whose bore was one sixth of an inch in diameter … I fixed a glass tube of nearly the same diameter which was nine feet in length: then untying the ligature of the artery, the blood rose in the tube 8 feet 3 inches perpendicular above the level of the left ventricle of the heart; … when it was at its full height it would rise and fall at and after each pulse 2, 3, or 4 inches.”
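
To translate Hales’ column of blood into the millimeters of mercury we use today, a back-of-the-envelope conversion helps. The densities below are my own rough assumptions (blood at about 1.05 g/cm³ versus 13.6 g/cm³ for mercury), not figures from Hales:

    # Hales watched the mare's blood rise 8 feet 3 inches in the glass tube.
    height_cm = (8 * 12 + 3) * 2.54              # 8 ft 3 in is roughly 251 cm

    # Assumed densities in g/cm^3: blood is slightly denser than water,
    # mercury about 13.6 times denser.
    blood_density, mercury_density = 1.05, 13.6

    # An equivalent mercury column is shorter in proportion to the density ratio.
    height_mm_hg = height_cm * 10 * blood_density / mercury_density
    print(round(height_mm_hg))                   # on the order of 190 mm Hg

In other words, the tied-down mare’s arterial pressure was on the order of 190 mm Hg, expressed in the same units a modern blood pressure cuff reports.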

Two centuries later, Hales did have competition for “king of torment.” By then scientists had long been aware of the anatomy of the human heart, but it wasn’t until 1929 that they were actually able to see it in action. That was when a 24-year-old German medical intern named Werner Forssmann came up with the idea of threading a ureteral catheter through a vein in his own arm to his heart.

His superiors refused permission for the experiment. But with junior accomplices, including an enamored nurse and a radiologist in training, he secretly catheterized his own heart by opening an incision in his cubital vein and feeding the sterile catheter (intended for a ureter, not a vessel) up into the heart. He then went, bandaged, to the X-ray suite and injected dye into the vein, revealing for the first time a live, beating four-chambered heart. Werner Forssmann’s “reckless action” was eventually rewarded with the 1956 Nobel Prize in Medicine. Another two years would pass before the dynamic Mason Sones, the Cleveland Clinic’s director of cardiovascular disease, successfully (if inadvertently) imaged the coronary arteries themselves without inducing a heart attack in his 26-year-old patient with rheumatic heart disease.

Artistry and bravado stand side by side in many of medicine’s most notable discoveries. But the route toward discovery is often surprising. Take the case of Antonie van Leeuwenhoek. He was a 22-year-old Dutch shop manager in 1654, selling linen, yarn, and ribbon to seamstresses and tailors. His hobby was toying with cut glass lenses, and he constructed a small hand-held magnifying metal platform able to enlarge objects 275 times their normal size. This was useful to him because it allowed him to assess the stitch quality of the linens he was purchasing for resale.

Around the same time, Britain’s Robert Hooke peered through his primitive microscope at a slice of cork and described the little boxes he saw as “cellular” – resembling the tiny rooms (cellula) that monks inhabited. When he published that impression in Micrographia in 1665, the label “cell” was born.

Leeuwenhoek, ever curious, also investigated natural materials like ash tree sprigs as well as pond water. In 1676, at the age of 44, he was surprised to see microscopic swimming creatures in the water, which he called “animalcules.”

Over the years that followed, the design of microscopes evolved and all manner of natural material was described. But 1838 was a turning point. That was the year that the German botanist Matthias Schleiden teamed up with the zoologist Theodor Schwann. They had noted that both animal and plant species were constructed of cells. In reflecting on cellular life, they declared three principles – the first two proved correct, the third an error.

The principles were: 

  1. A single cell is the basic unit of every organism.
  2. All organisms are made up of one or more cells.
  3. New cells arise spontaneously.

It wasn’t until seventeen years later that the famed German physician Rudolf Virchow corrected #3. He was known by his students for encouraging them to embrace “attention to detail” by “thinking microscopically.” But Virchow is remembered primarily for a famous phrase – “omnis cellula e cellula” (every cell stems from another cell). That may not sound too radical today. But back in 1855, it was revolutionary.

This principle endured and led Virchow to more than one breakthrough discovery. With this insight, Virchow launched the field of cellular pathology. How exactly cells manage to divide and create identical copies of themselves remained to be determined. But he did figure out, before nearly all other scientists, that diseases must involve distortions or changes in cells. From this he deduced that diagnosis, and ultimately treatment, could now be guided not simply by symptoms or external findings, but also by cellular pathologic changes. And this was more than theory. In fact, Virchow is credited with first describing the microscopic findings of leukemia way back in 1847.

Virchow is also remembered as the preeminent medical educator of his day, one who changed the course of countless American medical leaders who traveled to Germany for instruction. His most notable mark was on the newly launched Johns Hopkins School of Medicine in Baltimore. It was founded in 1893 by the so-called “Big Four”: William Welch (pathology/public health), William Osler (internal medicine), William Stewart Halsted (surgery), and Howard Atwood Kelly (obstetrics). Welch served as the first dean of the school, and Osler, born and bred in Canada, arrived at Johns Hopkins at the age of 40 and birthed the first residency training program at the school. Prior to their arrival in Baltimore, the two shared a common medical origin story: both had been trained in pathology and cell theory by Virchow.

These scientists were not starting from scratch. The first description of a cell nucleus was made by a Scottish botanist, Robert Brown, in 1833. But over the next half-century, cell scientists busily described various cell organelles without a clear understanding of what they did. What was clear with light microscopy was that cells were bounded by a cellular membrane.

Prior to the launching of Johns Hopkins, most of the attention in the second half of the 19th century was on the nucleus, its division, and cell replication. In 1874, the German biologist Walther Flemming first described mitosis in detail. But it wasn’t until 1924 that the German biologist Robert Feulgen, “following experiments using basic stains such as haematoxylin, defined chromatin as ‘the substance in the cell nucleus which takes up color during nuclear staining’”. To this day, the Feulgen reaction “still exerts an important influence in current histochemical studies.”

Watson and Crick’s description of the DNA double helix was still far in the distance. But in the meantime, other cell organelles were visualized and named, like the Golgi apparatus, named in 1898 after the Italian biologist Camillo Golgi, who used heavy metal stains (silver nitrate or osmium tetroxide) to aid visualization. Mitochondria, like the Golgi apparatus, stretched the limits of light microscopic visualization. But even without visualization, scientists by the 1930s were beginning to deduce the functions of organelles they could barely see, and a few (like lysosomes) that they had never seen but knew had to be there.

Two discoveries in the first half of the twentieth century served as accelerants in intracellular science. The first was the electron microscope. Its invention in 1931 is generally credited to two German researchers, Max Knoll and Ernst Ruska, who used two magnetic lenses to focus a beam of electrons and achieve much higher magnifications. For this work, Ruska shared the 1986 Nobel Prize in Physics. Breakthroughs began to roll out almost immediately. High-resolution pictures of mitochondria appeared in 1952, followed by the Golgi apparatus in 1954. The inner workings of the cells displayed movement of vesicles across the membrane and from nucleus to cytoplasm, with structures constantly being constructed and deconstructed. And visualization only got better in 1965, when the first scanning electron microscope went commercial.

The second accelerant was the discovery of penicillin, which allowed in vitro tissue cultures to survive and thrive. Attempts to grow living cells in Petri dishes date back more than a century. One critical step along the way occurred in 1882, when a British physician named Sydney Ringer came up with the formula for Ringer’s solution (a mixture of sodium chloride, potassium chloride, calcium chloride, and sodium bicarbonate) while experimenting on how to keep a frog heart, removed from the frog, beating while suspended in solution.

Three years later, a budding German embryologist, Wilhelm Roux, who was fixated on “showing Darwinian processes on a cellular level,” was able to keep cells he had extracted from a chicken embryo alive for 13 days. With that, the discipline called cell culture or tissue culture was off and running. But the effective use of the technique of tissue culture to preserve, display and grow living cells over the next few decades was frustrated by the nagging issue of bacterial contamination.

The discovery of penicillin by Alexander Fleming in 1928 solved that problem, but the tissue culture work it enabled also introduced ethical concerns, from the future of stem cell lines to the lack of recognition or compensation for Henrietta Lacks, who, on her death from cervical cancer, became the uninformed donor of the still-existing HeLa experimental cell line. Naturally or scientifically mutated cells can grow and continue to divide indefinitely, creating an immortalized cell line. On this rock was also built the promise of In Vitro Fertilization (IVF), as well as the genetic manipulation of human embryos, with all of its ethical minefields.

Of course, germs had been a source of concern to humans long before the invention of tissue cultures or the discovery of penicillin. Smallpox, for example, was prominently immortalized in the scarred, mummified face of the ancient Egyptian pharaoh Ramses V. During prominent epidemics, the highly contagious affliction is thought to have claimed up to 10% of the existing human population. Its threat was significant enough to draw the attention of Lady Mary Wortley Montagu, wife of the British ambassador to Constantinople (Istanbul). In 1716 she witnessed local Turkish sects inoculating groups of congregants with smallpox. She had her small son inoculated prior to their return to Britain. Once there, she arranged for her daughter to also be inoculated, this time before the King’s private physician. Both children were safely protected from future contagion. And yet, there was strong resistance to the practice.

The same was the case in the American colonies. Benjamin Franklin declined smallpox inoculation for his 4-year-old son, who subsequently died from the infection in 1736. The early colonists had blamed the scourge on Native Americans after 20 Mayflower settlers died of the infection. By 1721, the Massachusetts religious leader Cotton Mather was promoting what he called variolation, using smallpox scrapings from his personal slave, Onesimus. At the time there were already 800 deaths that year from the ongoing epidemic, and a 14% mortality rate. Mather’s experiment showed the positive effect of the procedure, drawing mortality down to 2.5%. But it so incensed the locals that they stoned his house and posted the famous message “A Pox to You!” George Washington, however, embraced the practice. His mandatory inoculation of the Continental Army in 1777 is viewed as contributing to his military victory over the non-inoculated British soldiers.

Nearly two decades later, in 1796, the British physician Edward Jenner went one step further. He had noticed that the milkmaids on his farm had developed cowpox pustules on their hands from contact with infected cow udders, yet their faces and bodies showed no lesions. He surmised that cowpox and smallpox were related, but that the former was less virulent to humans, and reasoned that exposure to cowpox might build a natural resistance to the worse infection. So he convinced his gardener to offer up his 8-year-old son for an experiment. First he inoculated the boy with cowpox, to which he had a mild reaction. Later he inoculated him with smallpox, and the boy did not contract a virulent infection. This, and his willingness to inoculate his own 11-month-old child, earned him the title of “Father of Immunology.”

But progress was slow. When yellow fever broke out in Philadelphia in 1793, shortly after the arrival of a trading ship from Saint-Domingue, among colonists with no immunity the main response was panic, fear, and mass evacuation from the city. Five thousand citizens, roughly 10% of the population, including Alexander Hamilton and his wife, fled. Experts were at a loss to explain the cause, and were even more confused about how to treat the disease.

Benjamin Rush was the primary leader of medicine in the young nation, where physicians were scarce and nearly all relied on Europe for their advanced education. In confronting the epidemic, Rush believed the problem involved a disturbance of the four humors – blood, phlegm, black bile, and yellow bile. His solution was the rather liberal and barbaric use of cathartics and bloodletting. For the most severe cases, Rush warehoused dying victims in the nation’s first “fever hospital.”

But a century later, as historian Frank Snowden wrote, “Humoralism was in retreat as doctors absorbed ideas about the circulatory and nervous systems, and as the chemical revolution and the periodic table undermined Aristotelian notions of the four elements composing the cosmos…(ushering in) more change in the decades since the French Revolution than in all the centuries between the birth of Socrates and the seizure of the Bastille combined.”

During the 19th century, and especially in its second half, scientific progress was self-propagating. Much of the credit for this enlightened progress goes to the rapid evolution of “the germ theory.” No single genius was responsible. Rather, knowledge built stepwise and involved a collective (if not fully cooperative) effort by multiple scientists.

In 1847, a remarkable bit of insight came thanks to the dogged experimentation of a Hungarian gynecologist who put two and two together. Maternal mortality from “puerperal fever” was commonplace at Vienna General Hospital at the time. But Ignaz Philipp Semmelweis, the physician director of the service, noticed that mortality rates between his two services – one run by physicians and the other by midwives – were strikingly different (20% vs. 2%). Delivery practices were roughly the same. But one difference stuck out: the physicians were also in charge of mandatory autopsies, and often shuttled between the delivery suite and the morgue without a change in clothing or any cleansing whatsoever. One of Semmelweis’s trainees who cut himself during an autopsy also died of an identical disease. So he decided to require that members of both teams wash their hands in a chlorine solution before entering the maternity area. Mortality rapidly dropped to 1.3% overall.

Others were making similar connections as well. Few stories are as well known as that of John Snow, the London public health pioneer, who in 1854 traced the deaths of 500 victims during a cholera epidemic to a contaminated Broad Street water pump. Removing the pump handle was curative. As important as the investigative findings was his publication the following year, “On the Mode of Communication of Cholera,” because scientific progress relies heavily on the transfer of knowledge and sequential collaboration. In 1854, an Italian physician, Filippo Pacini, first visualized the causative organism, Vibrio cholerae. But three more decades would pass before the German physician Robert Koch would isolate the bacterium in pure culture.

Koch was one of “The Famous Trio,” credited by historian Frank Snowden with creating “a wholesale revolution in medical philosophy.” In addition to Koch, the trio included the French scientist Louis Pasteur and the British surgeon Joseph Lister.

I. Louis Pasteur (1822-1895) was a French chemist (not a biologist or physician) who famously stated that “Where observation is concerned, chance favors only the prepared mind.” In the 1850s, while investigating spoilage of wine and milk, he developed the theory that putrefaction and fermentation were not “spontaneous,” but the result of microbial processes, and that these degrading actions could be altered by heat. Heat destroyed the bacteria and prevented spoilage (i.e., pasteurization). This insight launched the field of “microbiology.” Pasteur also helped define the principles of “nonrecurrence” (later called “acquired immunity”) and “attenuation,” a technique to render disease-causing microbes harmless so that they could be introduced as vaccines.

II. Robert Koch (1843-1910) was a German physician 20 years younger than Pasteur. He investigated anthrax in the mid-1870s at the University of Göttingen. Luckily the bacterium was very large and visible with the microscopes of the day. He was able to reproduce the disease in previously unexposed animals by injecting blood from infected sheep. In addition, he detected and described spores formed by the bacterium, which were left in fields where infected animals had grazed. The spores, he declared, were why uninfected animals grazing in these fields became infected.

Teaming up with the Carl Zeiss optical company, Koch focused on technology, improving magnification lenses, developing specialized specimen stains, and developing fixed culture media to grow microbes outside of animals. Armed with new tools and stains, he discovered Mycobacterium tuberculosis, proved its presence in infected tissues, and described his findings in “The Etiology of Tuberculosis” in 1882.

“Koch’s Postulates” became the four accepted steps constituting scientific proof that a particular microbe causes a particular disease:

1) The microorganism must be found in infected tissue.

2) The organism must be grown in fixed culture.

3) The grown organism must instigate infection in a healthy laboratory animal.

4) The infective organism must be re-isolated from the lab animal, and proven identical to the original microbes.

III. Joseph Lister (1827-1912), the third of our trio, was a professor of surgery at Edinburgh. At the time, the major complications of surgery were pain, blood loss, and deadly post-operative infections. In the 1840s, ether and nitrous oxide were introduced, controlling intra-operative pain. As for infection, Lister suggested scrubbing hands before surgery, sterilizing tools, and spraying carbolic acid into the air and onto the patient’s wound. Koch took an alternate approach, advocating sterile operating theaters and surgeons in gowns, gloves, and masks. The two approaches merged and became common surgical practice in the 1890s.

Improved surgical instruments and technique also began to show progress in controlling hemorrhage during surgery, which translated into better survival rates and outcomes. But none of this would have been possible without the discovery of anesthesia. Multiple experimenters were converging on the problem as the second half of the 19th century approached. But the first surgical use of diethyl ether, on March 30, 1842, by the Georgia physician Crawford W. Long, is the event most choose to remember. Two years later nitrous oxide appeared on the scene, followed by chloroform three years after that, in 1847. The word anesthesia comes from the Greek ἀναισθησία (anaisthēsíā), meaning “without sensation.”

Hand in hand with the evolution of germ theory, as we’ve seen, came a growing appreciation for cleanliness and enlightened societal engineering. At the turn of the century, these movements couldn’t come soon enough. America was still recovering from the Civil War. Of the 620,000 military deaths, two-thirds were from disease. At the top of the list were dysentery and diarrheal disease, followed by malaria, cholera, typhus, smallpox, typhoid, and others.

Filth, disease, and disorder ruled the day, especially in large urban centers like Chicago and New York. But opportunity waited in the wings. As historians described the challenge of those days: “New York City experienced a pivotal moment in its development following the historic 1898 consolidation, which united Manhattan, Brooklyn, Queens, The Bronx, and Staten Island into one comprehensive entity. By the year 1900, the city’s population had surged to 3,437,202 according to the U.S. Census.” Keeping those millions healthy would be a public health challenge of the first order.

Florence Nightingale had helped ignite the Sanitation movement a half century earlier, but there was still a great deal of work to be done. Order had always been part of her life.

In November 1854, she and the 38 nurses in her charge had arrived in the middle of the Crimean War at the Barrack Hospital at Scutari, outside modern-day Istanbul, ill prepared for the disaster that awaited them. Cholera, dysentery, and frostbite – rather than battle wounds – were rampant in the cold, damp, and filthy halls.

During that first winter, 42% of her patients perished, leaving over 4,000 dead, the vast majority without any battle wounds. Florence later described her work settings as “slaughter houses.” Their enemy wasn’t bullets or bayonets, but disease – typhus, cholera, and typhoid fever. Over 16,000 British soldiers died, 13,000 of them from disease.

Nightingale’s initial assessment was that warmer clothing and food would stem the tide. Going over the heads of the medical leadership, she did what she could, and leaked details of what she was observing to journalists who, for the first time, were stationed within the war zone. In the process, she became a celebrity in her own right, and as the spring of 1855 approached, a first-ever Sanitary Commission was sent to the war zone, and Victoria and Albert themselves, with two of their children, visited the area.

The famed artist Jerry Barrett made hasty sketches of what would become The Mission of Mercy: Florence Nightingale, which hangs to this day in the National Portrait Gallery in London. During this same period, the first lines of Henry Wadsworth Longfellow’s poem, Santa Filomena, would take shape, including “Lo! In that house of misery, A lady with a lamp I see, Pass through the glimmering of gloom, And flit from room to room.” And the legend of Nightingale, the actual “Lady of the Lamp,” appeared on the front page of the Illustrated London News, complete with etched images. It read: “She is a ministering angel without any exaggeration in these hospitals, and as her slender form glides gently along each corridor, every poor fellow’s face softens with gratitude at the sight of her.”

What became clear to Florence and others was that infection and lack of sanitation were the culprits, and corrective actions on the facilities themselves, along with the sanitary practices that Nightingale led, caused subsequent mortality rates to drop to 2%. By the time the war drew to a halt in February 1856, 900,000 men had died. Florence Nightingale remained for four more months, arriving home without fanfare on July 15, 1856. Thanks to the first-ever war correspondents, she was now the second most famous woman in Britain, after Queen Victoria.

While she was away, her patrons had raised money from rich friends. The Nightingale Fund now had 44,000 pounds in reserve. These funds would help support a hospital training school and, in future years, her famous book, “Notes on Nursing.” Though her re-entry was quiet and reserved, she had plenty to say, and committed most of it to writing. While in Crimea, she had written to Britain’s top statistician, Dr. William Farr. He had replied, “Dear Miss Nightingale. I have read with much profit your admirable observations. It is like light shining in a dark place. You must when you have completed your task – give some preliminary explanation – for the sake of the ignorant reader.”

Shortly after her return, they met. Working closely with William Farr, she documented in dramatic form the deadly toll in Crimea and tied it to disease and lack of sanitation in “Notes on Matters Affecting the Health, Efficiency, and Hospital Administration of the British Army,” which she self-published and aggressively distributed. Illustrated with wheel-like designs divided into 12 sectors, each one representing a month, she graphically tied improved sanitation to plummeting death rates. Understanding their long-term value, she carefully approved the paper, ink, and process that have allowed these images to remain vibrant a century and a half later. As she said later, with some cynicism, they were designed “to affect thro’ the Eyes what we may fail to convey to the brains of the public through their word-proof ears.” In 1858, she became the first woman to be made a fellow of the Royal Statistical Society.
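
For readers curious what those wheel-like diagrams look like in modern terms, here is a minimal matplotlib sketch of a polar-area (“rose,” or “coxcomb”) chart of the same general design. The twelve monthly values are made-up placeholders for illustration, not Nightingale’s actual mortality figures:

    import numpy as np
    import matplotlib.pyplot as plt

    # Placeholder monthly counts, purely illustrative.
    months = ["Apr", "May", "Jun", "Jul", "Aug", "Sep",
              "Oct", "Nov", "Dec", "Jan", "Feb", "Mar"]
    deaths = [110, 95, 140, 230, 320, 450, 520, 480, 610, 700, 420, 210]

    angles = np.linspace(0.0, 2 * np.pi, len(months), endpoint=False)
    width = 2 * np.pi / len(months)

    ax = plt.subplot(projection="polar")
    # Use the square root of each count as the radius, so that each wedge's
    # area, not its radius, grows in proportion to the value it represents.
    ax.bar(angles, np.sqrt(deaths), width=width, align="edge", alpha=0.6)
    ax.set_xticks(angles)
    ax.set_xticklabels(months)
    ax.set_yticklabels([])
    ax.set_title("Polar-area ('rose') diagram: one wedge per month")
    plt.show()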

In that first year after her return, she was described as “a one woman pressure group and think tank…using statistics to understand how the world worked was to understand the mind of God.” In 1860, she published her “Notes on Nursing,” selling 15,000 copies in the first two months. It purposefully championed sanitation (“the proper use of fresh air, light, warmth, cleanliness, quiet, and the proper selection and administration of diet”) and promoted cleanliness as a path to godliness. It targeted “everywoman” while launching professional nursing.

But Florence Nightingale was not the originator of the Sanitary Movement in “Dirty Old London.” That honor goes to one Edwin Chadwick, a barrister who, in 1842, published his “Report on the Sanitary Condition of the Labouring Population of Great Britain.” As historians and commentators have noted, there was a good case to be made for cleanliness. Here are a few of those remarks:

“The social conditions that Chadwick laid bare mapped perfectly onto the geography of epidemic disease…filth caused poverty, and not the reverse.”

“Filthy living conditions and ill health demoralized workers, leading them to seek solace and escape at the alehouse. There they spent their wages, neglected their families, abandoned church, and descended into lives of recklessness, improvidence, and vice.”

“The results were poverty and social tensions…Cleanliness, therefore, exercised a civilizing, even Christianizing, function.”

Chadwick was born on January 24, 1800. His mother died when he was an infant, and his father was a liberal politician and educator. His grandfather was a close confidant of the Methodist theologian John Wesley. Early in his life, young Chadwick pursued a career of his own in law and social reform. A skilled writer, he published an early essay in the Westminster Review titled “Applied Science and its place in Democracy.” By the time he was 32, he had focused all of his expertise on social engineering – especially public health, with an emphasis on sanitation.

But the Sanitary Movement required population-wide participation, structural change, new technology, and effective storytelling. By then, William Harvey’s 1628 description of the human circulatory system, complete with pump, outlet channels, and return venous channels, was well understood by most. Seizing on the analogy, Chadwick, along with the city engineers of the day, imagined an underground highway of pipes, to and from every building and home, whose branches connected to new sanitary devices.

One of those engineers was George Jennings, the owner of South Western Pottery in Dorset, a maker of water closets and sanitary pipes. He was something of a legend in his own time, and a recipient of the Medal of the Society of Arts, presented by none other than Prince Albert himself.

When Jennings patented the first toilet, as a replacement for soil pots, hand-carried each morning and emptied in an outhouse if you were lucky, or in the streets if not, its future was anything but assured. But, by good luck, the Great Exhibition (the premier display of futuristic visionaries) was scheduled for London in 1851. The Crystal Palace exhibition was the showstopper, attracting a wide range of imagineers.

Jennings’ “Monkey Closet” was the hit of the show. His patent application followed the next year and read: “Patent dated 23 August 1852. JOSIAH GEORGE JENNINGS, of Great Charlotte – street, Blackfriars-road, brass founder. For improvements in water-closets, in traps and valves, and in pumps.”

Modernity had arrived. And none was more enthusiastic than Thomas Crapper, then a 15-year-old dreamer. Within three decades he had nine toilet patents, including one for the U-bend, an improvement on the S-bend, and an 1880 Royal commission granted to Thomas Crapper & Co. to install thirty toilets (enclosed and outfitted with cedar wood seats) in the newly purchased Sandringham House, the royal family’s Norfolk country seat. His reputation lives on thanks to that diminutive term, used with the greatest guttural emphasis by the Scots – CRAP. The company is also still in existence, now selling luxury models of the original design.

Sanitary engineering, combined with Nightingale’s emphasis on fastidious housekeeping and cleanliness in hospitals, enforced by nurses with religious zeal, would change the world. And it did. But those same changes would take decades to reach crowded immigrant entry points in locations like New York City.

One historian described it this way: “As the City ascended from a small seaport to an international city in the 1800s, it underwent severe growing pains. Filth, disease, and disorder ravaged the city to a degree that would horrify even the most jaded modern urban developer.”

One of the prime offenders was the noble work horse. By 1900, on the eve of the wholesale arrival of motor cars, there were roughly 200,000 horses in New York City, carrying and transporting humans and products of every size and shape, day and night, along the worn-down, narrow cobblestone roads and alleyways.

It was a hard life for the horse, whose average working life was only 2 1/2 years. They were literally “worked to death.” In the 1800s, 15,000 dead horses were carted away in a single year. Often, they were left to rot in the street because they were too heavy to transport. When they weren’t dying, the horses were producing manure – a startling 5 million pounds dumped on city streets each day.

As for human waste, sewer construction didn’t begin in New York until 1849, in response to a major cholera outbreak. Clean water had arrived seven years earlier with the completion of the Croton Aqueduct, carrying water south from Westchester County. This was augmented with rooftop water tanks beginning in 1880. By 1902, most of the city had sewage service, including the majority of the tenement houses. The Tenement Act of 1901 had required that each unit have at least one “water closet.”

As for the horses, the arrival of automobiles almost eliminated the “horse problem” overnight. Not so for cows, or more specifically the disease-laden “swill milk” cows. Suppliers north of the city struggled to keep up with demand in the late 1800s. To lower production costs, they fed their cows the cast-off “swill” of local alcohol distilleries. This led to infections and a range of diseases carried by the bargain-basement beverage, sold primarily to at-risk parents and consumed by children.

Swill milk was the chief culprit in soaring infant mortality in New York City between 1880 and 1900. Annually there were some 150,000 cases of diphtheria, resulting in 15,000 deaths. A Swiss scientist, Edwin Klebs, identified the causative bacterium, Corynebacterium diphtheriae, in 1883. A decade later, a German scientist, Emil von Behring, dubbed the “Saviour of Children,” developed an anti-toxin to diphtheria and was awarded the Nobel Prize in 1901 for the achievement.

The casualties were primarily infants, whose mortality rate in NYC at the time was 240 deaths per 1,000 live births. Many of these deaths would be traced back to milk infected with TB, typhoid, strep-induced scarlet fever, and diphtheria.

The process of heating liquid to purify it, or pasteurization, was discovered by Louis Pasteur in 1856, not with milk but with wine. Its use on a broad scale to purify milk first gained serious traction in 1878, after Harper’s Weekly published an exposé on “swill milk.” But major producers and distributors resisted regulation until 1913, when a massive typhoid epidemic from infected milk killed thousands of New York infants. Even so, diphtheria remained the most feared killer of infants.

As Paul de Kruif wrote in his 1926 book, Microbe Hunters, “The wards of the hospitals for sick children were melancholy with a forlorn wailing; there were gurgling coughs foretelling suffocation; on the sad rows of narrow beds were white pillows framing small faces blue with the strangling grip of an unknown hand.”

One such victim was the only child of two physicians, Abraham and Mary Putnam Jacobi, whose 7-year-old son, Ernst, was claimed by the disease in 1883. Working with the philanthropist Nathan Straus, the Jacobis established pasteurized milk stations in the city, a move that coincided with a 70% decline in infant mortality from diphtheria, tuberculosis, and a range of other infectious diseases.

By 1902, the horse’s hero status was reclaimed as it became the source of diphtheria and tetanus anti-toxins. The bacteria were injected into the horses, and after a number of passes, serum collected from the horse was laden with protective anti-toxins, relatively safe for human use. In 1901 alone, New York City purchased and delivered 25,000 doses of anti-toxin, funded by the Red Cross and the Metropolitan Life Insurance Company.

Corporations and industrialization were front and center during this period. Chief among them was Bell Labs, situated at 24th Street on the West Side Highway overlooking the Hudson River. It would grow into the world’s largest industrial research laboratory, with over 300 highly skilled employees. This was the opportunity scientific engineers and entrepreneurs faced – enormous risk and enormous promise.

Consider the X-ray. Its discovery is attributed to Wilhelm Conrad Rontgen (Roentgen), a mechanical engineer by training who held the chair of physics at the University of Wurzburg. It was in a lab at his university, in 1895, that he was exploring the properties of electrically generated cathode rays.

He created a glass tube with an aluminum window at one end. He attached electrodes to a spark coil inside the vacuum tube and generated an electrostatic charge. Outside the window opening he placed a barium-coated piece of cardboard to detect what he believed to be “invisible rays.” With the charge, he noted a “faint shimmering” on the cardboard. In the next run, he put a lead sheet behind the window and noted that it blocked the ray-induced shimmering.

Not knowing what to call the rays, he designated them with an “X” – and thus the term “X-ray.” Two weeks later, he convinced his wife to place her hand in the line of fire, and the cardboard behind. The resultant first X-ray image (of her hand) led her to exclaim dramatically, “I have seen my death.” A week later, the image was published under the title “Ueber eine neue Art von Strahlen”  (On A New Kind of Rays).

William II, German Emperor and Prussian King, was so excited that he rushed the physicist and his wife to his castle in Potsdam for a celebrity appearance and a lecture on these “invisible rays.” The New York Times, at 46th Street, less than a mile northeast of Bell Labs, was considerably less excited when it reported on January 19, 1896 on the lecture and Roentgen’s “alleged discovery of how to photograph the invisible,” labeling the scientist “a purveyor of old news.”

But one week later, on January 26, the paper had a change of heart, writing: “Roentgen’s photographic discovery increasingly monopolizes scientific attention. Already numerous successful applications of it to surgical difficulties are reported from various countries, but perhaps even more striking are the proofs that it will revolutionize methods in many departments of metallurgical industry.”

By February 4, 1896, the paper was all in, conceding that the “Roentgen Ray” and the photo of his wife Anna’s hand had “nothing in common with ordinary photographs.” The following day, the Times used the term “X-rays” but never spoke of it again until Roentgen’s death in 1923, when the Times obituary called it “one of the greatest discoveries in the history of science.” In 1901 Roentgen received the first Nobel Prize in Physics. As for the profiteering spirit of the day, the German academic never sought a patent on his discovery, feeling that to do so would be unethical. He donated the 50,000 Swedish krona prize to the University of Wurzburg.

The years that followed would more than make up the difference, as the original imaging spawned progressive biomedical engineering breakthroughs. Before the century drew to a close, the CAT scan (computerized axial tomography) and MRI (magnetic resonance imaging) would become household terms, essential diagnostic tools, and profit centers for top medical firms. These included the multinational imaging and instrumentation conglomerate Siemens, headquartered in Munich, Germany, just 173 miles south of Roentgen’s University of Wurzburg.

The early 20th century also featured the beginnings of the Medical-Industrial Complex. There was a clear realization at the time that, while an individual scientist might come up with a breakthrough, he or she had little chance of bringing it to market and satisfying demand without a corporate partner. A striking example was the liaison between the Eli Lilly Company and the University of Toronto. In the first two decades of the 20th century, as life expectancy began to rise, the existence of diabetes became well known, especially in suffering youngsters for whom it was a death sentence.

The Lilly company, based in Indianapolis, decided to focus on cures for metabolic disease and hired a brilliant medical researcher named George H.A. Clowes from the Gratwick Research Laboratory in Buffalo, N.Y. The attraction for Clowes, a race car enthusiast, was the Indianapolis Speedway. The researcher traveled far and wide to medical meetings to discover the next best opportunity for Lilly. On December 28, 1921, while attending a symposium at Yale in New Haven, CT, he was in the audience for a presentation by Frederick G. Banting and J.J.R. Macleod of the University of Toronto suggesting that they had isolated a chemical from the pancreas that corrected diabetes surgically induced in laboratory dogs. He immediately phoned Eli Lilly to deliver a simple message: “This is it.”

At the University of Toronto, they spent the next year trying to figure out how they could mass-produce and market the product on their own. While they were able to purify it and prove its efficacy in a small trial of humans, they could already see that the demand would far outstrip the supply. Clowes had been tracking their progress after being turned down a year earlier, and approached them again. This time he came prepared with a realistic plan to industrialize production along the lines of Henry Ford’s assembly line and to share the profits and a defined portion of the supply. In preparation, Lilly had linked to rail lines and a spur that could transport near-unlimited quantities of animal pancreases directly into Lilly’s industrial synthesizers, now capable of isolating and purifying the compound, originally termed “Isletin” and later insulin. It was the first of many lifesaving pharmaceuticals that would change the direction of chronic diseases in the century ahead.

“Optimism through technologic progress” was the motto of the day. The scientific community now knew that germs existed, and that cleanliness and sterilization could help prevent disease. They also possessed a growing appreciation that the body had some mysterious capability to fight diseases to which it had been previously exposed. They were now primed to learn more. Their knowledge of life control systems, especially the circulatory and neurologic systems, had grown. They were also increasingly aware of a complex collection of specialized blood cells constantly on the move, with little understanding of their purpose. Blood, and the cells contained within it, were clearly useful somehow; this had been proven out by animal transfusion experiments. The challenges in doing this – blood clotting and storage for reuse – were being addressed. But one problem, occasional recipient reactions, at times deadly, was troublesome.

Experimenters knew instantly that the problem must involve some mismatch between donor and recipient. In 1901, an Austrian biologist named Karl Landsteiner was able to recognize protein and carbohydrate appendages (or antigens) on red blood cell surfaces which were physiologically significant. He defined the main blood antigen types – A, B, AB, and O – and proved that success in human blood transfusion would rely on correctly matching the blood types of donors and recipients. In 1923, he and his family emigrated to the U.S., where he joined the Rockefeller Institute and defined the Rh antigen (the + and – familiar to all from their blood types) in 1937. For his efforts, he received the Nobel Prize in Physiology or Medicine.
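
The donor-recipient matching logic that Landsteiner made possible can be captured in a few lines. This is only a sketch of the basic ABO rule (a recipient can safely receive red cells only if every antigen on the donor’s cells is one the recipient also carries); it deliberately ignores Rh and the many other antigen systems discovered since, and the names are mine:

    # Antigens carried on red blood cells for each ABO type; type O carries neither.
    ANTIGENS = {"A": {"A"}, "B": {"B"}, "AB": {"A", "B"}, "O": set()}

    def abo_compatible(donor: str, recipient: str) -> bool:
        """True if every ABO antigen on the donor's red cells is one the
        recipient also carries. Recipients make antibodies against the ABO
        antigens they lack, so any foreign antigen risks a transfusion reaction."""
        return ANTIGENS[donor] <= ANTIGENS[recipient]

    # Type O emerges as the universal red-cell donor, AB as the universal recipient.
    print(abo_compatible("O", "AB"))   # True
    print(abo_compatible("AB", "O"))   # False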

Human-to-human blood transfusions, from healthy to wounded servicemen, proved life-saving in WW I. But the invention of “blood banks” would not arrive until 1937. Credit goes to Bernard Fantus, a physician and Director of Therapeutics at Cook County Hospital in Chicago. A year earlier, he had studied the use of preserved blood by the warring factions in the Spanish Civil War. He was convinced that collecting donated containers of blood, correctly typed and preserved, could be life-saving for subsequent well-matched recipients. His daughter, Ruth, noting that the scheme of “donors” and future “borrowers” resembled a bank, is credited with the label “blood bank.” In its first year of operation, Fantus’s “blood bank” averaged 70 transfusions a month.

Additional breakthroughs came in response to the demands of WW II. Blood fractionation allowed albumin to be separated from plasma in 1940. Techniques to freeze-dry and package plasma for rapid reconstitution became essential to Navy and Army units in combat. The 400cc glass bottles were finally replaced by durable and transportable plastic bags in 1947. And blood warming became the standard of care by 1963. By 1979, the shelf life of whole blood had been extended to 42 days through the use of an anticoagulant preservative, CPDA-1, and refrigeration. Platelets are more susceptible to contamination and are generally preserved for only 7 days. Preserved components also came to be prescreened for a wide variety of infectious agents, including, beginning in 1985, HIV.

In 1930, Karl Landsteiner became the first of 28 members of the newly formed American Association of Immunologists (AAI) to be awarded a Nobel Prize, his for the discovery of human blood groups. Importantly, the resulting donor/recipient RBC matching system all but eliminated deadly reactions from mismatched transfusions. But they still occasionally occurred, and no one knew why until 1958. That was when the French scientist Jean Dausset was able to prove that the incompatibility was coming from blood antigens attached to leucocytes, or White Blood Cells (WBCs), not RBCs. They were subsequently named “Human Leucocyte Antigens,” or HLAs. Studies that followed revealed that they were remarkably specific to each individual, resulting in what was termed the “HLA fingerprint.”

Discoveries around the human immune system continued as the electron microscope revealed a range of cells and their functions and roles in fighting specific invaders like HIV. In general, it was clear that WBCs as a whole had evolved in humans to recognize, disable, and dispose of the bad guys.

The field of Immunology is little more than a half-century old and still shrouded in a remarkable degree of mystery. Even describing what we do know is a complex challenge. One way to proceed is to climb the scaffolding provided by the wide array of Nobel Prizes in Physiology or Medicine over the last half of the 20th century.

Immunity has Latin roots in the word immunitas, which in Roman times denoted an exemption from the burden of taxation granted to worthy citizens by their Emperor. Protection from disease is a bit more complicated than that, and offers our White Blood Cells (WBCs) a starring role. These cells are produced in the bone marrow, then bivouacked in the thymus and spleen until called into action.

They are organized in specialized divisions. WBC macrophages are the first line of defense, literally gobbling and digesting bacteria and damaged cells through a process called “phagocytosis.” B-cells produce specific proteins called antibodies, designed to learn and remember a specific invader’s chemical make-up, or “antigen.” They can ID offenders quickly and neutralize target bacteria, toxins, and viruses. And T-cells are specially designed to go after viruses hidden within the human cells themselves.

The first-ever Nobel Prize in Physiology or Medicine went to the German scientist Emil von Behring, eleven years after he demonstrated “passive immunity.” He was able to isolate poisons, or toxins, derived from the tetanus and diphtheria microorganisms, inject them into lab animals, and subsequently prove that the animals were now “protected” from tetanus and diphtheria infection. These antitoxins, liberally employed in New York City, where diphtheria was the major killer of infants, quickly ended that sad epidemic.

Where Jenner’s, and later Pasteur’s (anthrax), weakened exposures prevented subsequent disease, von Behring’s antitoxin cured those already infected. More than that, it unleashed the passion and excitement of investigators (which continues to this day) to understand how the human body, and specifically its cellular and chemical apparatus, pulls off this feat.

The body’s inner defense system began to reveal its mysteries in the early 1900s. The Brussels scientist Jules Bordet, while studying the anthrax bacterium, was able to identify not only protein antibodies produced in response to anthrax infection, but also a series of companion proteins. This cascade of proteins linked to the antibodies enhanced their bacterial killing power. In 1919 Bordet received his Nobel Prize for the discovery of this series of “complement” proteins, which, when activated, help antibodies “drill holes” through bacterial cell walls and destroy them.

Scientists now focused as well on the invaders themselves, termed as a group “antigens,” including microorganisms and other foreign bodies. How did the body recognize the threat and respond? During WWII, the Allies rapidly developed a range of protective vaccines and mandated that all soldiers be vaccinated. The schedules were eventually adapted for peacetime, and required for entry into public schools.

Victories against certain pathogens were hard fought. In the case of poliovirus, which had a predilection to invade motor neurons, especially in children, and cause paralysis, it required a remarkable collaboration between government, academic medical researchers, and local community-based doctors and nurses to ultimately succeed. The effort involved simultaneous testing in children of two very different vaccines.

One was a killed virus administered by injection. This vaccine, developed by Jonas Salk at the University of Pittsburgh, arrived with great fanfare in 1955. It was both safe and effective, but required skilled clinicians to administer it to over 2 million American children.

The alternative vaccine, developed by Albert Sabin at the University of Cincinnati, was made available five years later, in 1960. It was a weakened (attenuated) but still live virus that could be administered orally. Its disadvantage was that, in rare cases, it could actually cause polio. But its distribution, especially in impoverished nations, made great practical sense. Both programs were fully funded by the non-profit National Foundation for Infantile Paralysis, a unique philanthropic arm created by FDR, a victim of the disease himself.

Current vaccine skeptics like RFK Jr. argue against historic facts. One need only examine graphs of annual caseloads for diseases like diphtheria and polio, before and after the introduction of vaccines, to appreciate the dramatic preservation of life that resulted from intentional but safe exposure to killed or attenuated vaccines.

In this same era, scientific theorists like the Danish immunologist Niels Jerne were proven right, though it took three decades for the scientific community to agree. His 1984 Nobel Prize citation read, “He asserted that all kinds of antibodies already have developed during the fetus stage and that the immune system functions through selection. In 1971, he proved that lymphocytes teach themselves to recognize the body’s own substances in the thymus gland… An immunological reaction arises when an antigen disturbs the system’s equilibrium.”

By then, those antibody-producing WBCs had been termed “B lymphocytes” by the Australian scientist Macfarlane Burnet, a 1960 Nobel laureate, who also saw antibodies as already established in the fetus. These individuals were part of a long tradition of medical science imagineers. For example, Robert Koch’s main assistant was Paul Ehrlich, whose imagined inner workings of the cell were described this way: “In his eyes, cells were surrounded by tiny spike-like molecular structures, or ‘side-chains’, as he called them, and that these were responsible for trapping nutrients and other chemicals, and for drawing them inside the cell.”

The “side chains” were in fact antibodies, large protein molecules made up of two long and two short chains. It was later proven that roughly 80% of the four chains are identical in all antibodies. The remaining 20% varies, forming unique antigen-binding sites for each and every antigen. Almost immediately, scientists began to wonder whether they could reconfigure these large proteins to create “monoclonal antibodies” to fight cancers like melanoma.

Imagination has occasionally carried the day. But more often, direct problem solving uncovers answers. That was the case when the French scientist Jean Dausset described an “HLA fingerprint.” One question always leads to another; in this case, “Why do HLAs exist?” What was eventually uncovered was that certain microorganisms (viruses) take up residence inside human cells, gaining protected status. To deal with the problem, humans possess a specialized WBC, the “T-cell.” We are familiar with T-cells since they have been much publicized in our epic battle with HIV. But for the T-cell to destroy an intracellular virus, it must “recognize and respond” to two messaging signals: first, the virus’s antigen; second, a permissive signal informing it that the virus is housed in a host cell that deserves preservation. The fingerprint HLA is that signal.

The downside, of course, is that the body’s own cells under certain circumstances can trigger an overreactive immune response. Most of us have experienced a bee sting or peanut allergy gone bad. This alarming cascade of symptoms, called “anaphylaxis” (from the Greek ana – against, phylaxis – protection), clearly involves HLAs. The same is true of auto-immune diseases, which may involve genetic variants of HLAs. Finally, successful organ transplantation relies on compatibility of donor and recipient HLAs.

So, to sum it all up, immunology is a mysterious, complex, and evolving field of study. Predators (everything from a microorganism invader to a rogue cancer cell to a wooden splinter left unaddressed) can be fatal to their host. But to respond, the host must first identify the threat, then activate a specific and effective response, without inadvertently injuring itself. As our understanding has grown, harnessing the immune system to chase down metastatic cancer cells, suppress a deadly rejection of a transplanted organ, or self-modify to avoid auto-immune destruction is clearly within our grasp in the not too distant future.

As importantly, the continued mining of cell theory and the evolution of tissue culture are now allowing progress in cancer research, unlocking the mysteries of immunology and the workings of virology, and enabling the creation of a range of life-saving vaccines, from polio to the mRNA vaccines for Covid, and much, much more.

It is reasonable to complete this whirlwind survey with a brief tribute to the mRNA vaccines used against Covid, which spared an estimated 20 million lives during the recent global pandemic. The scientific origins of the vaccine date back 60 years, to the discovery of messenger RNA in 1961, but its creation and distribution, which required just weeks as opposed to years, came “just in time.”

As the NIH explained, “mRNA vaccines inject cells with instructions to generate a protein that is normally found on the surface of SARS-CoV-2, the virus that causes COVID-19. The protein that the person makes in response to the vaccine can cause an immune response without a person ever having been exposed to the virus that causes COVID-19. Later, if the person is exposed to the virus, their immune system will recognize the virus and respond to it. mRNA vaccines are safe and cannot alter your DNA, and you cannot get COVID-19 from the vaccine. mRNA vaccines may seem to have arrived quickly, but this technology is built on decades of scientific research that have made these vaccines a reality.”

We have come to the end of our time. But before we go to our “top 10,” we must acknowledge the flurry of discoveries at the end of the 20th century. With the introduction of living cell cultures, the electron microscope, and the use of AI to reveal protein structure, many of the inner workings of the cell have been revealed. Simultaneously, the field of biochemistry has matured, alongside the miracle of genetics. Side by side, in direct view, came fertilization, embryonic development, multi-potential stem cells with timed specialization, and organ development. And ultimately the Watson and Crick description of the DNA double helix in 1953 (building on the work of Rosalind Franklin and Maurice Wilkins) opened the door, a half century later, to the sequencing of the human genome in 2003, after a 13-year race to the finish line by competitors, then collaborators: the NIH’s Francis Collins and Celera Corporation CEO J. Craig Venter.

And now, the promised “Top 10”:

10. The Thermometer & Tools to Measure

9. Tissue Culture & Antibiotics

8. The X-ray & Imaging

7. Anesthesia

6. Immunology & Virology

5. DNA & The Human Genome

4. Sanitation & Public Health

3. Germ Theory

2. Cell Theory

1. The Circulatory System