Mike Magee

In his book “Imbeciles: The Supreme Court, American Eugenics, and the Sterilization of Carrie Buck,” New York Times editorial writer Adam Cohen writes, “…in many ways, I believe you can learn more about an institution and more about an ideal like justice if you look at where it’s gone wrong rather than where it’s gone right.”

This very point is reinforced in the Merriam-Webster definition of the noun “epidemic,” which offers two main (and distinctly different) senses.

The entry reads:

epidemic noun

ep· i· dem· ic | \ ˌe-pə-ˈde-mik  \

Definition of epidemic 

1 : an outbreak of disease that spreads quickly and affects many individuals at the same time // an outbreak of epidemic disease

2 : an outbreak or product of sudden rapid spread, growth, or development // an epidemic of bankruptcies

In this course, sessions 1, 2, and 4 focus on the first (and classical, microbe-centric) definition. But today’s third session focuses on “manmade” epidemics, which fall under definition two: situations where human behavior (unintended or intended) creates or manufactures a crisis for humankind.

I thought long and hard about this choice. The deciding factor was reading Adam Cohen’s book, which details the shameful role of the Supreme Court and American eugenics in the state-sponsored involuntary sterilization of Carrie Buck and 60,000 other innocents over four decades in America.

That story is the first of four cases we will cover in session three. The second details the overwhelming burden of chronic disease that was a byproduct of the Allied victory in World War II. The third case covers the hidden history of Arthur Sackler and his role in unleashing the Opioid Epidemic in America. Finally, we cover the history of Attention Deficit/Hyperactivity Disorder (ADHD) and its relation to the epidemic abuse of the stimulant Adderall.

We begin with the tale of Carrie Buck. The tragedy that enveloped her life, and the lives of some 60,000 other Americans, has roots that go all the way back to the Nile River Valley in the 5th millennium BCE. Here is where we find the first recorded evidence of “pox” disease affecting humans. What would eventually be known as Smallpox belongs to a family of viral infections that are relatively common in animals, notably sheep, horses, fowl, cows, goats, and pigs. The offending viruses come in a number of varieties, including Avipoxvirus, Leporipoxvirus, Orthopoxvirus, and Parapoxvirus. The most notable, for its role in the discovery of a vaccine for humans, is Cowpox, or Vaccinia. Its cousin in humans, named “small pox” in the Middle Ages to distinguish it from the larger lesions of syphilis, the “Great Pox,” is caused by the Variola virus, part of the Orthopoxvirus genus.

In 1614, the Middle East and Europe were rocked by an epidemic of Smallpox. After the Plague, its deadly disfigurement made it the most feared pestilence. In the 17th and 18th centuries it appeared and reappeared as a scourge in the Americas. Native tribes were brought to the brink of extinction, first in the Caribbean and South America, and later in North America as well.

The rich and well-to-do were not spared. Ben Franklin’s 4-year-old son contracted it and died, months after Franklin had refused to allow a new and still experimental inoculation that carried a 2-3% mortality but conferred lifelong immunity to an infection with a kill rate of 50%. The disease had been with us literally since our inception as a nation. In 1633, 20 settlers from the original Mayflower died of it. Native Americans were especially vulnerable, and white leaders showed little mercy. Gov. John Winthrop of the Massachusetts Bay Colony said “the Lord hath cleared our title to what we possess” – the hand of God, if you will.

By the 18th century, all eyes were on a new method of suppressing this disease, called “variolation.” It proved a tough sell to independent-minded colonists. Not so for General George Washington. After being whipped on the Canadian border by British forces who had all been immunized against Smallpox, while his own fighting men were vulnerable and soon overwhelmed by the disease, Washington made the scratching inoculation with purified crusts from Smallpox lesions mandatory throughout the Revolutionary Army in 1777.

He had an easier time of it than did Massachusetts-based Puritan minister Cotton Mather 50 years earlier, during a Smallpox outbreak that ultimately claimed over 800 souls. During the crisis, one of Mather’s enslaved Africans, named Onesimus, informed him of the practice of inoculation. The disease was commonplace on slave ships at the time, triggering widespread warnings printed and distributed throughout the evolving colony. Mather, with a doctor friend, Zabdiel Boylston, devised an experiment during an outbreak in 1721 that eventually claimed 14% of the population of Boston. Dr. Boylston’s young son was among 242 who were inoculated, with only a 2.5% mortality. Rather than hail the success, locals were outraged, seeing the purposeful “infection” of citizens as evil witchcraft. One evening in November, a malfunctioning bomb came crashing through Mather’s window. Attached to it was a note that read, “Cotton Mather, you dog, dam you: I’ll inoculate you with this; with a Pox to you.”

Resistance to the procedure was common in Europe as well. It had been popularized by Lady Mary Wortley Montagu, the wife of Edward, the British Ambassador to Istanbul. While overseas, her brother had died of Smallpox. She witnessed the procedure successfully performed in the Ottoman capital and insisted on immunizing her young son. He survived with only minor symptoms. When she returned, she had Britain’s court doctors repeat the procedure on her young daughter, Mary Alice, and memorialized it in a painting, which she used as part of a nationwide effort to promote the procedure for the general population. She was less than successful, as was Napoleon’s sister, an equally avid supporter in France in 1805.

The technique at the time was cumbersome and variable in the materials subjects were exposed to, the tools utilized, and the science underpinning it. It was left to a British physician, Dr. Edward Jenner, to fill in the gaps. The official account is that he had noted that hired milkmaids on his farm appeared to be immune to Smallpox. He himself had been inoculated as a schoolboy in the 1760s. As an adult, he surmised that their exposure to a related disease, Cowpox, endemic in the milk cows, might be transferring protection to his employees. As part of an experiment, he successfully inoculated a 13-year-old on the farm, James Phipps, and later exposed him to Smallpox, which the boy did not contract. Later he also inoculated his 11-month-old son. In modern times, the story of the milkmaids has been challenged for accuracy. What is undeniable is that in 1798, Jenner published “Inquiry into the Variolae vaccinae known as the Cow Pox,” which laid out his theories. Eight decades later, Louis Pasteur added luster to the experiment by explaining the science of the phenomenon, which he described as “vaccination” in a nod to Jenner’s work with the “Vaccinia” Cowpox virus.

At the turn of the century, the medical community in America had largely caught up, but citizens remained wary of vaccination, especially if it was compulsory. In 1901, a smallpox epidemic swept through the Northeast, and Cambridge, Massachusetts reacted by requiring all adults to receive smallpox inoculations or be subject to a $5 fine. Rev. Henning Jacobson, a Swedish Lutheran minister, challenged the law, believing it violated his right to “due process.” The case went all the way to the Supreme Court. On February 20, 1905, in a 7-to-2 ruling penned by Justice John Marshall Harlan, the Court ruled against Rev. Jacobson. Harlan wrote in part: “Real liberty for all could not exist under the operation of a principle which recognizes the right of each individual person to use his own, whether in respect of his person or his property, regardless of the injury that may be done to others.”

For most progressive policy elites, this was a “slam dunk.” What could possibly go wrong? Of course, they had not fully considered the “unintended consequences,” let alone the presence of zealots committed to exploiting the narrow opening created by the new decision.

At the time, President Wilson and others were focused on “strengthening the American stock.” This involved a two-pronged attack on “the enemy without” and “the enemy within.”

The Immigration Act of 1924, signed by President Calvin Coolidge, was the culmination of the attack on “the enemy without.” Quotas for immigration were set according to the 1890 Census, which had the effect of favoring the selective influx of Anglo-Saxons over Eastern Europeans and Italians. Asians were banned almost entirely, with Filipinos, as U.S. nationals, the principal exception.

As for “the enemy within,” boosters for the cause of weeding out “undesirable human traits” from the American populace had the firm support of premier academics at elite universities across the nation. This came in the form of new departments devoted to advancing the “Eugenics Movement,” a grossly discriminatory, quasi-academic approach based on the work of Francis Galton, cousin of Charles Darwin, who believed in “genetic determinism.” By 1921, most major universities in the U.S. had Eugenics departments, along with their own academic society, the American Eugenics Society.

The Greek roots for “good” and “origin” parented the term, which fittingly served those ultra-focused on eliminating undesirable traits in the human race. Not surprisingly, isolationists and segregationists, with a special focus on anti-miscegenation laws, picked up the thread and ran with it, targeting vulnerable members of the community labeled as “paupers, mentally disabled, dwarfs, promiscuous or criminal.” Isolate, segregate, institutionalize, and sterilize – that was the plan.

In a strategy eerily reminiscent of that employed by Mississippi pro-life advocates in Dobbs v. Jackson Women’s Health Organization (decided in 2022), Dr. Albert Priddy, activist director of the Virginia State Colony for Epileptics and Feebleminded, teamed up with radical Virginia state senator Aubrey Strode to handpick, and literally make a “federal case” out of, a young institutionalized teen resident named Carrie Buck.

Their goal was to force the nation’s highest court to sanction state-sponsored, mandated sterilization, and thus allow the spread of the practice, state by state, nationwide. This required a test case and a defense attorney for the victim, an attorney hired by Dr. Priddy himself. Carrie Buck was chosen as the target.

In a strange twist of fate, the Dobbs name was central to this case as well. That is because Carrie Buck was under the care of foster parents, John and Alice Dobbs, after Carrie’s mother, Emma, was declared mentally incompetent. At the age of 17, Carrie, who had been removed from school after the 6th grade to work as a domestic for the Dobbses, was raped by their nephew and gave birth to a daughter, Vivian.

The Dobbses actively covered up the circumstances, knowing the disgrace could tarnish their future as “foster parents,” let alone John Dobbs’s local town employment. The solution instead was to label Carrie as incorrigible and promiscuous, which they did. Her mandated institutionalization followed, at the colony where her mother was also housed, and she was subsequently officially labeled an “imbecile,” a noun with legal standing at the time alongside “moron” and “idiot.”

Dr. Priddy applied for official permission for the forced surgical sterilization of the girl, and had his hand-selected defense attorney oppose his actions in court. To bolster his case for sterilization, Priddy enlisted the aid of a well-known Eugenics professor, Harry Laughlin, who testified during the trial that Carrie suffered from “…social and economic inadequacy; has a record during life of immorality, prostitution, and untruthfulness; has never been self-sustaining; has had one illegitimate child, now about 6 months old and supposed to be a mental defective.”

In his majority opinion supporting Dr. Priddy in Buck v. Bell, Supreme Court Justice Oliver Wendell Holmes leaned heavily on precedent. Reflecting his extreme bias, he wrote: “The principle that supports compulsory vaccination is broad enough to cover the cutting of Fallopian tubes (Jacobson v. Massachusetts, 197 U.S. 11). Three generations of imbeciles are enough.”

Carrie Buck underwent tubal ligation against her will at the institution on October 19, 1927. What followed was predictable. By 1930, 24 states had passed their own laws allowing involuntary sterilization. Between 1927 and 1974, 60,000 Americans were sterilized, including 7,500 victims in the state of Virginia. That state’s sterilization law was repealed in 1974, and in 1980 an ACLU suit forced the state to make reparations. On May 2, 2002, Virginia Gov. Mark Warner publicly apologized for Virginia’s eugenics program.

Carrie Buck lived to age 76, had no mental illness, and read the Charlottesville, VA newspaper every day, cover to cover. There is no evidence that her mother, Emma, was mentally incompetent. Her daughter, Vivian, was an honor student who died in the custody of John and Alice Dobbs at the age of 8. In an interview shortly before she died, Carrie confirmed that she had been raped, and also that she had longed for a small family of her own. Of the state’s treatment, she said, “They done me wrong. They done us all wrong.”

Circumstances, combined with competing agendas, ignorance, and a sense of urgency, all contributed to the mistreatment of this 17-year-old Virginia girl. But the rapid spread of the practice of forced sterilization, and the labeling and segregation of members of society, would not have been possible without the academic cover of Eugenics and an aberrant take on legal precedent.

It is instructive, and sheds a damning light on our current Supreme Court and its recent decision, Dobbs v. Jackson Women’s Health Organization. In that modern-day case, Jackson Women’s Health Organization, Mississippi’s only abortion clinic, sued Mississippi state health officer Thomas E. Dobbs for relief from the state law prohibiting abortion after the 15th week of pregnancy.

Chief Justice John Roberts sided with the minority in this recent case. The decision to let the Mississippi law stand unearthed a bucket of repressive state laws, with more likely to come. But we have been there before. All one need do is view the portrait of Chief Justice Roberts’s hero, Justice John Marshall Harlan, which hangs on the wall to the left of the fireplace in the Justices’ Conference Room at the U.S. Supreme Court.

Some of these same factors come into play in the second example of “manmade epidemics”: the explosive chronic disease burden created by U.S. military policy during World War II. From the moment Hitler launched hostilities in Europe, Churchill was aware of Britain’s vulnerability and appealed urgently to FDR for help and wartime supplies.

FDR’s hands were tied at the time by an isolationist Congress that had no interest in being dragged into what it viewed as a European dispute. But the U.S. President knew that American involvement was inevitable at some point, and he began to shift America’s mighty productive capacity toward weapons of war. This was consistent with German General Erwin Rommel’s dictum that “The battle is fought and decided by the quartermasters long before the shooting begins.”

American industrialists were happy to cooperate with the conversion of automobiles to tanks and planes, and the movement of commodities like rubber, sheet steel, and lead into specialized armament production lines – as long as the government was paying the bills.

But the larger problem for FDR was the soldier gap between Germany and the U.S. in 1938. At the time, Germany had 6.8 million active soldiers in 136 divisions, making its army #1 in the world. The U.S. had barely half a million soldiers in uniform, organized in 5 divisions, ranking its army #18 in the world. Adding to the challenge, those in uniform were neither battle-tested nor particularly healthy.

The experience of World War I had been an eye-opener, with large numbers of troops removed from the front lines as a result of psychiatric collapse or “shell shock” and a range of disabling general disease. FDR’s top commander, Gen. George Marshall, was determined not to repeat these errors. So when mandatory drafts began in 1942, he made certain that the psychiatric bar was set high during draft exams, liberally rejecting applicants seen as mentally weak or vulnerable, including homosexuals, those with anxiety or depression, and those lacking adequate intelligence. The net effect was that 12% of all draftees were rejected by Draft Boards.

That would have been fine if all had gone well on the battlefield, but it did not. In the first U.S. encounter with the Germans in Tunisia, North Africa, 34% of the battle casualties were neuropsychiatric. General Marshall summoned his head of psychiatry, William Menninger, and directed him to set up a plan that would address the breakdown of morale in the field. Menninger’s solution was to put psychiatrists in uniform, close to the battlefield, to treat battle fatigue with heavy doses of rest and barbiturates, and to return the drugged soldiers to the front as soon as possible.

The problem was that there were not enough psychiatrists. So Menninger designed a 30-day training program to convert everyday doctors into field psychiatrists. These “30 Day Wonders” focused on battlefield mental health, and also treated general disease as an add-on. This included the provision of condoms to all soldiers, along with an emergency VD kit containing anti-gonorrheal ointment, and a broad, ongoing marketing campaign that appealed to soldiers’ “finer selves” on the one hand, and to safety and punishment on the other.

Venereal disease was such a potent threat to Army discipline that in the lead-up to the war, the Supreme Court, with the active support of the American Medical Association, ruled against the laws restricting the marketing and sale of condoms and contraceptive devices that were by then four decades old. Known as the Comstock laws, these were brushed aside with the stroke of a pen by Chief Justice Charles Evans Hughes’s Supreme Court.

Even if soldiers could be convinced to avoid unnecessary sexual risk, war was undeniably “hell on Earth.” The trauma, mental and physical, the isolation, exhaustion, fear, and dread were palpable and persistent. For the previous century, keeping U.S. soldiers engaged, active, and committed to battle in the field had been a struggle. At the close of the Civil War, society had to absorb and treat over half a million opioid addicts from the Union Army suffering from the long-term effects of “The Army Disease.” Union leaders had provided the troops with over 10 million doses of opioids to keep them from going AWOL.

American industry was more than happy to chip in. WW I Secretary of the Navy Josephus Daniels couldn’t keep enough coffee brewing to support his sailors’ caffeine habit. In response, industry created packets of “instant coffee.” Motivation and alertness were also an issue for General John J. Pershing’s soldiers in the muddy trenches of WW I. When they couldn’t roll their own cigarettes in the filthy terrain, Pershing asked for an industry response, famously saying, “You ask what we need to win this war. I answer tobacco, as much as bullets. Tobacco is as indispensable as the daily ration. We must have thousands of tons of it without delay.” He not only got what he asked for, but for the first time cigarettes came pre-rolled, 20 packed side by side in an easy-to-manage pocket-size box.

By WW II, 8 out of every 10 soldiers and military doctors smoked cigarettes, and brands from Camel to Lucky Strike cloaked themselves in the American flag, happy to be branded as part of the war effort. Pills were also widely available. Especially in the early days of WW II, amphetamines were broadly used, until problems arose with jittery, trigger-happy soldiers shooting their buddies by mistake. As for depressants, alcohol use was rampant, with a regular ration of 1 quart of hard liquor a week for officers, beer runs for the enlisted, and barbiturates for those in distress.

It is fair to say that returning U.S. soldiers were not a healthy lot. General Dwight D. Eisenhower could have served as the “poster boy” for chronic disease. At the height of the war, he was chain-smoking 4 packs of cigarettes a day. In the two decades that followed his return from the European theater, he suffered four heart attacks and a variety of other debilitating ailments. And he was not alone. The average soldier returning from WW II had a lifespan 11 years shorter than that of a matched non-soldier. 5% of those returning were legally disabled. Lung cancer was twice as common in vets. Alcohol and drug abuse was through the roof. And state psychiatric institutions overflowed with over 1 million psychiatric casualties, most treated with novel surgical procedures like frontal lobotomy and a wide range of new, and highly profitable, antidepressants and anti-anxiety drugs.

Finally, there was one additional exploding consequence of WW II. In 1938, FDR turned to an experienced science entrepreneur to ramp up scientific efforts in support of the Allied Forces. To say he was successful is something of an understatement. As head of the Office of Scientific Research and Development (OSRD), he maintained a budget of $135 million with 18 divisions and over 2,000 investigators.

Vannevar Bush was featured on the cover of TIME magazine in 1942 with the headline, “Meet the man who may win or lose the war.” Under his direction, the U.S. developed the first blood bank, 18 different vaccines, emergency field-use morphine, and a range of antibiotics including penicillin. His teams also developed the first code-breaking mainframe computers, radar, and the A-bomb.

Less known, but with a far-reaching impact of its own, Bush created the integrated career ladder that became the profit-seeking backbone of the U.S. Medical-Industrial Complex. Under this system, academic physicians and scientists moved, without checks and balances, between academic hospitals, science corporations, and government agencies. Bush modeled this approach with his own executive team. His #2 assistant was none other than George W. Merck, of Merck Pharmaceuticals. Alfred Newton Richards, of the University of Pennsylvania, double-timed as a Merck consultant before becoming chair of OSRD’s Committee on Medical Research. Finally, John T. Connor, OSRD General Counsel, returned to Merck after the war to become its VP of Government Relations, with Vannevar Bush eventually serving as Merck’s board chair for a decade.

So within the span of five years, American policy created a burden of disease that would challenge the nation’s care capacity for three decades into the future. It also moved the nation’s health professionals toward specialized medicine and nursing, and encouraged long-term inpatient hospitalization. It endorsed and firmly established a medical-industrial complex with an integrated career ladder and profit and career-advancement incentives that would challenge professional ethics. Finally, it created a mental health crisis second to none, and codified the overuse and mass marketing of pharmaceuticals as uniquely American.

The emergence of the Medical-Industrial Complex in 1940 also received a major boost from a physician who was not part of Vannevar Bush’s team. In fact, he was not in the military at all. His name was Arthur Sackler, a physician born to Ukrainian and Polish immigrant parents, the eldest of three boys. His official biography years later would describe him as rising up out of extreme circumstances, chipping in to support and raise his struggling family, before reaching the highest ranks of academic medicine as one of its most altruistic leaders.

In reality, by 1938 he had earned his college and medical degrees at NYU after graduating from the prestigious Erasmus Hall High School, where he rubbed elbows with the city’s elite. In that year, he also married the beautiful and successful Else Jorgensen, and after a year of internship, he became the first U.S. employee of the German pharmaceutical firm Schering Corporation in 1940.

At the time, the U.S. was placing an embargo on the import of German goods, so Schering decided to open a U.S. branch and hired Sackler as its first Medical Director. Sackler had already demonstrated skills as a marketer and writer, heading up NYU Medical School’s journal, which was supported by outside advertisements from drug companies and others. One of his first offerings was a special award to medical students who wrote papers on endocrinology, a specialty that liberally used some of Schering’s key products at the time.

He was well rewarded for his efforts, and in these early years he and his wife began to collect Asian artifacts, which would become a lifelong obsession. But in 1942, the U.S. government seized control of Schering U.S.A., and Sackler was out of a job – but not for long. He recovered quickly, joining the William Douglas McAdams advertising agency in sales and marketing rather than the Army, and within five short years, while most doctors his age were overseas serving their country in war zones, he somehow gained a controlling interest in the firm. The source of his good fortune and funding remains a mystery. Clearly he was an opportunist. As thousands of shell-shocked veterans streamed into U.S. mental hospitals during and after the war, he enrolled in a psychiatric residency at New York’s Creedmoor Psychiatric Center.

By the time he completed his psychiatric training in 1947, Sackler had divorced his first wife; two years later, he married his second, Marietta Lutze, the third-generation manager of yet another German pharmaceutical firm, and likely a rich source of business contacts for his growing advertising firm. By 1952, this son of an immigrant grocer in Flatbush, now a successful medical advertising man twice married, was able to begin exercising in earnest his penchant for art collecting and for philanthropy, for which he would be best remembered.

The period of Arthur Sackler’s rise was the same period in which other nations were choosing to more directly address the well-being of their citizens. During that same era, the US made critical decisions that would enshrine health care as just another business opportunity, and Sackler was the man for the moment. At the beginning of his career, traditional infectious diseases and childhood epidemics were declining as causes of early death, but longer life spans prefigured an increased incidence of chronic diseases such as congestive heart failure and arthritis. Cigarette use, alcohol consumption, heavily marketed and easily prepared processed foods, anxiety, and depression, as well as the overuse of prescription medications, made heart disease and cancer the number one and number two killers, creating heavy demand for the blockbuster drugs that Sackler would prove so adept at selling. First, though, came a midcentury gold rush in new antibiotics, which made the top pharmaceutical companies a fortune, with Pfizer in the lead, and Sackler as its lead advertiser.

In 1952, Sackler put a multipage insert in the Journal of the American Medical Association to push Pfizer’s new antibiotic, Terramycin, but here, too, the consummate spin doctor avoided using such a vulgar term as “advertising,” claiming instead that the drug company’s investment was purely for informational and educational purposes. Between 1950 and 1957, the “informational” revenue from the ads Sackler produced for JAMA for broad-spectrum antibiotics increased sevenfold. Terramycin was marketed as a breakthrough, but it wasn’t unique: it was built by adding one additional oxygen atom to Pfizer’s original tetracycline, Tetracyn.

During this period, nearly every JAMA issue also included, bound right in, an in-house magazine from Pfizer called Spectrum, developed by Sackler. In 1953, Pfizer followed Sackler’s advice and purchased the marketing juggernaut Roerig, a nutritional-vitamin supplement company, in order to mine its direct-to-consumer product profile and as a seed for what would become the greatest pharmaceutical sales force of the 20th century.

This is when the detail man began to flex his full marketing muscle. The drug company sales rep, typically a conservatively dressed, middle-aged white man who matched up well with the typical physician of the time, would visit doctors’ offices to distribute glossy color publications extolling the virtues of the latest discoveries, along with a variety of branded giveaways such as pens, notepads, and coffee mugs. The most important giveaways, though, were small samples of the latest drugs the detail man represented.

Not surprisingly, pushing drugs to the medical profession inevitably led to a sharp rise in drug dependency. The first wave was barbiturates, popular throughout the 1950s as sedatives, hypnotics, sleep aids, anticonvulsants, and anesthetics. By the end of the decade, as postwar America struggled to accommodate its new, more frenzied pace, one in seven adults would be using tranquilizers.

In 1957, Pfizer wanted to launch its new solution for anxiety, Atarax, and once again the marketing maven the company turned to was Sackler. Tranquilizers at the time were described as “ataractic,” referring to a “mental and physical state of bliss.” On behalf of Pfizer, Sackler created a 13-minute “public service presentation” called “The Relaxed Wife.” It featured an anxious husband dressed in pajamas and a silk bathrobe being comforted by his loving spouse. As the couple sat on twin beds suitable for Rock Hudson and Doris Day, the wife explained what she had learned about “tension” and “relaxation,” which was that “relaxation techniques don’t work for everyone.” Ten minutes in, the voice-over introduced the audience to the ancient Greek word ataraxia and went on to explain, in terms that a guru from the sixties might appreciate, that “doctors are now prescribing an ataraxic medicine” that offers the “calming peace of a cloudless sky.” Then the announcer gave a simpler directive that would become a mantra in the pharmaceutical business: “If you have tension problems, discuss them with your doctor.” Pfizer’s new drug, Atarax, was never named—and it didn’t need to be.

During the same era, Sackler, as head of the William Douglas McAdams agency, persuaded a number of drug companies to amplify their massive funding of continuing medical education by creating allegedly scientific journals, like the Journal of Clinical and Experimental Psychobiology, that published allegedly scholarly articles, primarily devoted to new drug therapies, that were in fact thinly veiled advertisements for drugs developed by Sackler’s clients. Sackler hired ghostwriters for articles and attached the names of compliant physicians and scientists who were more than happy to become “authors” and “thought leaders,” which in turn led to handsome fees for giving speeches (arranged by Sackler) at medical meetings. But Sackler was hardly alone in his willingness to doctor science in pursuit of commerce.

But the ne plus ultra of Sackler’s chutzpah may have been his recommendation to prescribe Valium, an addictive benzodiazepine, to people with no symptoms whatsoever, relying on the line: “For this kind of patient—with no demonstrable pathology—consider the usefulness of Valium.” One editorialist in the journal Psychosomatics responded, “When do we not use this drug?” Sackler went to near equal heights of chicanery in parsing the difference between Librium, launched in 1959, and its chemical cousin Valium, released in 1963, both drugs made by Hoffmann-La Roche that worked in a very similar fashion to calm the nerves. Sackler managed to convince the world that these two products were entirely different medications for entirely different purposes. Librium was for “anxiety,” while Valium was for the quite different problem of “psychic tension.” Thus clearly partitioned, the two medications would not cannibalize each other’s markets, and Hoffmann-La Roche would get two bites at the apple of emotional distress among medical consumers. By 1964, Valium had become the first $100 million drug, a then-staggering sales figure.

In 1960, Arthur Sackler launched Medical Tribune, a publication that, in time, reached more than a million physicians each week in 20 countries, publishing editorials that supported free enterprise and the unified political agenda of the American Medical Association and the Pharmaceutical Manufacturers Association. These columns promoted the drug industry’s friends and hammered its foes, castigated both regulators and the makers of generic drugs, and extolled unbridled research—all points of view perfectly aligned with those of the drugmakers.

Two years later, Sackler was brought before the Senate Judiciary Committee, where Senator Estes Kefauver questioned him about misleading and deceptive drug advertising. In his appearance before Congress, Sackler portrayed himself as a highly respected New York academician, a cutting-edge research psychiatrist, and a compassionate physician dedicated first and foremost to his patients’ welfare and to the ethical standards of the profession of medicine. But in reality, he was already the top medical marketer in the United States; the 1952 purchaser of the fledgling drug company Purdue Frederick and its star laxative, Senokot; a human experimenter in the use of unproven psychotropic drugs on hospitalized mental patients in his own branded research institute at New York’s Creedmoor; a secret collaborator and partner with his supposed PR arch-competitor; and the secret owner of MD Publications and Medical and Science Communications Associates, which had FDA leaders on the payroll to ensure their support of pharmaceutical clients’ products.

At least one Senate staffer understood at the time who Sackler really was. He prepared a memo that summed up the Sackler approach: “The Sackler empire is a completely integrated operation in that it can devise a new drug in its drug development enterprise, have the drug clinically tested and secure favorable reports on the drug from the various hospitals with which they have connections, conceive the advertising approach and prepare the actual advertising copy with which to promote the drug, have the clinical articles as well as advertising copy published in their own medical journals, [and] prepare and plant articles in newspapers and magazines.”

But Sackler was as slippery as the proverbial eel, even when confronting basic facts. Senator Kefauver asked him specifically about a small company called Medical and Science Communications Associates that was known for disseminating “fake news.” Even though the company shared the same Lexington Avenue address as the McAdams advertising agency headed by Sackler, the doctor testified that he held no stock in MSC Associates and that he had never been an officer of it. This claim proved to be true, so Senator Kefauver moved on. What Sackler failed to mention was that the company’s sole shareholder was his former wife, Else Sackler.

Today, Sackler’s band of detail men, which he launched in 1952, is an army of more than 60,000 pharmaceutical sales representatives, servicing some 900,000 prescribers, and representing an industry investment of around $5 billion. Most are employed by the top 20 companies, with combined 2016 US-based revenues of $316 billion. These new drug reps are 52 percent female and younger than the prior generation—one-third are former military—and are evaluated more on their discipline and aggressiveness than their congeniality. Now they come armed with exact monthly statistics on how many prescriptions each of the doctors they “detail” has written in the past month for each of their company’s drugs, thanks to the Sackler-pioneered arrangement linking IMS prescription data with the AMA’s Physician Masterfile.

In all of this, Arthur Sackler, the master of manipulation, had shown the way. Sadly, the most dramatic legacy of the strategies he devised is in the domain of highly profitable addiction. Early on, in 1952, when he was still developing his playbook for capturing hearts and minds, he and his brothers acquired a small company called Purdue Frederick. Little more than a shell with annual revenues of only $22,000, it had been founded in 1892 to produce patent medicines. One of its big sellers was Gray’s Glycerine Tonic Compound, which consisted mostly of sherry.

In 1955, Purdue expanded to sell a brand of laxatives, and then an earwax remover. During this period, the Sackler brothers continued to actively experiment with the use of psychotropic drugs to address the symptoms of mental illness, and then they began to investigate what seemed to be an overlooked area—the treatment of pain. Purdue acquired a British counterpart called Napp Pharmaceuticals in 1964, which in turn acquired a Scottish drug producer, Bard Pharmaceuticals. Their primary interest in Napp and Bard wasn’t a specific drug but rather their prolonged-release technology, called Continus, which was suitable for administering morphine.

At the time, highly addictive morphine sulfate was the mainstay of pain relief for surgical and cancer patients. The Sackler brothers felt that a slow release might be less addictive. In 1972, they patented a system called Contin in the US and began to sell a new slow-release morphine sulfate drug called MST in England. In 1987, they released a new and improved version called MS Contin in the US, and a decade later they modified the chemical formula to create OxyContin.

Arthur Sackler became a permanent silent partner when he died in 1987 at the age of 73. But Purdue remained very much a family business, run by his brother Raymond and a nephew, and operating very much in the Sackler tradition.

When the FDA approved OxyContin in 1996, no clinical studies to determine the risk of addiction had been performed, yet the FDA-approved package insert said that the drug was safer than rival painkillers because its delayed-absorption mechanism “is believed” to reduce the likelihood of abuse. That same insert was an invitation to an epidemic in its warning against ingesting the drug in any way other than the prescribed capsule form. People quickly discovered that if you ground up the pills and snorted them, or dissolved them in solution and injected them, you could get the punch of heroin.

The heyday of Arthur Sackler was a period of crossover in the methods used for the marketing of prescription drugs and the marketing of the drug delivery system known as the cigarette. Sackler’s counterpart in the tobacco industry, and fellow master of manipulation, was John Hill of the public relations giant Hill & Knowlton. Sackler and Hill shared authorship of such techniques as third-party advocacy, subliminal message reinforcement, junk science, phony front groups, advocacy advertising, and buying favorable “news” to report with advertising dollars. Thus the Medical Industrial Complex created an amazingly unlikely odd-couple alliance with the industry representing the very antithesis of health—tobacco. 

Which brings us to our final case of the day on the topic of manmade epidemics in America – Attention Deficit/Hyperactivity Disorder, or ADHD, and the Adderall epidemic. The treatment of overactive children with stimulants takes us back to 1879. That year, Rhode Island socialites George and Helen Bradley welcomed their only child, Emma Pendleton Bradley. George Bradley was a mining engineer who sat on the board of AT&T and was a confidant of Alexander Graham Bell.

The family’s charmed existence came to an end in 1886, when 7-year-old Emma developed encephalitis, leaving her with epilepsy and cerebral palsy. The Bradleys explored every avenue for research, care, and treatment of their daughter, with few positive results. When George Bradley died in 1906, a year before his then 27-year-old daughter passed away, he left his home in Providence, RI, and a large inheritance to establish, support, and staff the Emma Pendleton Bradley Hospital, the first psychiatric hospital for children in the United States. The bequest was to take effect upon his wife’s death, which occurred 12 years later, and the Hospital finally opened officially in 1931.

A distant cousin, Charles Bradley, MD, a pediatrician who had studied neurology in Philadelphia, became the second Superintendent in the mid-1930s. Dr. Bradley supported an aggressive neuropsychiatric research agenda at the hospital, liberally utilizing his young patients as research subjects. Many of the studies required spinal taps, with leaks of spinal fluid resulting in chronic severe headaches in the children. To treat the problem, he prescribed Benzedrine, an amphetamine, to the children on the theory that it would slow the spinal fluid leakage.

The chemical compound itself had been invented by Lazar Edeleanu, a Romanian chemist, in 1887. But its popularity soared in 1932, when Philadelphia-based Smith, Kline & French soaked cotton strips in amphetamine oil and sold them as “inhalers” under the brand name Benzedrine. Sales took off as the product was converted to pill form, and it was misused for sleep disorders, depression, and weight loss. As soldiers first headed off to WW II, they were initially supplied the drug for “alertness.”

Dr. Bradley decided to try Benzedrine pills on 30 of his headache-afflicted charges, ages 5 to 14, in the hope that he could relieve the problems he had created with the spinal taps. The drug failed miserably as a pain reliever, but suddenly the kids taking the stimulant, which they called “arithmetic pills,” embraced their studies. They began to perform better in school, and follow-up measurement confirmed that they appeared calmer as well. In November 1937, Bradley published his findings in the American Journal of Psychiatry.

Most adults don’t have to cope with exams, piano recitals, and homework assignments, but they, too, welcomed the boost Benzedrine provided. Popularity led to blowback, including a 1937 article in Time magazine called “Pep Pill Poisoning.” When college students from the Universities of Minnesota, Wisconsin, and Chicago collapsed from overdoses while cramming for exams, Smith, Kline & French began to retreat from marketing Benzedrine for use by children and focused instead on selling it for the treatment of depressed women, and this remained Benzedrine’s market for the next two decades.

To ameliorate the drug’s side effects, the company made minor changes to the formulation and rechristened the stimulant Dexedrine in late 1937. It was used by soldiers in World War II to promote alertness. But the new version still maintained the overriding drawback of its precursor—addiction. Winston Churchill personally authorized its use in British soldiers until the soldiers’ abuse of the drugs induced hallucinations that had tank drivers spinning in circles. Combining the drug with the barbiturate amobarbital to form Dexamyl, including an extended-release version in the 1950s, only made matters worse.

At about the same time, Harvard-trained psychologist Keith Conners, working at Johns Hopkins Medical School, went back to Dr. Bradley’s 1937 Benzedrine paper. He examined the research in detail and suggested to his boss, Leon Eisenberg, that they repeat the study. Soon they had a grant from the relatively new National Institute of Mental Health (NIMH). 

Conners and Eisenberg did their experiment in 1961 with the updated formulation Dexedrine, giving it to a group of African American boys who were said to be hyperkinetic and impulsive—descriptions that could apply to just about any kid at one time or another. Their experiment was done at a reformatory called the Boys Village of Maryland. Despite side effects, most notably loss of appetite and weight loss, the drug seemed to work, making the boys calmer and more compliant.

One year later, a replacement called methylphenidate hydrochloride was introduced that was supposed to be free of side effects. It had been created in 1956 by a CIBA company chemist, Leandro Panizzon, whose wife, Marguerite, was a tennis enthusiast looking for that extra oomph. Her nickname, Rita, provided the brand name—Ritalin. Although the company initially marketed the drug for the treatment of depression and fatigue—with a 5,000 percent markup—in time it pivoted to a novel marketing pitch to therapists and counselors. These clinicians, CIBA said, should give the drug to their patients before a session because it could “help psychiatric patients talk in as little as 5 minutes.”

Thus did ADHD and Ritalin come of age together, co-evolving along with renewed grants from the NIMH and cashable checks from a very supportive CIBA. Conners was recruited back to Harvard Medical School, where he began work on a measurement scale for the now fully endorsed “disease” of ADHD. In 1969, only one year after the most recent Diagnostic and Statistical Manual, the bible of psychiatry, had coined the term “hyperkinetic reaction of childhood,” he published his original 28-question “Teacher Rating Scale” for ADHD in the American Journal of Psychiatry. He would receive royalties for this testing instrument for the rest of his life.

According to the American Psychological Association, attention deficit hyperactivity disorder (ADHD) affects 5 percent of America’s youngsters, though nearly 15 percent of high-school-age boys have been labeled with the condition. Yet no blood test or imaging study is available to confirm the diagnosis; there’s just a weakly validated 39-question yes-or-no survey that’s distributed far and wide in pediatricians’ offices, through the media, and through public and private schools nationwide.

When the diagnosis is broken down by gender, demographics, and geography, the distribution of ADHD becomes even more mystifying and disturbing. Rates can double and triple in areas, most notably Arkansas, Kentucky, Louisiana, and Tennessee, where schools promote the diagnosis and local physicians are willing to play along and prescribe. Meanwhile, the Centers for Disease Control and Prevention reports that among poor and disadvantaged two- to five-year-olds who carry the diagnosis of ADHD, more than 75 percent are placed on drugs, while only half ever receive “any form of psychological services.”

New York Times investigative reporter Alan Schwarz calls ADHD the “most misdiagnosed condition in American medicine.” It is also the logical extension of the promotional methods pioneered by Arthur Sackler: Pay and promote the careers of compliant physician “thought leaders,” send sham patient-education materials into enabling institutions (in this case public schools), create quasi-medical associations and bogus pro-industry publications to liberalize diagnostic criteria for conditions, and expand drug use in treatment, ultimately relying on acquiescence from professional organizations—in this case the American Psychiatric Association (APA)—to support a disease category and the monetizing of it rather than finding less intrusive and more effective ways to promote children’s health and well-being.

Clearly some children benefit from drugs like Ritalin. The next step was the broadening of the definition of the malady, then overstating and over promoting benefits while minimizing risks. In the case of ADHD, this kind of “mission creep” has led to the astonishing belief, in many quarters, that the medicated youngster is actually better than he or she would be non-medicated. And patient advocacy groups are fully on board with the mission.

Encouraged by a million-dollar grant from CIBA pharmaceuticals (originally Chemical Industries Basel) in 1989, Children and Adults with Attention Deficit Hyperactivity Disorder (CHADD), with 34,000 members in 640 chapters, currently trumpets the “12 amazing superpowers” associated with hyperactivity on its website. According to CHADD, medicated juveniles multitask with a “laser focus” and score high on tests, a result no parent or teacher would object to. In 2012 comedian Stephen Colbert critically labeled the behavior “meducation.” The problem is that even when these pills deliver short-term, positive results, they short-circuit the child’s development of strategies that can provide long-term solutions and success as an adult. And as one might predict, anything in our culture that promises a quicker route to academic success is an invitation for illicit use.

With a deft touch that would have made Sackler proud, Conners described his study results this way: “The drug has energized the children, apathetic and discouraged by previous school failure, into making use of abilities available to them.” CIBA’s Ritalin ad stated simply, “Ritalin helps the problem child become lovable again.”

Of course, Conners’s scale was not only a lever for manipulation; it was an invitation for abuse. On June 29, 1970, the Washington Post introduced the nation to Byron B. Oberst, an Omaha, Nebraska, pediatrician whose office treated approximately 6,000 kids referred by a school administration instituting what it called a behavior modification program. The kids being “modified” attended the mostly black North Side public schools. The district head of health services justified the program by saying, “It makes them happier.” Dr. Oberst’s comment was that he was in the prevention business, as in preventing “vandalism, riots, and anarchy against society.”

In September 1970, congressional hearings revealed that Conners was on the receiving end of $450,000 in grants from the drug industry to explore minimal brain dysfunction. A particularly generous benefactor was Abbott Labs, whose new product Cylert promised greater effectiveness and safety for hyperkinetic kids.

By 1975, portions of the medical establishment began to show concern that “minimal brain dysfunction” had become a catchall diagnosis for any child with even mildly nonconforming behavior. But a number of influential psychiatrists simply doubled down, saying the issue requiring their immediate attention was a brain-related disorder that interfered with “attention span.” By the time the 1980 DSM was printed, the condition had a new name, attention deficit disorder, accompanied by 16 definable traits that, once again, could describe just about any child having a bad day.

Supplementing CIBA’s $1 million under-the-table 1989 payment to CHADD, in 1995, a cooperative and compliant Department of Education provided another $750,000 to help the organization produce two ADHD information videos, one for parents and the other for schools. It also produced “A Child’s Call for Help,” a public service announcement that reached 19 million viewers and generated more than 100,000 follow-up calls to CHADD’s headquarters.

Harvard child psychiatrist Edward Hallowell’s influential and bestselling 1994 book, Driven to Distraction, advocated psychotherapy and coaching in addition to medications. In 1996, writing in the Sunday supplement Parade, he referred to ADHD as “a good-news diagnosis.” And so it arguably was for some desperate parents who were at their wit’s end. On the other side of the issue were critics such as Dr. Peter Breggin, a Harvard-trained psychiatrist and consultant to the National Institute of Mental Health, who raised suspicion of ulterior motives on the part of educators: “Who’s suffering? These drugs alleviate the suffering of teachers in over-crowded classrooms.”

Public debate continued for another two years before the NIH decided in 1998 to hold a consensus conference on the topic headlined by Keith Conners, who had recently left Harvard for Duke. But Massachusetts pediatrician Mark Vonnegut offered the most succinct assessment: “The diagnosis is a mess.”

The 1948 patent on Ritalin that had allowed the 5,000 percent markup had long since expired. The market continued to grow, however, and so did the search for new patentable pharmaceuticals that could be used to “treat” ADHD-labeled children. Ironically, the race was won not by an eminent scientist but by a former Lederle Laboratories detail man named Roger Griggs.

In the early 1990s, Griggs had started his own firm, Richwood Pharmaceutical, with the strategy of acquiring the rights to sell already patented drugs. Initially, he focused on Rexar’s Obetrol, a weight-loss product that in 1991 earned a meager $40,000. Its side effects in adults included anxiety, tension, and sleeplessness, but its effects in children had not been studied. What salesman extraordinaire Griggs noticed that others missed was the unusual number of scripts being written by a Utah pediatrician named Ron Jones. When Griggs investigated, he discovered that Jones was prescribing Obetrol to children off-label for ADHD. More than that—the pediatrician claimed a 70 percent success rate with the drug when used on kids who had failed a trial of Ritalin.

Griggs bought Rexar and with it control of the drug Obetrol, which he renamed Adderall (as in “ADD for all”). Then, without even asking the FDA’s permission, he launched the compound as a “unique alternative” for ADHD. The FDA was not amused and aggressively moved to shut down his company, but in 1995, according to New York Times reporter Schwarz, the child of an unidentified but influential senator did well on Adderall after failing on Ritalin, and that was all it took for the FDA to approve the drug a year later.

Domestic Adderall sales by 1997 were $18 million. That year, Griggs was approached by Shire with an offer to buy Richwood for $186 million, and he took the money and ran. Others saw more staying power, even growth, buoyed by the results of an NIMH trial headed by none other than Keith Conners. This study, published in the Archives of General Psychiatry in 1999, compared Ritalin to cognitive behavioral therapy for ADHD and gave the edge to drugs, even though nearly half of those on medication alone showed no benefit.

And then the spiritual heirs of Arthur Sackler began to imagine the profit potential if only ADHD could be defined as a lifelong condition. Companies that targeted children with ADHD saw an upper limit of school-age patients at about 3 million, but if sales could be projected into adulthood, a universe of 10 million was entirely possible. Shire and its drug Adderall took the lead in advancing the cause of adult ADHD, but soon Novartis, having purchased CIBA’s Ritalin, and Johnson & Johnson, with its new once-a-day Ritalin-like drug called Concerta, joined the pack.

On June 18, 1994, the cover of Time magazine proclaimed, “Disorganized? Distracted? Discombobulated? Doctors Say You May Have ATTENTION DEFICIT DISORDER. It’s not just kids who suffer from it.” By the end of the decade, Shire was selling $250 million worth of Adderall each year. More contenders would soon arrive, and they would all pass through the Duke lab of Conners, whose construct of ADHD was so well established at this point that he had his own medical journal, the Journal of Attention Disorders.

By 2002, Adderall sales exceeded $1 billion; five years later the drug crossed the $3 billion threshold. Along the way, Shire and Johnson & Johnson received help from yet another corner of the Medical Industrial Complex that had become an ever more reliable partner in manipulation—academic medical centers trolling for clinical subjects to participate in NIH-supported studies. In 2007, an alarming print ad created by the advertising firm BBDO for New York University’s Child Study Center ran for two weeks in the New York City market. It stated: “We are in possession of your son. We are making him squirm + fidget until he is a detriment to himself + those around him. Ignore this + your kid will pay. ADHD.” But this copy was not written to sell a movie; it was written to enlist study subjects for medical research. After 3,000 email messages, 70 percent negative, were sent to the center’s director, Dr. Harold Koplewitz, he pulled the ad.

Shire commissioned a loosely referenced booklet for doctors’ offices, fronted by Denver psychiatrist Dr. William Dodson, who stated, “We know now that about 10 percent of adults have ADHD, which means you’re probably already treating patients with ADHD even though you don’t know it.” The 10 percent figure was unsubstantiated, and Dodson at the time was collecting speaker fees from three different manufacturers. Shire also commissioned a survey designed to prove that people with untreated ADHD were far more likely to drop out, get divorced, or become addicted. The company then underwrote a scientific version that was published in May 2004 by the Journal of Clinical Psychiatry. The Shire press release read, “Survey of Adults Reveals Life-Long Consequences of Attention-Deficit-Hyperactivity Disorder.”

In 2010, the CDC put the prevalence of ADHD in kids at between 7.8 percent and 9.5 percent. By 2016, it had approached 11 percent. But estimates like these, and new industry ads suggesting “Drug Therapy for Parents’ ADHD Improves Kids’ Behavior,” had become too much even for Keith Conners. North Carolina, the home state of his treatment center at Duke, now had the highest percentage of kids on ADHD drugs in the nation—16 percent.

Ultimately, what brought Conners around was not criticism from professional colleagues but rather the plight of a family member. Conners began to express a different point of view after a May 2014 meeting with a local school superintendent, attended and recorded by the New York Times’ Schwarz. Conners’s grandson had been receiving a hard push by school officials to get diagnosed and treated for ADHD, and at last Conners, writing in the Huffington Post on March 28, 2016, admitted the obvious. “A vast proportion of [kids] on medication received an incorrect diagnosis,” he said. “Testing and funding is at stake and nobody wants these kids dragging down the numbers. School systems have developed some secret process—teachers have a way of talking to parents. And it’s not just teachers either; it’s school personnel. There’s a roundabout system because the incentives are for the school system to deliver better test scores, more end-of-school graduation rates.”

His grandson’s school superintendent dispassionately shifted the blame: “That sounds like a question for physicians. Because they write the prescriptions.” When asked how the kids got in to see those doctors, she said, “That’s a question for the parents.” Conners apparently didn’t press the issue further, perhaps in part because he was still collecting royalties for the use of his diagnostic scale.

Just as troubling as knowing that schoolchildren and young adults are being heavily medicated based on dubious criteria is the news that, as CDC epidemiologist Susanna Visser recently confirmed, more than 10,000 American two- and three-year-olds are already receiving physician-prescribed psychotropic medication for ADHD.

As one might expect from the spiraling feedback loops seen within the Medical Industrial Complex, admissions for Adderall drug rehabilitation began to rise. Doctors focused on the opioid epidemic have been slow to acknowledge that their sloppy prescribing has unleashed another front in the internecine war on drugs. But as British imperialists learned with opium in China long ago, nothing succeeds quite like addiction when you’re looking to develop a customer base. 

Since 1966, the annual consumption of stimulants by Americans has expanded fourfold; for Adderall, Ritalin, Concerta, Strattera, and their generic offspring, consumption has grown tenfold. We are just over 4 percent of the global population, and yet we consume close to 90 percent of the world’s prescription-level stimulants. An estimated 16 million American adults use prescription stimulants.

The widespread and growing expansion of the ADHD diagnosis in the US, together with the explosive abuse of Adderall and the growth of treatment programs to rescue the addicted, reflects more than just a consumer over-appetite for pharmaceuticals, sloppy physician prescribing, and pharmaceutical greed fed by direct-to-consumer advertising. It is rather a tangible expression of the perverse outcomes that one might expect when health care is treated as a business opportunity rather than a public good, when health planning is conflated with health profiteering, and when complexity and obfuscation are deployed with the intent to monetize in situations where simplicity and transparency are essential.

In 2022, 15% of male college students and 9% of female college students reported abusing Adderall.