Episode 36: Filth Parties

The southern United States was hit by a dramatic epidemic of a mysterious disease called pellagra in the early twentieth century. This episode discusses the cultural and scientific sources of the outbreak — from the cotton fields of the south, to the cow pastures of rural Germany, to the river basins of Uganda — and the incredible lengths a young doctor named Joseph Goldberger went through to try and put an end to this plague. Plus, a new #AdamAnswers about the source of the name “internal medicine.” All this and more on episode 36 of Bedside Rounds, a tiny podcast about fascinating stories in clinical medicine!


The music is “Cornbread, Peas, and Black Molasses,” performed by Sonny Terry & Brownie McGhee.

Sources:

  • Bean WB,  “Origin of the Term Internal Medicine,” N Engl J Med 1982; 306:182-183
  • Blevins SM and Bronze MS, Robert Koch and the ‘golden age’ of bacteriology, Int J of Inf Dis, Vol 14, #9, Sep 2010.
  • Bloomfield AL, “The origin of the term ‘internal medicine’,” JAMA, April 4, 1959.
  • Bressani R et al, Corn Nutrient Losses, Chemical Changes in Corn during Preparation of Tortillas, J Agr and Food Chem, 6, 10, 770-774.
  • Brim CJ. Job’s Illness: Pellagra. Archives of Dermatology and Syphilology. 1942;45:371-6.
  • Carpenter KJ, The relationship of pellagra to corn and the low availability of niacin in cereals, Experientia Suppl. 1983;44:197-222.
  • Clay K et al, Rise and Fall of Pellagra in the American South.
  • Elmore JG and Feinstein AR, Joseph Goldberger: An Unsung Hero of American Clinical Epidemiology, Ann Intern Med. 1994;121:372-375.
  • Goldberger J. The transmissibility of pellagra: Experimental attempts at transmission to human subjects. Public Health Rep. 1916;31:3159–73
  • Goldberger J. Public Health Reports, June 26, 1914. The etiology of pellagra. The significance of certain epidemiological observations with respect thereto. Public Health Rep. 1914;29(26):1683–1686.
  • Goldberger J, Wheeler GA, Sydenstricker E. A study of the relation of diet to pellagra incidence in seven textile-mill communities of South Carolina in 1916. Public Health Rep. 1920;35(12):648–713.
  • Goldberger J, Waring CH, Willets DG, et al. The Treatment and Prevention of Pellagra. Washington, DC: Government Printing Office; 1914.
  • Goldberger J, Wheeler GA. Experimental pellagra in the human subject brought about by a restricted diet. Public Health Rep. 1915;30(46):3336–3339.
  • Harris HF: Ankylostomiasis in an individual presenting all of the typical symptoms of pellagra. Am Med 1902; 4:99-100, retrieved from: https://babel.hathitrust.org/cgi/pt?id=uc1.c3312358;view=1up;seq=107;size=125
  • Lavinder CH, Pellagra, The American Journal of Nursing, Vol. 13, No. 10 (Jul., 1913), pp. 746-754.
  • MacNeal WJ, The Alleged Production of Pellagra by an Unbalanced Diet, JAMA. 1916;LXVI(13):975-977.
  • Middleton J, Pellagra and the blues song ‘Cornbread, meat and black molasses’. J R Soc Med. 2008 Nov 1; 101(11): 569–570.
  • Mooney et al, The Thompson-McFadden Commission and Joseph Goldberger: Contrasting 2 Historical Investigations of Pellagra in Cotton Mill Villages in South Carolina. Am J Epidemiol. 2014 Aug 1; 180(3): 235–244.
  • Morabia A (2006). Joseph Goldberger’s research on the prevention of pellagra. JLL Bulletin: Commentaries on the history of treatment evaluation.
  • Niles GM. Pellagraphobia: A word of caution. JAMA. 1912;58:1341.
  • Roberts CS, Goldberger and the Mal de la Rosa, Clinical Methods: The History, Physical, and Laboratory Examinations. 3rd edition.
  • Searcy GH: An epidemic of acute pellagra. Transactions of the Medical Association of Alabama, 1907, pp 387-393
  • Wacher, C. (2003). Nixtamalization, a Mesoamerican technology to process maize at small-scale with great potential for improving the nutritional quality of maize based foods.


This is Adam Rodman, and you’re listening to Bedside Rounds, a tiny podcast about fascinating stories in clinical medicine. This episode is called Filth Parties, and it’s about the pellagra epidemic that struck the American South at the beginning of the twentieth century, and the incredible lengths that a young doctor named Joseph Goldberger went through to try to convince a disbelieving country about how to stop the disease.


In the early 20th century, the southern United States was hit with a mysterious new plague — pellagra. The first case was noted in 1902, but by 1906 the disease had become commonplace in the South. The first symptoms would have been light sensitivity leading to a sunburn-like rash, especially on the hands and chest. But this sunburn wouldn’t heal, and would slowly spread across the body, destroying your hair as it went. Next would come abdominal pain and diarrhea that wouldn’t stop. Slowly, affected patients — who were called pellagrins — would develop emotional disturbances, memory loss, problems with their nervous system that led to difficulty walking, and finally dementia. Untreated, it would progress invariably to death. This epidemic burned across the South like a biblical plague; from 1906 to 1940, there were an estimated three million cases and 100,000 deaths. In 1928, at the height of the epidemic, more Southerners were killed by pellagra than by malaria.


We know today that most pellagra is caused by a deficiency of the vitamin niacin, with a subset caused by deficiency of tryptophan, which is converted to niacin in the body. Niacin, or vitamin B3, is essential in the production of nicotinamide adenine dinucleotide, or NAD, and it’s been a really long time since I was forced to learn biochemistry, but I believe NAD and its phosphorylated form are cofactors in pretty much every cellular process in the human body, which is why pellagra has such diffuse and disastrous effects. Fortunately, niacin is fairly common in most diets, so for much of human history pellagra was not epidemic, though it presumably occurred in times of famine.


But there’s one particularly delicious — and common — staple that is particularly lacking in niacin and puts people at risk for pellagra. I’m talking, of course, about corn. Fresh corn has some niacin, but the modern process of making cornmeal removes most of the germ, which contains most of the vitamins. In fact, the Southern epidemic was likely enabled by the invention of the Beall Degerminator in 1902, which ground corn far more effectively. Corn cultivation started in the area that is now Mexico, and spread throughout North and South America. But as far as we know, the people of the corn never suffered from pellagra. The reason appears to be the traditional method of preparing maize in these cultures, a process called nixtamalization. Essentially, corn kernels are soaked in limewater, and then mashed into a paste used to make tortillas or tamales. Soaking in a strongly basic solution not only makes the kernels easier to mash into a masa, it also makes the remaining stores of tryptophan and niacin far more bioavailable. The invention of nixtamalization appears to be quite old — tools relating to its use have been dated to 3,500 years ago. Obviously the details are lost to history, but it’s a quite remarkable example of social evolution that allowed for the flourishing of agriculture. And make no mistake — this energy-rich staple fueled the massive population growth of the Aztec and Inca empires. At the time of the Spanish conquest, the Inca Empire likely numbered about 16 million people — larger than the Spanish empire that would destroy it.


European conquistadors took corn back with them to Europe, and transplanted it into other conquered territories, especially in Africa. But they took the crop without the nixtamalization, which never crossed the Atlantic. The Europeans would grind corn into cornmeal and remove the germ, essentially treating it like wheat. This new crop was incredibly popular, and for obvious reasons: it has the highest kilocalories per acre of any of the staples, and is reasonably weather resistant. Poor farmers the world over started to plant it, especially in Southern Europe. And likely soon after corn started to displace more traditional foods, pellagra started to rear its head. Nosology, or the classification of diseases, was in its infancy, and pellagra was probably lumped in with the general category of “leprosy” for the decades and centuries that followed. Fun aside — there’s an academic article, which I’m including in the shownotes, that argues that the suffering of Job was actually pellagra. Pellagra was finally noticed by the medical world in 1735, when a Spanish doctor, Gaspar Casal, working in the town of Oviedo in Asturias in northern Spain, noted stereotypical skin findings in Asturian peasants who ate mostly a diet of corn. He called the condition “mal de la rosa,” or the Rose Sickness, named for the red rash on the hands and feet. And interestingly enough, his description of this new disease has been considered the first described medical syndrome in history. Casal, in the early 18th century, was still operating under a humoral model, and explained this curious disease in terms of weather — and diet, in particular the “Indian corn” that had been newly imported to the Mediterranean. Even today, the pigmented rash in the shape of a collar seen in patients with pellagra is called either the Casal collar or the Casal necklace.


After Casal, physicians started to identify the disease throughout Southern Europe, especially in Northern Italy, where it was called the Asturian leprosy or Alpine scurvy. That is, until Francesco Frapolli, a Milanese doctor, recorded a new name for the disease — vulgo pelagra, or pellagra in Italian. Different medical dictionaries disagree on the etymology — pelle is skin, and “agra” means sour, so “sour skin” is a common translation, but “rough skin” is apparently a better one. The OED suggests that it’s the word for skin patterned on the disease “podagra,” a gout flare of the big toe. In any event, the name pellagra ended up sticking.


Casal’s original observations were quite astute — the association with cooler temperatures, and of course with corn consumption — even if he explained them with the four humors. But the dawning of the 19th century saw such humoral-influenced ideas fall out of favor. In their place, a new theory of the disease developed, called “zeism,” from Zea, the scientific name for corn. The idea was that a toxin in the corn itself, produced by putrefaction, caused the disease. Doctors felt pellagra was the corn version of ergotism in rye — we now know ergotism is caused by a toxic fungus, but in the early 19th century, it too was thought to be a toxin from putrefaction. You’ve probably heard about ergotism before, since its hallucinogenic effects get rolled out to explain any number of strange historical happenings. But it causes St. Anthony’s Fire, which on the surface is very similar to pellagra — skin findings, abdominal distress, and of course bizarre behaviors.


By 1876, doctors had pretty much figured it out — putrefaction of corn caused pellagra, probably with some photoactivation by sunlight, which explained the rash in sun-exposed areas. The experiments of Louis Pasteur had shown that microorganisms were responsible for putrefaction, and some forward-thinking zeists felt that the putrefaction was caused by harmless microorganisms, as in brewing. But of course, those microorganisms themselves couldn’t cause disease, because that’s a ridiculous idea. And, I’m sure you can see where this is going, but soon the foundations of medicine would fall out beneath everybody’s feet, and pellagra — and the lives of literally millions of patients with the disease — would be left in the lurch.


I’m talking, of course, about germ theory. I’m taking a very long time to get to the pellagra outbreak in the Southern United States, I know, but I want to linger here for a little while. I’ve talked about the development of germ theory a lot on Bedside Rounds, but I’ve never really addressed it head on — really, because I don’t know yet how I could tell the story and do it justice. But I want to talk briefly about Robert Koch’s discovery of the anthrax bacillus, because of some great historical parallels to the pellagra debate.


So it’s important to point out that no single person invented germ theory. The theory of contagion — that disease is spread via “germs,” which were thought to be chemical substances and not living things — had been described by Fracastoro in the 16th century as an alternative to the miasma theory. Miasma had generally won out, but by the mid 19th century contagion found new life with the discovery of “animalcules” in water — protozoa. Henle called this “contagium animatum” — contagion by living microorganisms — though he did not identify any disease-causing organisms. During the same period, Pasteur had proved that microorganisms caused putrefaction, in the process discarding the doctrine of spontaneous generation — life arising from nothing. And finally, Joseph Lister was using carbolic acid to sterilize surgical fields.


But germ theory was far from accepted — in fact, it was the minority viewpoint — until Robert Koch. Koch was a promising young physician in Germany. I love this story — shortly after he and his wife were married, she gave him his first microscope, which became one of his most cherished possessions. He had actually done his thesis on biochemistry, not pathology. His first job was as a district medical officer in Wollstein, Germany — essentially doing public health. And the first order of business was anthrax. An anthrax outbreak was terrorizing the district — sure, 528 people had died — but over 56,000 livestock had been killed, and these animals were a major source of wealth for the district. Koch was aware of a previous study in which strange, rod-shaped structures had been seen in the blood of affected animals, but they had previously been dismissed. Koch set up a makeshift laboratory in the back of his house — yes, using the microscope his wife had given him — and set about experimenting on these rods. He isolated them, and then inoculated them into different animals, causing anthrax. He then was able to re-isolate them, and inoculate them again — propagating out multiple generations. This tiny lab, with Koch’s anthrax-infected mice, is where his famous postulates were developed. I’ve talked about them before, but it’s worth a refresher. Koch’s postulates were updates to Henle’s theory of contagium animatum, and mirrored his work with anthrax.


First, a microorganism has to be found in all individuals suffering from the disease, but not in healthy individuals. 


Second, you have to be able to isolate and grow the microorganism outside the body.


Third, when you re-introduce this pure culture into healthy individuals, it must cause the disease.


And finally, the organism must be re-isolated from the newly infected individuals, and shown to be the same as the first.


It’s worth mentioning that, even though Koch’s postulates are still taught in medical schools, and remain a remarkable framework for germ theory, they’re not all strictly true. Even Koch realized this later in his life in his work on cholera — we know there are plenty of asymptomatic disease carriers, perhaps the most dramatic example being Typhoid Mary.


By the way, he photographed his plates of anthrax — the first images ever taken of bacteria.


Koch and his contemporaries immediately knew his anthrax experiment was ground-breaking — he had essentially proved germ theory. For the first time, a specific microorganism had been linked to a specific disease. In 1876 he published his findings, at the ripe old age of 32. In the ensuing decades, any number of old diseases were shown to be caused by microorganisms — bacteria at first, but soon also protozoa; it would take considerably longer to prove the existence of viruses. Tuberculosis and cholera — both also discovered by Koch, whose personal motto, by the way, was nunquam otiosus, never idle — but also syphilis, erysipelas, endocarditis, rheumatic fever — the list goes on, and, for the purposes of pellagra, also African sleeping sickness. So by 1902, when pellagra would strike the United States, the idea that disease was caused by infectious microorganisms was ascendant.


The appearance of pellagra in the United States is remarkable — because unlike when Gaspar Casal was working in rural Spain, the United States of 1902 was well equipped with public health agencies, medical journals, and an excitable press. And thanks to the wonders of the internet, much of this record is easily retrievable online — including all the primary sources I’m using here. The first case of pellagra was reported in a relatively obscure medical journal by a Dr. H.F. Harris of Atlanta, Georgia, who first met the patient, known by his initials MW, on March 8, 1902. And I’ll just say, after all the time I spent tracking this one down, knowing everything that would happen after, I literally got chills when I realized Google had digitized it. So MW was a poor, unmarried white farmer living in Appling County, Georgia. It’s a primarily agricultural area — even today, the largest town has only 4,400 people. Even by the relatively poor standards of the area, MW was impoverished — his father had died of sarcoma when MW was young, and as a child he was forced to work on the farm for his livelihood. Harris notes that he had relied on “Indian corn” as the major part of his diet since a young age, but swore it had always been fresh and never rotten. When MW was a teenager, he started to fall ill and lose weight in the winter and spring, before recovering in the summer months. By the end of spring he would develop severe constipation and a blistering rash over his hands, arms, and the dorsal surfaces of his feet. Harris performed an incredibly detailed physical exam, including chemical tests of urine, microscopic exam of the stool, and early blood analysis — and noted the stereotypical rash with a “brownish hue” on the face, neck, hands, and lower parts of the arms. The only other unusual finding was a large number of hookworm eggs in the stool — unfortunately a common finding in the South at that time.


Harris then presents his conclusion: 


“From the foregoing it is seen that this patient presents with the typical symptoms of pellagra — a disease which is now generally believed to be a result of eating fermented Indian corn. If this be a genuine example of the disease it is the first case of the kind that has been reported in the United States”


Besides treating him for hookworm, Harris was very pessimistic:


“Should the patient prove to have pellagra the disease is so far advanced that nothing could probably be done for him, but I have recommended that he change his place of residence to a cooler climate if possible, and that in the future he should be extremely careful not to eat decomposed Indian corn.”


The conclusion here is fascinating. Harris is a zeist through and through, but he still can’t help himself in recommending the humoral remedy of a cooler climate, showing just how long old ideas can hold sway, even after their intellectual foundations have been swept away.


Harris’ article apparently went unnoticed in the contemporary medical press. But in the meantime, the disease rapidly became an epidemic. The next major article was published in 1906, when George Searcy, a doctor in Tuscaloosa, Alabama, was called in to investigate a mysterious outbreak at the Mount Vernon Insane Hospital — a sanitarium for black citizens of Alabama. Remember that this was the Jim Crow South — “separate but equal” and all. What Searcy discovered there was deeply concerning — he noted 88 cases of pellagra, with 57 deaths — a mortality rate of 64%. Searcy ran through the case records of the institution, and realized that cases likely started appearing in 1901, but only reached epidemic proportions a few years later. 


Searcy’s detailed observations would greatly influence future investigations of pellagra. A minority of the cases were men — only 10%. He also noted that no nurses had the disease — despite having very close contact with the patients. And he correctly fingered diet as the likely causal agent. Basically, he discovered that the only difference between the nurses and the patients was their diet; the patients ate the far cheaper corn, but the nurses would eat bread and biscuits made of wheat. And what’s remarkable — Searcy actually performed a rudimentary clinical trial on a group of healthy patients, keeping a small group on a corn-based diet, and switching everyone else to wheat. When one patient on the corn diet fell ill with pellagra, this was enough for Searcy to draw his conclusion.


He also excitedly took a sample of the corn from the hospital and sent it off to a plant pathology laboratory in Washington, D.C. The report came back conclusively — the corn was so spoiled as to be unfit for human consumption. Apparently the corn harvest in the spring of 1905 in the Midwest was very wet, and Searcy closes his paper quite optimistically by asserting that this was the likely cause of this freak outbreak.


But that did not happen. Over the next year, pellagra was reported in dozens of prisons and institutions throughout the South, before spilling over into the general population. Doctors and public health authorities realized quite quickly that they had an epidemic on their hands. In 1909, the National Conference on Pellagra was held in Columbia, South Carolina. I’ve read through the transcripts, which were published in JAMA, and they make for fascinating reading in retrospect. It was an international conference, with both American and European experts in full attendance. Zeism is alive and well — there are reports of aspergillus and other fungi being isolated from corn, perhaps making the unidentified toxin. But a new and exciting theory is mentioned as well. In many ways, pellagra appeared similar to tuberculosis. It struck people in poverty, it was progressive with diffuse effects on the human body, and most importantly it appeared to cluster in close contacts — could pellagra be an infectious disease?


But what type of infectious disease would it be, with no microorganism identified to satisfy Koch’s postulates? This is where we talk about sleeping sickness — there was a reason I brought it up earlier. Sleeping sickness is a pretty nasty disease in parts of Africa caused by a protozoan called Trypanosoma brucei, which is spread by tsetse flies. A protozoan had never been proven to cause disease in humans before, but the Italian-English epidemiologist Louis Sambon had theorized that the tsetse fly was the vector of the unknown infectious agent by comparing the distribution of the flies with that of the disease. He would be proven correct the next year, when the protozoan and its tsetse vector were definitively demonstrated by David Bruce — hence the brucei. Fun fact — brucellosis is also named after him, the reason “Bruce” shows up so much in infectious disease.


So when Sambon was sent to Italy to study pellagra, he used the same methods to determine that pellagra was an infectious disease. Essentially, he argued that pellagra had a topographic distribution, with a higher incidence in lower-lying areas closer to the water. This distribution roughly overlapped with the habitat of flies of the genus Simulium. Furthermore, the seasonal distribution of pellagra — the spring and the fall — coincides with times of high Simulium activity.


Based on these observations, he confidently predicted that pellagra was caused by another species of trypanosome, as yet undiscovered, transmitted by Simulium flies. The medical community took these predictions very seriously. I mean, it wasn’t such a leap — this was essentially the same methodology he’d used to finger the tsetse fly. A Dr. Roberts, writing excitedly in JAMA, declared “the parasite of pellagra is a discovery to which we may look forward in the near future. Certain it is that the corn theory is not in accord with the facts, and must die the death of unfounded theories.” I’ll throw this out there as well — Sambon also thought that cancer was caused by protozoans. So at the very least he was consistent.


So to sum up, there were two warring camps at the conference in Columbia — the zeists, on the side of rotten corn, and a smaller contingent suggesting an infectious disease. But there was a third theory, not mentioned at all in the proceedings, which turned out to be the correct one: a nutritional deficiency.


Nutritional deficiencies were not exactly obscure at the turn of the 20th century. I feel like I talk about James Lind every other episode, but by the early 20th century it was well accepted that scurvy was caused by an unknown nutritional deficiency, though it would still be a few more decades until vitamin C was discovered. But the best model was probably beriberi, which we now know is caused by thiamine deficiency. The discovery of beriberi is worth an episode in itself, but by 1910, the disease had been identified as being caused by a diet high in polished rice, lacking some as-yet unidentified nutrient called the anti-beriberi factor. And then in 1912, Casimir Funk gave these theorized compounds the name “vital amines,” because of their proposed similarity to amino acids — or more famously, vitamines. The ending e would be dropped a decade later. For a variety of reasons, the anti-beriberi factor became called “vitamine B,” now vitamin B1. In fact, in 1913, Funk published an article arguing that pellagra was caused by just such a deficiency. But this was met with deafening silence and does not appear to have made any contemporaneous impact.


So that was the medical debate. But it’s important to remember the effect the disease had on Southern society. After the end of the Civil War, the Southern economy had largely reorganized on a sharecropping model. Because these poor tenant farmers did not own their own land, the landlord largely decided what to plant. And by the late 19th and early 20th century, that was cotton, which largely displaced food crops in cotton-growing areas. As the sharecroppers’ plague spread across the countryside, pellagraphobia quickly followed. The disease struck in the lowest socioeconomic classes, and it was not pretty. People were treated like lepers. They would hide the disease from their neighbors, wearing long clothing. Children from affected households were denied entry into schools. And in hospitals, patients were put on special pellagra wards, far from the other patients. That is, if they were admitted at all — a Dr. Niles of Atlanta wrote an editorial in JAMA in 1912 condemning pellagraphobia, and shamefully mentioning a hospital in Atlanta that even refused to treat patients (which he does not name, either out of politeness, or because he assumed everyone already knew).


Clearly something had to be done — and the United States government stepped in to intervene. By 1914, just five years after the pellagra conference, there were two warring teams of investigators, both supported by the government, and both with drastically different conclusions about the cause of pellagra.


I’ll talk about the Thompson-McFadden Commission first — funded by the two rich cotton businessmen it was named after, but backed directly by Congress. These researchers descended on six “company towns” in the Spartanburg, South Carolina area in 1914: Inman Mills, Whitney, Pacolet Mills, Saxon Mills, Arkwright, and Spartan Mills. The researchers visited every household in these villages and took detailed demographic histories. They also did “housewife recalls” — dietary surveys asking about the consumption of cornmeal, grist, wheat flour, fresh meat, cured meat, lard, canned foods, milk, eggs, and butter. They then determined the pellagra status of everyone in these households — defined as having a stereotypical skin lesion, or being treated by a doctor for pellagra.


The Commission analyzed the data — and, making explicit comparisons to beriberi, they decided to reject a nutritional cause. Their reasons? Unlike beriberi, the disease was never found in nursing infants. And perhaps even more importantly, their data showed that pellagra was more common in the white citizens of these towns than in the black citizens. In the Jim Crow South, it was naturally assumed that the black diet would be worse. By this same logic, they rejected the zeist hypothesis of toxic corn.


Then the Commission decided to test the infectious hypothesis. They divided all the houses in these towns into three zones — zone 1 was infected with pellagra, zone 2 was adjacent, and zone 3 was not adjacent. When they analyzed their data, they found an inverse relationship — disease incidence dropped the further you got from affected households. Furthermore, between towns, having an improved sanitation system was protective against the disease. Based on this, the investigators concluded that Sambon was in fact correct — pellagra was an infectious disease, though the microorganism itself was still at large.


Enter Joseph Goldberger, of the United States Public Health Service. Full disclosure here — I have a bit of a doctor crush on Joseph Goldberger, so my apologies if I gush. Goldberger was a child of immigrants, born in Hungary in 1874 and raised in New York City. He went to Bellevue for medical school and spent a couple of years in private practice before he got bored and joined the United States Public Health Service. He was stationed at Ellis Island and earned a sterling reputation for his studies on yellow fever, dengue, and typhus. If you’re one of my American listeners, and your relatives entered the country in this period, as some of mine did, they likely crossed paths with Goldberger.


So in 1914, when the USPHS was authorized to address the disease, Goldberger, then 40 years old, was the logical person for the job.


And here’s what I find incredibly remarkable. Before starting his task, Goldberger performed a literature review — pretty much the studies that I’ve been talking about here — AND HE PRETTY MUCH FIGURED IT OUT.


In his review, he pointed out that pellagra was an exclusively rural disease and followed a seasonal pattern. It was associated with poverty — but didn’t affect cattle farmers in the same areas, who had access to milk and meat. It was associated with the “three Ms” that made up the Southern diet — cornmeal, meat (salt pork in particular), and molasses — even in diets with sufficient calories. And perhaps most convincingly, it was never acquired by the caregivers of pellagra patients in institutions or hospitals.


Based on these observations, he concluded that the disease was caused by a nutritional deficiency, likely due to over-reliance on the Southern diet, and especially cornmeal — basically a beriberi of corn instead of rice.


As Goldberger sat down to decide how exactly to prove this, he knew he had a long road in front of him. He was a Yankee — and a Jewish Yankee from New York City at that — who realized he had to travel into the South and convince people that the traditional Southern diet was what was making them sick. I was raised in North Carolina, and even in the late 20th century, I can tell you that “Yankee” was not generally a term of endearment. Given that the disease was already stigmatized, you can see why the Thompson-McFadden Commission’s infectious conclusion was more attractive.


He settled on the strategy of “dazzle ’em with data,” and in 1914 Goldberger set off to two orphanages, referred to in his study by their initials, MJ and BJ. Both had high concentrations of the disease — there were 210 cases between the two of them. The living conditions at the orphanages left much to be desired, but Goldberger requested that nothing be changed — he only wanted to modify the diet. Unfortunately, he noted, the logistics of the orphanages prevented a control group; he would have to modify the diet for the entire orphanage. With this limitation in mind, he set about his experiment — he increased the amount of milk, buttermilk, eggs, beans, and peas in the diet, while decreasing the amount of cornmeal. Because of the lack of a control group, the goal of his experiment was simply to lower the recurrence rate. Both orphanages were observed for a year, and the effects were as dramatic as Goldberger had hoped — the 210 children who had had pellagra the previous year were cured, with only one recurrence, and among the kids who had not had pellagra, there were no new cases. He repeated this same experiment at a Georgia hospital for the mentally ill, with equally impressive results among institutionalized adults.


Goldberger had cured — well, technically prevented recurrence of — pellagra in a controlled setting. But it didn't make a difference; widespread skepticism of his nutritional hypothesis remained. In fairness, he had made a methodological mistake: by cutting down the amount of corn rather than only increasing other foodstuffs, he had left room for a zeist interpretation. Depressingly, the federal subsidies for extra food at the orphanages and the asylum ran out in 1916, and pellagra viciously struck back, with recurrence rates of up to 40%. But even that grim natural experiment didn't sway the skeptics.


So having cured the disease with diet, he now decided to do the opposite — use food to cause pellagra. Goldberger approached Earl Brewer, the progressive governor of Mississippi, for permission to use a prison as an experimental setting. At the Rankin State Prison Farm, Goldberger recruited 12 healthy volunteer prisoners: if they would eat a traditional southern diet for six months and run the risk of developing pellagra, they would receive full pardons from the governor. It was a motley group — seven convicted murderers, but also two wealthy brothers, politically connected to the governor, who were serving terms for embezzlement. I'm sure you can see where this is going. The prisoners were housed in a separate, impeccably clean dormitory; Goldberger wanted there to be no question about sanitary conditions leading to infection. The normal prison diet was actually pretty good — meat at every meal, with buttermilk, beans, and peas. But the volunteers were fed a close approximation of the traditional southern diet. A breakfast menu might include biscuits, fried mush, gravy, grits, cane syrup, and coffee. Lunch was collards, cornbread, syrup, and sweet potatoes. Dinner was similar. It doesn't sound bad to me — for a day. I can't imagine six months of that.


And the experiment was a roaring success. One inmate started to develop signs of pellagra within two weeks, and by the end of the study, 7 of the 12 had the disease, as determined by independent dermatologists — Goldberger wanted to avoid accusations of bias. And the prisoners hated the diet — which, again, was basically the go-to diet in the rural south. A few tried to end the experiment early — which would have meant no pardon — and one is quoted as saying, "I have been through a thousand hells!"


The medical community was actually quite impressed by this study, though it did little to quell the infectious skeptics. “Nobel prize” began to be bandied about. But society at large still rejected Goldberger’s hypothesis. He was accused of torture, and some openly suspected that the entire venture had been fabricated as a way to give pardons to the governor’s friends.


Goldberger was speaking the language of epidemiology and science. But he also realized that people aren't swayed by controlled experiments. He needed something more dramatic to speak to the public at large. He needed Filth Parties.


This is the part of the episode where I give my first content warning. The following contains pretty gross descriptions of eating human excretions. If you are eating lunch right now, you might want to pause and come back after you finish.






Goldberger had asked a lot of his subjects — but he also demanded a lot of himself. Working with infectious diseases at Ellis Island, he lived with a daily risk of infection with a serious disease. I suspect this fact made him come up with the idea of using himself and his colleagues as experimental subjects. Goldberger had long pointed out that the fact that nurses and workers at institutions where pellagra was endemic did not get sick was a major challenge to the infectious theory. He now intended to prove that it was impossible.


Goldberger, members of his research team, and notably his wife Mary all volunteered to throw "filth parties" where they would, basically, do everything they could to "infect" themselves with pellagra. They collected a variety of human excretions — feces, urine, blood, phlegm, scrapings from rashes.


And then … well, I’m just going to read from the study directly:


“The feces specimen was obtained with the aid of a simple water enema and was liquid.

The scales (by which he means the scrapings of rashes) with about 4 cc of each specimen of urine and with about the same quantity of the liquid feces were worked up into a pilular mass with wheat flour and in this form swallowed by volunteer G-J (that's Goldberger's not-so-subtle way of pointing out that he's doing this himself), 30 minutes after taking 20 grains of sodium bicarbonate and about 1-1.5 hours after collecting."


So basically, at the filth party, Goldberger worked a pile of rash scrapings, feces, and urine into wheat flour. He then took sodium bicarbonate to reduce the acidity in his stomach, lest someone claim that the acidity had killed the organism. And then he ate it.


He held these parties for two months, and while plenty of people had diarrhea, or local reactions at the sites of injections, not a single person got pellagra. After the last party, Goldberger wrote in his journal, “We had our last filth party this noon. If anyone can get pellagra this way, we must certainly have it good and hard. It’s the last time. Never again.”


Perhaps the most scandalous thing about these filth parties was the inclusion of his wife Mary among the volunteers. She wrote years later that she had insisted on being included, and when her husband wouldn't allow her to swallow feces, she negotiated that she be allowed to be injected in the abdomen with 7 mL of blood from a woman dying of pellagra. Why this was seen as a safer option is beyond me. In any event, Mary Goldberger was a tough lady — this was far beyond the pale of what was expected of a woman in the early 20th century. In fact, one of Goldberger's nurses assisting with the filth parties was so upset watching the injection that she fled from the room crying.


It is now 1916. Joseph Goldberger has cured pellagra in orphanages and asylums, transmitted it with a southern diet at a prison work farm, and now eaten human feces and had his wife injected with blood. So I cannot even imagine his frustration when his experiments failed to sway public opinion. One director of a pellagra hospital in Atlanta fumed, "physicians all over the south are following his positions implicitly, crucifying their patients on a cross of error." Members of the Thompson-McFadden Commission in particular argued that Goldberger's experiments only seemed to work because it took a weakened constitution for the infection to take hold. The prisoners had gotten sick because the southern diet had weakened them; Goldberger and company did not get sick because they were young and healthy. And to prove all this, they pointed to their study of southern mill towns: the people who got the sickest were younger, poorer women. How could he explain that?


Goldberger didn't take this lying down, and threw shade back the best way a scientist knows how — by replicating their clinical trial, but doing it better. So in 1916 he headed back down to the Spartanburg, SC area, borrowing their methodology and using a similar set of villages, but with three major changes. First, Goldberger instituted a strict definition of pellagra — a case would only be counted if the patient had a detailed physical examination, either by Goldberger himself or one of his physician colleagues. Examination by non-experts, or just looking at doctors' notes, wouldn't cut it this time. Second, they didn't rely only on so-called "housewife surveys," but actually got receipts from the company stores, so the researchers knew exactly what their subjects were eating. In this era before widespread automobile travel, the people of these small towns basically had only one place to buy their groceries. And finally, to test the hygiene hypothesis, Goldberger selected several villages with improved sanitation, instead of just the one in the original study.


With this attention to detail, Goldberger was able to show that pellagra was not associated with either sanitation or poverty. And his data suggested that the greatest risk factor was the absence of a variety of food sources, especially milk and meat. Goldberger concluded in the end that the reason women and young people appeared to suffer more was that they were less able to supplement their diet outside the home. And income was protective because it let you eat a wider variety of foods.


Why were the Goldberger and Thompson-McFadden results so different? A century later, Dr. Mooney and his colleagues re-analyzed the data from the two competing investigations. Basically, they found that the Thompson-McFadden Commission's shoddy case definitions likely resulted in far more people being classified with pellagra than truly had it — many of the pellagrins likely had ariboflavinosis, another nutritional deficiency, but one that Goldberger's stringent definition would have screened out. The Commission also included only one village with modern sanitation — Goldberger purposefully oversampled to allow better intervillage comparisons. And finally, the Commission's researchers were so intent on finding an infectious cause that they IGNORED their own findings, which actually showed that milk consumption was protective.


Goldberger by now had considerable support from the federal government and the medical community, especially in the North. But he realized no level of experimental evidence would win over the hearts and minds of the people of the South. His theories were too threatening to the economic backbone of the region; he knew the political thicket he had wandered into. He needed a cure. He dedicated the remainder of his life to trying to identify the causative micronutrient. He got close — fingering tryptophan, which is actually a cause in a minority of cases. And along the way, he discovered that yeast supplements would prevent the disease, which he had the Red Cross distribute in the great flood of 1927, and which prevented a great amount of pellagra until, of course, the supplements ran out. But he would die of renal cell carcinoma in 1929 at the age of 54. Just 8 years after his death, Conrad Elvehjem would discover niacin and the cure for pellagra, leading to widespread fortification of foods with niacin in the 1940s, and the essential elimination of this plague — to the point that I imagine most American doctors have never seen a case.


So I mentioned the resistance to Goldberger's ideas in the South. But certainly not everyone felt that way. Prisoners across the south, blacks and whites alike, were aware of Goldberger's experiments and the controversy they had caused. And they understood that the diet being served in the prisons was making them sick. There was an old call-and-response work song that grew out of this — "I don't want no cornbread, peas, and black molasses." And this would later be immortalized as a famous blues song. This is the version recorded by Sonny Terry & Brownie McGhee.


Here's what gets me about this story — why do we not remember Joseph Goldberger? He's basically the American John Snow. A quick refresher — he's not from Westeros, because that's what Google just tried to autocorrect me into writing, but probably the most famous epidemiologist of all time. In the mid-19th century, he mapped out a cholera outbreak in London and proved that the disease was waterborne, and against the objections of the local government and board of health, he actually removed the handle from the Broad Street Pump and stopped the outbreak. Like Snow, Goldberger used what were then innovative scientific investigations to prove his nutritional connection. He ate human feces and had his wife injected with blood to try and prove it. And when people didn't believe him, he set about finding a cure anyway. And by all accounts, he was a stand-up guy, worthy of emulation. So why has he faded into obscurity? Probably a couple of reasons. Because he died at such a young age, he didn't get to see his research through, and the Nobel, which he definitely deserved, is not awarded posthumously. And let's not forget that this was a period of growing anti-Semitism on both sides of the Atlantic. But I think Goldberger pushed up against an uncomfortable reality in America that still exists today — the effects of poverty on health. We like to believe that we're an equitable society. And infectious diseases appear in the public imagination to be equitable; tuberculosis kills the rich along with the poor. But Goldberger argued that poverty should be considered in the conquest of human disease. We can't quash disease until we even out the social determinants of health. And even today, that's still a bridge too far for some.


Okay, that's it for the show! But wait, it's time for a #AdamAnswers. #AdamAnswers is the segment of the show where I answer your burning questions about medicine, no matter what they are. And today we have another great one from Dr. Tony Breu, who asks, "As an internist, I'd love to know when 'internal' medicine became its own thing. What was originally meant by 'internal'?"


As an internist myself, I strongly sympathize. I've actually always hated the terms "internal medicine" and "internist" in general. "What type of doctor is that?" is a common question I get when I describe what I do, plus "internist" sounds very similar to "intern."


So like many things in American medicine, the phrase "internal medicine" comes from German. A quick refresher — the medical world at the end of the nineteenth century was dominated by the so-called "German School," just as the beginning of the nineteenth had been dominated by the French. Thousands of American doctors went to Germany from roughly 1860 to the First World War, and brought these German methods back with them. But what exactly did they bring back? The major innovation of the French school was a combination of hospital teaching, bedside diagnosis, and clinical pathology. The German school took this model but merged it with the university system: medicine and basic science were joined together. It was a potent combination, and laboratory medicine, radiology, microbiology, and clinical pharmacology all rapidly developed as further tools for the modern doctor. Koch, Roentgen, Bayer — all come from this period.


The phrase innere Medizin — literally internal medicine — was in common use by the 1880s; the first Congress of Internal Medicine was held in Wiesbaden in 1882. The traditional explanation is that the word "internal" comes from the development of dermatology, as medicine started to splinter into specialties. The classical divide in medicine, going back millennia, was between physicians and surgeons. Physicians were educated, usually in the classics, and would treat disease with medicine and diet. Surgeons came from uneducated barbers, and their practice was seen as a trade. From the 16th century onward, both fields had progressively professionalized. In the nineteenth century, the massive generation of medical knowledge prompted further splintering. Dr. Bloomfield, who wrote the most-quoted article on the subject in JAMA in 1959, points to the development of the concept of "internal disease" in Germany, which explicitly excluded the surgical fields, but also dermatology. From here, it was a hop, skip, and a jump to "internal medicine."


So when the German research university was imported wholesale to the United States, starting with Johns Hopkins in 1876, the phrase "internal medicine" was imported along with it.


The only problem — if you actually look at what those German doctors were writing, they clearly had a much broader conception of internal medicine than simply "everything but dermatology." In the 1901 textbook Lehrbuch der Inneren Medizin, Dr. Joseph von Mering defines internal medicine as "the accretion of the most various experimental disciplines [which has] reached such a scope that one man can no longer be fully authoritative in all its branches." And an internist is a physician who "is competent to sift critically the endless accumulation of detail so that the best can be offered to students and general practitioners."


Early American internists had the same conception of internal medicine. The first recorded example of the phrase "internal medicine" in the United States is, unsurprisingly, from William Osler, in an 1895 address to the American Academy of Physicians. He says, "The time has come when able young men should be encouraged to devote themselves to internal medicine as a specialty. Content to labor and wait during the first 10 or 15 years of professional life, with pathology as the solid basis of development, such men will pass to the wards through the laboratories thoroughly equipped to study the many problems of internal medicine."


He wrote an essay further describing his views. The internist is a specialist in the data of medicine — the laboratory, the gross anatomy lab, clinical studies, pharmacology, epidemiological research — and a consultant to the generalist, who was still treating the vast majority of disease. Or, as I've always called them, the "nerds" of medicine. There's no evidence that he considered dermatology outside this scope — in fact, his use of "internal," like that of the German doctors, appears to be more philosophical: the internist gets "inside" a clinical problem using scientific data.


The American College of Physicians was formed in 1915 as the professional society for internists, which gives you an idea of when the idea really started to take hold in the United States. But the confusing title of "internal medicine" was nowhere near a done deal in those early days, and there were several attempts to rename the specialty. In fact, the first name of the ACP's journal was the Annals of Clinical Medicine; it wasn't changed to the Annals of Internal Medicine until 1927.


And just as the 19th century had the French and German schools, the 20th was largely defined by the growing influence of American medicine. And we basically exported the field of internal medicine to the rest of the world, which is why you can study neike in Taipei or medicina interna in Santiago. 


In any event, I’m still proud to call myself an internist, even if I have to explain to other people what that actually is. Ok Dr. Breu — I hope that sort of answers your question. For listeners who are keeping score, that’s another notch in Dr. Breu’s belt for questions submitted to #AdamAnswers. He’s now tied with Dr. David Serota — Twitter handle @Serotavirus, just because that’s amazing — with most #AdamAnswers submitted ever. So for my other listeners who aren’t Drs. Breu or Serota — get submitting! If you have any questions you want answered, no matter how profound or silly, Tweet them to me @AdamRodmanMD!


And that's really it for the show! Let me know what you thought of the episode. I went pretty deep into some of these primary sources, to the point that I think I can safely list my number one hobby as reading through conference transcripts from the early 20th century. But it's amazing to actually see the debate take shape over basically a two-decade period. It makes me think of some of the MedTwitter debates that I stalk — like the one about the ORBITA trial, on stenting in stable cardiac disease, or the debates about cancer screening. Some of them get pretty heated. One hundred years from now, some nerdy future podcaster — or whatever medium is delivered directly into your cortex — is going to have a field day poring over Tweetorials.


Bedside Rounds has changed a lot over the years — I'm essentially putting out original research every month. But I do it for you guys — well, you guys, and the fact that I'm a huge nerd and love this stuff. If you like the show, please write me a review. You can listen to all previous episodes on the website at www.bedside-rounds.org, or on iTunes, Stitcher, Spotify, or the podcast retrieval method of your choice. I'm on Facebook at /BedsideRounds and Twitter as @AdamRodmanMD, where it seems like I mostly retweet Tweetorials and nerd out about medical history.


All of the sources are in the show notes.


And finally, while I am actually a doctor and I don’t just play one on the internet, this podcast is intended to be purely for entertainment and informational purposes, and should not be construed as medical advice. If you have any medical concerns, please see your primary care provider.