How Tuberculosis Inspired the 19th-Century New England Vampire Panic

Jenkins via Flickr // CC BY-NC-ND 2.0

On March 19, 1892, the Evening Herald of Shenandoah, Pennsylvania, printed a story describing what it called a “horrible superstition.”

A young man named Edwin Brown in Exeter, Rhode Island had been suffering from illness for some time. His mother and eldest sister had died from the same disease, then called “consumption” because of the way its victims wasted away (and now known as tuberculosis). Edwin traveled from Exeter to Colorado Springs—a popular destination due to its dry climate and specialized disease treatment centers—but his health did not improve. While he was away, his sister Mercy also became ill and quickly died.

When Edwin returned home after Mercy’s death, his health declined. His desperate father turned to an old folk belief: When members of the same family waste away from consumption, it could be because one of the deceased was draining the life force of their living relatives.

With a doctor and some neighbors in tow, Edwin and Mercy’s father exhumed the bodies of each family member who had died of the illness. He found only skeletons in the graves of his wife and eldest daughter. A doctor then examined Mercy’s remains, which had been interred for nine weeks and showed a relatively normal state of decay.

However, liquid blood was found in Mercy’s heart and liver. Although the doctor said this was fairly standard and not a sign of the supernatural, the organs were removed and cremated before Mercy was reburied, just in case. But the exhumation and cremation did nothing for Edwin Brown’s disease: He died two months later.

Newspapers were quick to connect these folk rituals with vampire legends, especially those of Eastern Europe. Vampire stories from all over were printed on the front pages of 19th-century New England newspapers, describing similar rituals in distant locations. Like the New Englanders, people in remote parts of Europe were exhuming bodies when people fell ill, burning them or driving stakes into those that seemed too full of life.

But the New Englanders who took part in these rituals didn’t necessarily believe there was a supernatural cause of their family members’ illness, as author and folklorist Michael E. Bell writes in his book Food for the Dead. Although some may have harbored beliefs about vampires, many were simply desperate, and unwilling to leave untried any remedy that might save the lives of those they loved—even an outlandish or gruesome method.

Tuberculosis was entrenched in the Americas even before the United States existed as a country. President George Washington himself likely fought the disease after contracting it from his brother—ironically, on a trip taken to Barbados in an attempt to treat Lawrence Washington’s illness, according to medical historian Howard Markel of the University of Michigan.

Washington wasn’t alone. Other notable American sufferers of tuberculosis included James Monroe, Ralph Waldo Emerson, Henry David Thoreau, Washington Irving, John “Doc” Holliday, and Helen Hunt Jackson.

In 1786, when health officials first began recording mortality rates connected to the deadly infection, Massachusetts alone recorded 300 consumption deaths for every 100,000 residents. Between that year and 1800, tuberculosis killed 2 percent of New England’s population. In many cases, living in the same home was enough for the disease to spread throughout an entire family. It was estimated that anywhere from 70 to 90 percent of the U.S. population had latent or active tuberculosis infections.

Today, most people understand that tuberculosis is spread through the air, by breathing in bacteria coughed up by people with active infections in their lungs or throats. There are vaccines, though they’re rarely used in the U.S., and treatments for those who contract active tuberculosis infections.

In the 1800s, however, germ theory was only just beginning to gain supporters among the medical community. Doctors were still arguing over the causes of tuberculosis in 1895, and treatment mainly consisted of leaving large cities like New York and Boston, where the disease ran rampant, for places like Pasadena, California and Colorado Springs, where the climate was supposed to help ease the symptoms. Until the rise of the sanatoria movement (basically, rest-oriented treatment centers) at the end of the 19th century, few medical treatments worked. Even sanatoria only helped some patients.

As tuberculosis spread from the cities out into the countryside, people didn’t know what caused it or how to stop it. In some New England towns, such as Lynn, Massachusetts, it was the leading cause of death, Bell says. Entire families were wiped out, and there didn’t seem to be any rhyme or reason to who caught the illness.

It was not a pleasant way to die. Symptoms included wasting, night sweats, fatigue, and a persistent cough that sometimes produced white phlegm or foamy blood. Occasionally, the cough turned into hemorrhaging. Those who caught it could not know if they would eventually recover, painfully waste away over the course of years, or die in a matter of months from the “galloping” form of the disease. If they did recover, there was always the fear that the illness would return.

“Cholera, plague, smallpox, yellow fever, influenza, and measles were fast-burning epidemics that appeared, killed, and then went dormant as immunities kicked in,” Bell says. Tuberculosis did not. It was an unrelenting fact of life in the 1800s. With no other explanations, people turned to the supernatural to understand the epidemic, and to offer hope of a cure.

Enter the vampire.

The vampire legend may have made its way into New England as an early version of the unproven “miracle cure” for tuberculosis. In 1784, a newspaper published a letter about a foreign “quack doctor” who had been spreading an unusual cure for consumption. According to the letter, when a third member of Isaac Johnson’s family in Willington, Connecticut, contracted the disease, the quack doctor advised him to dig up two family members who had already died of the illness. The bodies were inspected for any sprouting plants, and the letter writer—who said he was an eyewitness—reported that sorrel was found. The doctor advised the Johnson family to burn the sorrel along with the vital organs to rid the family of the sickness, an idea the letter writer called an imposture.

But those who had lost multiple loved ones, and faced losing more, were willing to try anyway.

Anthropologist George R. Stetson later connected the New England beliefs to similar rituals from Russia, Hungary, Prussia, and Serbia, as well as other parts of Europe, ancient Greece, and the Caribbean. In his 1896 article “The Animistic Vampire in New England,” Stetson described the case of one unnamed mason who credited his own health to the ritual. The man had two brothers who had contracted tuberculosis. When the first died, a respected member of the community suggested the family burn his vital organs to save the second brother. The second brother protested and the ritual wasn’t performed; he continued to sicken and eventually died. When the mason himself got sick, the second brother was exhumed, and “living blood” was found. A cremation was held (it’s unclear whether just the blood or the full body was burned), and the mason soon recovered.

New England vampires were not the supernatural revenants of novels like Dracula, who rose from the dead as walking corpses to drain blood from the living, Bell told mental_floss. Instead, they were believed to drain the life force of their loved ones through some spiritual connection that continued even after death.

“The ‘vampires’ in the New England tradition were not the reanimated corpses, bodily leaving their graves to suck the blood of living relatives, that we know from European folklore, filtered through Gothic literature and popular culture,” Bell says. “New England’s ‘microbes with fangs’ (as one medical practitioner recently termed them) were, however, just as fearful and deadly as the fictional Dracula.”

If a body was exhumed and liquid blood could be found, or it seemed to be far better preserved than expected, one of a number of rituals was performed: burning the corpse (and sometimes inhaling the smoke); rearranging the corpse or turning it upside down and reburying it; or burning vital organs like the heart and liver. Occasionally, Bell says, the ashes were consumed by family members afflicted with tuberculosis.

One of the more remarkable cases Bell has discovered is that of the Rev. Justus Forward and his daughter Mercy (no relation to Mercy Brown). In 1788, the minister had already lost three daughters to consumption; Mercy and another sister were fighting the illness. As Mercy Forward traveled to a neighboring town with her father one day, she began to hemorrhage.

Forward was reluctant to try opening the graves of his deceased family members, but allowed himself to be convinced, willing to do anything to save his daughter. His mother-in-law’s grave was opened first, without result. However, he soon found a grave that fit the requirements. Bell relays a portion of a letter written by Forward:

“Since I had begun to search, I concluded to search further ... and this morning opened the grave of my daughter ... who had died—the last of my three daughters—almost six years ago ... On opening the body, the lungs were not dissolved, but had blood in them, though not fresh, but clotted. The lungs did not appear as we would suppose they would in a body just dead, but far nearer a state of soundness than could be expected. The liver, I am told, was as sound as the lungs. We put the lungs and liver in a separate box, and buried it in the same grave, ten inches or a foot, above the coffin.”

The act didn’t save Mercy, Bell says, but Forward’s other children seemed to recover. And the willingness of Forward and his family to attempt the ritual apparently helped to relieve fear in his community, Bell notes: “He ultimately authorized a ritual that, in effect, reestablished social stability, essentially proclaiming that the dead were, indeed, dead once again.”

There were other cases, too:

At the end of the 19th century, Daniel Ransom wrote in his memoirs about his brother Frederick, a Dartmouth College student who died of tuberculosis in 1817. The boys’ father worried that Frederick would feed on the rest of the family, and had Frederick exhumed and his heart burned at a blacksmith’s forge. The cure didn’t work, however, and Daniel Ransom lost his mother and three siblings over the next several years.

In the 1850s, Henry Ray of Jewett City, Connecticut dug up the bodies of his brothers and had them burned when he, too, contracted tuberculosis. In a nearby case, a grave belonging to someone known only as “J.B.” was broken into—possibly by family members or friends, who often conducted the rituals—and the skeletal remains were rearranged into a skull and crossbones shape. Researchers speculate that it might have been done to stop J.B. from becoming a vampire, or because he was blamed for a living person’s illness.

Henry David Thoreau wrote of another case in his journal in September 1859: “The savage in man is never quite eradicated. I have just read of a family in Vermont—who, several of its members having died of consumption, just burned the lungs & heart & liver of the last deceased, in order to prevent any more from having it.”

These tales found their way into newspapers throughout the U.S., along with European tales of vampires, werewolves, and witches, reflecting the late 19th century’s fascination with the afterlife and the supernatural. Such stories from New England may even have inspired Bram Stoker’s story of Dracula.

The rituals continued until Mercy Brown’s exhumation in 1892, 10 years after Robert Koch discovered the bacterium that causes tuberculosis. Eventually, germ theory began to take hold, and contagion was better understood. Infection rates began to go down as hygiene and nutrition improved.

But until then, people living under the “gnawing sense of hopelessness” the disease brought were often willing to cling to any chance for themselves and their loved ones, Bell says:

“In short, for the pragmatic Yankee, the bottom line was, ‘What do I have to do to stop this scourge?’ The ritual was a folk remedy rather than an elaborated detailed belief system.”

Now Ear This: A New App Can Detect a Child's Ear Infection

iStock.com/Techin24

Generally speaking, using an internet connection to diagnose a medical condition is rarely recommended. But handheld devices that guide decisions and suggest treatments are steadily overcoming that skepticism in health care. The most recent example is an app that promises to identify one of the key symptoms of ear infections in kids.

The Associated Press reports that researchers at the University of Washington are close to finalizing an app that would allow a parent to assess whether or not their child has an ear infection using their phone, some paper, and some soft noises. A small piece of paper is folded into a funnel shape and inserted into the ear canal to focus the app's sounds (which resemble bird chirps) toward the child’s ear. The app measures sound waves bouncing off the eardrum. If pus or fluid is present, the sound waves will be altered, indicating a possible infection. The parent would then receive a text from the app notifying them of the presence of buildup in the middle ear.
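The AP description gives only the broad strokes of the acoustic approach: play a known chirp, record what bounces back off the eardrum, and check whether the reflection looks altered. The short Python sketch below illustrates that general idea only; the function names, frequency range, threshold, and synthetic “recordings” are all assumptions made for illustration and are not the University of Washington team’s actual code.

```python
# Hypothetical illustration of the acoustic idea behind the app: emit a short
# chirp, record the echo off the eardrum, and compare the reflected energy to
# the outgoing signal. A fluid-filled middle ear tends to reflect more sound,
# so a simple spectral ratio can serve as a crude "possible fluid" flag.
# All parameters and thresholds here are invented for illustration only.

import numpy as np
from scipy.signal import chirp

FS = 44_100           # sample rate (Hz), typical for phone audio
DURATION = 0.15       # chirp length in seconds
FREQS = (1_800, 4_400)  # sweep band, roughly in the "bird chirp" range

def make_chirp() -> np.ndarray:
    """Generate the outgoing test tone (a short linear frequency sweep)."""
    t = np.linspace(0, DURATION, int(FS * DURATION), endpoint=False)
    return chirp(t, f0=FREQS[0], f1=FREQS[1], t1=DURATION, method="linear")

def reflection_ratio(sent: np.ndarray, echo: np.ndarray) -> float:
    """Ratio of reflected to emitted magnitude within the swept band."""
    # FFT bin k corresponds to frequency k / DURATION, so bin = f * DURATION.
    band = slice(int(FREQS[0] * DURATION), int(FREQS[1] * DURATION))
    sent_power = np.abs(np.fft.rfft(sent))[band].sum()
    echo_power = np.abs(np.fft.rfft(echo))[band].sum()
    return float(echo_power / sent_power)

def looks_like_fluid(echo: np.ndarray, threshold: float = 0.6) -> bool:
    """Flag a possible effusion when too much energy is reflected back."""
    return reflection_ratio(make_chirp(), echo) > threshold

if __name__ == "__main__":
    outgoing = make_chirp()
    # Stand-ins for recordings: a healthy ear absorbs more of the chirp,
    # while a fluid-filled ear bounces more of it straight back.
    healthy_echo = 0.3 * outgoing + 0.01 * np.random.randn(outgoing.size)
    fluid_echo = 0.8 * outgoing + 0.01 * np.random.randn(outgoing.size)
    print("healthy ear -> fluid?", looks_like_fluid(healthy_echo))      # False
    print("fluid-filled ear -> fluid?", looks_like_fluid(fluid_echo))   # True
```

A real implementation would also have to handle probe placement, ear canal geometry, and ambient noise, which is why the clinical comparison against surgical findings described below is the meaningful test of accuracy.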

The University of Washington tested the efficacy of the app by evaluating roughly 50 patients scheduled to undergo ear surgery at Seattle Children’s Hospital. The app was able to identify fluid in patients' ears about 85 percent of the time. That’s roughly on par with traditional exams, which involve visual inspection as well as specialized acoustic devices.

While the system looks promising, not all cases of fluid in the ear are the result of infections or require medical attention. Parents would need to evaluate other symptoms, such as fever, if they intend to use the app to decide whether or not to seek medical attention. It may prove most beneficial in children with persistent fluid accumulation, a condition that needs to be monitored over the course of months when deciding whether a drain tube needs to be placed. Checking for fluid at home would save both time and money compared to repeated visits to a physician.

The app does not yet have Food and Drug Administration (FDA) approval and there is no timetable for when it might be commercially available. If it passes muster, it would join a number of FDA-approved “smart” medical diagnostic tools, including the AliveCor KardiaBand for the Apple Watch, which conducts EKG monitoring for heart irregularities.

[h/t WGRZ]

Does Having Allergies Mean That You Have A Decreased Immunity?

iStock.com/PeopleImages

Tirumalai Kamala:

No, allergy isn't a sign of decreased immunity. It is a specific type of immune dysregulation. Autoimmunity, inflammatory disorders such as IBS and IBD, and even cancer are examples of other types of immune dysregulation.

The core issue in allergy is the quality and target of immune responses, not their strength. Let's see how.

—Allergens (substances known to induce allergy) are common. Some, such as house dust mite and pollen, are even ubiquitous.
—Everyone is exposed to allergens, yet only a relative handful are clinically diagnosed with allergy.
—Thus allergens don't inherently trigger allergy. They can, but only in those predisposed to allergy, not in everyone.
—Each allergic person mounts a pathological immune response to only one or a few structurally related allergens, while the non-allergic don't.
—Those diagnosed with allergy aren't necessarily more susceptible to other diseases.

If the immune response of each allergic person is selectively distorted when responding to specific allergens, what makes someone allergic? Obviously a mix of genetic and environmental factors.

[The] thing is allergy prevalence has spiked in recent decades, especially in developed countries, [which is] too short a time period for purely genetic mutation-based changes to be the sole cause, since that would take multiple generations to have such a population-wide effect. That tilts the balance towards environmental change, but what specifically?

Starting in the 1960s, epidemiologists began reporting a link between infections and allergy—[the] more infections in childhood, [the] less the allergy risk [this is called the hygiene hypothesis]. Back then, microbiota weren't even a consideration, but now we have learned better, so the hygiene hypothesis has expanded to include them.

Essentially, the idea is that the current Western style of living that rapidly developed over the 20th century fundamentally and dramatically reduced lifetime, and, crucially, early life exposure to environmental microorganisms, many of which would have normally become part of an individual's gut microbiota after they were born.

How could gut microbiota composition changes lead to selective allergies in specific individuals? Genetic predisposition should be taken as a given. However, natural history suggests that such predisposition transitioned to a full-fledged clinical condition much more rarely in times past.

Let's briefly consider how that equation might have fundamentally changed in recent times. Consider indoor sanitation, piped chlorinated water, C-sections, milk formula, ultra-processed foods, lack of regular contact with farm animals (as a surrogate for nature) and profligate, ubiquitous, even excessive use of antimicrobial products such as antibiotics, to name just a few important factors.

Though some of these were beneficial in their own way, epidemiological data now suggests that such innovations in living conditions also disrupted the intimate association with the natural world that had been the norm for human societies since time immemorial. In the process such dramatic changes appear to have profoundly reduced human gut microbiota diversity among many, mostly in developed countries.

Unbeknownst to us, an epidemic of absence*, as Moises Velasquez-Manoff evocatively puts it, has thus been invisibly taking place across many human societies over the 20th century in lock-step with specific changes in living standards.

Such sudden and profound reduction in gut microbiota diversity thus emerges as the trigger that flips the normally hidden predisposition in some into clinically overt allergy. Actual mechanics of the process remain the subject of active research.

We (my colleague and I) propose a novel predictive mechanism for how disruption of regulatory T cell** function serves as the decisive and non-negotiable link between loss of specific microbiota and inflammatory disorders such as allergies. Time (and supporting data) will tell if we are right.

* An Epidemic of Absence: A New Way of Understanding Allergies and Autoimmune Diseases, Moises Velasquez-Manoff

** A small, indispensable subset of CD4+ T cells.

This post originally appeared on Quora.
