How Tuberculosis Inspired the 19th-Century New England Vampire Panic

Jenkins via Flickr // CC BY-NC-ND 2.0

On March 19, 1892, the Evening Herald of Shenandoah, Pennsylvania printed a story describing what it called a “horrible superstition.”

A young man named Edwin Brown in Exeter, Rhode Island had been suffering from illness for some time. His mother and eldest sister had died from the same disease, then called “consumption” because of the way its victims wasted away (and now known as tuberculosis). Edwin traveled from Exeter to Colorado Springs—a popular destination due to its dry climate and specialized disease treatment centers—but his health did not improve. While he was away, his sister Mercy also became ill and quickly died.

When Edwin returned home after Mercy’s death, his health declined. His desperate father turned to an old folk belief: When members of the same family waste away from consumption, it could be because one of the deceased was draining the life force of their living relatives.

With a doctor and some neighbors in tow, Edwin and Mercy’s father exhumed the bodies of each family member who had died of the illness. Only skeletons remained in the graves of his wife and eldest daughter, but Mercy’s remains, which had been interred for nine weeks, appeared relatively normal in their decay.

However, liquid blood was found in Mercy’s heart and liver. Although the doctor said this was fairly standard and not a sign of the supernatural, the organs were removed and cremated before Mercy was reburied, just in case. But the exhumation and cremation did nothing for Edwin Brown’s disease: He died two months later.

Newspapers were quick to connect these folk rituals with vampire legends, especially those of Eastern Europe. Vampire stories from all over were printed on the front pages of 19th-century New England newspapers, describing similar rituals in distant locations. Like the New Englanders, people in remote parts of Europe were exhuming bodies when people fell ill, and burning or driving stakes through those that seemed too full of life.

But the New Englanders who took part in these rituals didn’t necessarily believe there was a supernatural cause of their family members’ illness, as author and folklorist Michael E. Bell writes in his book Food for the Dead. Although some may have harbored beliefs about vampires, many were simply desperate, and unwilling to leave untried any remedy that might save the lives of those they loved—even an outlandish or gruesome method.

Tuberculosis was entrenched in the Americas even before the United States existed as a country. President George Washington himself likely fought the disease after contracting it from his brother Lawrence—ironically, on a trip the two took to Barbados in an attempt to treat Lawrence’s illness, according to medical historian Howard Markel of the University of Michigan.

Washington wasn’t alone. Other notable American sufferers of tuberculosis included James Monroe, Ralph Waldo Emerson, Henry David Thoreau, Washington Irving, John “Doc” Holliday, and Helen Hunt Jackson.

In 1786, when health officials first began recording mortality rates connected to the deadly infection, Massachusetts alone recorded 300 consumption deaths for every 100,000 residents. Between that year and 1800, tuberculosis killed 2 percent of New England’s population. In many cases, living in the same home was enough for the disease to spread throughout an entire family. It was estimated that anywhere from 70 to 90 percent of the U.S. population had latent or active tuberculosis infections.

Today, most people understand that tuberculosis is spread through the air, by breathing in bacteria coughed up by people with active infections in their lungs or throats. There are vaccines, though they’re rarely used in the U.S., and treatments for those who contract active tuberculosis infections.

In the 1800s, however, germ theory was only just beginning to gain supporters among the medical community. Doctors were still arguing over the causes of tuberculosis in 1895, and treatment mainly consisted of leaving large cities like New York and Boston, where the disease ran rampant, for places like Pasadena, California and Colorado Springs, where the climate was supposed to help ease the symptoms. Until the rise of the sanatoria movement (basically, rest-oriented treatment centers) at the end of the 19th century, few medical treatments worked. Even sanatoria only helped some patients.

As tuberculosis spread from the cities out into the countryside, people didn’t know what caused it or how to stop it. In some New England towns, such as Lynn, Massachusetts, it was the leading cause of death, Bell says. Entire families were wiped out, and there didn’t seem to be any rhyme or reason to who caught the illness.

It was not a pleasant way to die. Symptoms included wasting, night sweats, fatigue, and a persistent cough that sometimes produced white phlegm or foamy blood. Occasionally, the cough turned into hemorrhaging. Those who caught it could not know if they would eventually recover, painfully waste away over the course of years, or die in a matter of months from the “galloping” form of the disease. If they did recover, there was always the fear that the illness would return.

“Cholera, plague, smallpox, yellow fever, influenza, and measles were fast-burning epidemics that appeared, killed, and then went dormant as immunities kicked in,” Bell says. Tuberculosis did not. It was an unrelenting fact of life in the 1800s. With no other explanations, people turned to the supernatural to understand the epidemic, and to offer hope of a cure.

Enter the vampire.

The vampire legend may have made its way into New England as an early version of the unproven “miracle cure” for tuberculosis. In 1784, a newspaper published a letter about a foreign “quack doctor” who had been spreading an unusual cure for consumption. According to the letter, when a third member of Isaac Johnson’s family in Willington, Connecticut contracted the disease, the quack doctor advised him to dig up two family members who had already died of the illness. The bodies were inspected for any sprouting plants, and the letter writer—who said he was an eyewitness—reported that sorrel was found. The doctor advised the Johnsons to burn the sorrel along with the vital organs to rid the family of the sickness, an idea the letter-writer called an imposture.

But those who had lost multiple loved ones, and faced losing more, were willing to try anyway.

Anthropologist George R. Stetson later connected the New England beliefs to similar rituals from Russia, Hungary, Prussia, and Serbia, as well as other parts of Europe, ancient Greece, and the Caribbean. In his 1896 article The Animistic Vampire in New England, Stetson described the case of one unnamed mason who credited his own health to the ritual. The man had two brothers who had contracted tuberculosis. When the first died, a respected member of the community suggested the family burn his vital organs to save the second brother. The second brother objected and the ritual wasn’t performed; he continued to sicken and eventually died. When the mason himself fell ill, the second brother was exhumed, and “living blood” was found. A cremation was held (it’s unclear whether just the blood or the full body was burned), and the mason soon recovered.

New England vampires were not the supernatural revenants of novels like Dracula, who rose from the dead as walking corpses to drain blood from the living, Bell told mental_floss. Instead, they were believed to drain the life force of their loved ones through some spiritual connection that continued even after death.

“The ‘vampires’ in the New England tradition were not the reanimated corpses, bodily leaving their graves to suck the blood of living relatives, that we know from European folklore, filtered through Gothic literature and popular culture,” Bell says. “New England’s ‘microbes with fangs’ (as one medical practitioner recently termed them) were, however, just as fearful and deadly as the fictional Dracula.”

If a body was exhumed and liquid blood could be found, or it seemed to be far better preserved than expected, one of a number of rituals was performed, including burning the corpse (and sometimes inhaling the smoke); rearranging the corpse or turning it upside down and reburying it; or burning vital organs like the heart and liver. Occasionally, Bell says, the ashes were consumed by family members afflicted with tuberculosis.

One of the more remarkable cases Bell has discovered is that of the Rev. Justus Forward and his daughter Mercy (no relation to Mercy Brown). In 1788, the minister had already lost three daughters to consumption; Mercy and another sister were fighting the illness. As Mercy Forward traveled to a neighboring town with her father one day, she began to hemorrhage.

Forward was reluctant to try opening the graves of his deceased family members, but allowed himself to be convinced, willing to do anything to save his daughter. His mother-in-law’s grave was opened first, without result. However, he soon found a grave that fit the requirements. Bell relays a portion of a letter written by Forward:

“Since I had begun to search, I concluded to search further ... and this morning opened the grave of my daughter ... who had died—the last of my three daughters—almost six years ago ... On opening the body, the lungs were not dissolved, but had blood in them, though not fresh, but clotted. The lungs did not appear as we would suppose they would in a body just dead, but far nearer a state of soundness than could be expected. The liver, I am told, was as sound as the lungs. We put the lungs and liver in a separate box, and buried it in the same grave, ten inches or a foot, above the coffin.”

The act didn’t save Mercy, Bell says, but Forward’s other children seemed to recover. And the willingness of Forward and his family to attempt the ritual helped to relieve fear in his community, Bell notes: “He ultimately authorized a ritual that, in effect, reestablished social stability, essentially proclaiming that the dead were, indeed, dead once again.”

There were other cases, too:

At the end of the 19th century, Daniel Ransom wrote in his memoirs about his brother Frederick, a Dartmouth College student who died of tuberculosis in 1817. The boys’ father worried that Frederick would feed on the rest of the family, and had Frederick exhumed and his heart burned at a blacksmith’s forge. The cure didn’t work, however, and Daniel Ransom lost his mother and three siblings over the next several years.

In the 1850s, Henry Ray of Jewett City, Connecticut dug up the bodies of his brothers and had them burned when he, too, contracted tuberculosis. In a nearby case, a grave belonging to someone known only as “J.B.” was broken into—possibly by family members or friends, who often conducted the rituals—and the skeletal remains were rearranged into a skull and crossbones shape. Researchers speculate that it might have been done to stop J.B. from becoming a vampire, or because he was blamed for a living person’s illness.

Henry David Thoreau wrote of another case in his journal in September 1859: “The savage in man is never quite eradicated. I have just read of a family in Vermont—who, several of its members having died of consumption, just burned the lungs & heart & liver of the last deceased, in order to prevent any more from having it.”

These tales found their way into newspapers throughout the U.S., along with European tales of vampires, werewolves, and witches, reflecting the late 19th century’s fascination with the afterlife and the supernatural. Such stories from New England may even have inspired Bram Stoker’s story of Dracula.

The rituals continued until Mercy Brown’s exhumation in 1892, 10 years after Robert Koch discovered the bacterium that causes tuberculosis. Eventually, germ theory began to take hold, and contagion was better understood. Infection rates began to go down as hygiene and nutrition improved.

But until then, people were often willing to cling to any chance for themselves and their loved ones, living as they did under the “gnawing sense of hopelessness” the disease brought, Bell says:

“In short, for the pragmatic Yankee, the bottom line was, ‘What do I have to do to stop this scourge?’ The ritual was a folk remedy rather than an elaborate, detailed belief system.”

The CDC Is Here to Ruin the Holidays By Reminding You Not to Eat Cookie Dough

iStock.com/YinYang

The holidays are upon us and, right on schedule, the Centers for Disease Control and Prevention (CDC) has arrived to crush one of the small joys the season has to offer. As The Takeout reports, the CDC issued a statement recently reminding us to abstain from eating raw cookie dough while baking, no matter how great the temptation may be.

Cookie dough, though delicious, is unfortunately unsafe to consume any time of year. The dough contains raw eggs that can potentially harbor salmonella, a type of bacteria that causes fever, diarrhea, and abdominal pain. And the risk of salmonella poisoning isn't the only reason to avoid raw dough: Uncooked flour hasn't been treated to kill germs, which means it may be carrying E. coli. The bacteria—which induce symptoms similar to those seen with salmonella exposure—can stay dormant in flour for months and become active when the flour is mixed with eggs, oil, and water. The only way to make sure your cookies are safe to eat is by giving them plenty of time to bake in the oven.

It's widely known that sampling raw cookie dough comes with health risks, but some of us need an extra reminder ahead of holiday cookie swap season.

"There are many special occasions through the year that are perfect to spend time with loved ones while preparing delicious baked foods in the kitchen," the CDC said in its statement. "When you prepare homemade cookie dough, cake mixes, or even bread, you may be tempted to taste a bite before it is fully cooked. But steer clear of this temptation."

Cookies are so appealing in their uncooked form that there are entire businesses built around cookie dough that's purportedly safe to eat. New York and London are both home to cookie dough cafes, and in 2014, a company that sells edible dough by the tub found success on Shark Tank. If you don't have access to safe-to-eat dough this holiday season, there are plenty of fully-baked cookie options out there to choose from.

[h/t The Takeout]

14 Facts About Celiac Disease

iStock.com/fcafotodigital

Going gluten-free may be a modern diet trend, but people have been suffering from celiac disease—a chronic condition characterized by gluten intolerance—for centuries. Patients with celiac are ill-equipped to digest products made from certain gluten-containing grains; wheat is the most common. In the short term this can cause gastrointestinal distress, and in the long term it can lead to complications associated with early death.

Celiac diagnoses are more common than ever, which also means awareness of how to live with the condition is at an all-time high. Here are some things you might not know about celiac disease symptoms and treatments.

1. Celiac is an autoimmune disease.

The bodies of people with celiac have a hostile reaction to gluten. When the protein moves through the digestive tract, the immune system responds by attacking the small intestine, causing inflammation that damages the lining of the organ. As this continues over time, the small intestine has trouble absorbing nutrients from other foods, which can lead to additional complications like anemia and osteoporosis.

2. You can get celiac disease from your parents.

Nearly all cases of celiac disease arise from certain variants of the genes HLA-DQA1 and HLA-DQB1. These genes help produce proteins in the body that allow the immune system to identify potentially dangerous foreign substances. Normally the immune system wouldn't label gliadin, a segment of the gluten protein, a threat, but due to mutations in these genes, the bodies of people with celiac treat gliadin as a hostile invader.

Because it's a genetic disorder, people with a first-degree relative (a sibling, parent, or child) with celiac have a 4 to 15 percent chance of having it themselves. And while almost all patients with celiac have these specific HLA-DQA1 and HLA-DQB1 variations, not everyone with the mutations will develop celiac. About 30 percent of the population has these gene variants, and only 3 percent of that group goes on to develop celiac disease.

3. Makeup might contribute to celiac disease symptoms.

People with celiac disease can’t properly process gluten, the protein naturally found in grains like wheat, rye, and barley. Patients have to follow strict dietary guidelines and avoid most bread, pasta, and cereal in order to manage their symptoms. But gluten isn’t limited to food products: It can also be found in some cosmetics. While makeup containing gluten causes no issues for many people with celiac, it can provoke rashes in others or lead to more problems if ingested. For those folks, gluten-free makeup is an option.

4. The name comes from 1st-century Greece.

A 1st-century Greek physician named Aretaeus of Cappadocia may have been the first person to describe celiac disease symptoms in writing [PDF]. He named it koiliakos after the Greek word koelia for abdomen, and he referred to people with the condition as coeliacs. In his description he wrote, “If the stomach be irretentive of the food and if it pass through undigested and crude, and nothing ascends into the body, we call such persons coeliacs.”

5. There are nearly 300 celiac disease symptoms.

Celiac disease may start in the gut, but it can be felt throughout the whole body. In children, the condition usually manifests as bloating, diarrhea, and abdominal discomfort, but as patients get older they start to experience more “non-classical” symptoms like anemia, arthritis, and fatigue. There are at least 281 symptoms associated with celiac disease, many of which overlap with other conditions and make celiac hard to diagnose. Other common symptoms of the disease include tooth discoloration, anxiety and depression, loss of fertility, and liver disorders. Celiac patients also have a greater chance of developing an additional autoimmune disorder, with the risk increasing the later in life the initial condition is diagnosed.

6. Some patients show no symptoms at all.

It’s not uncommon for celiac disease to be damaging a patient’s digestive tract while producing no apparent symptoms. This form of the condition, sometimes called asymptomatic or “silent” celiac disease, likely accounts for part of the large number of people with celiac who go undiagnosed. People who are at high risk for the disease (the children of celiac sufferers, for example), or who have related conditions like type 1 diabetes and Down syndrome (both conditions that put patients at a greater risk for developing new autoimmune diseases), are encouraged to get tested for it even if they aren’t showing any signs.

7. It’s not the same as wheat sensitivity.

Celiac is often confused with wheat sensitivity, a separate condition that shares many symptoms with celiac, including gastrointestinal issues, depression, and fatigue. It’s often called gluten sensitivity or gluten intolerance, but because doctors still aren’t sure if gluten is the cause, many refer to it as non-celiac wheat sensitivity. There’s no test for it, but patients are often treated with the same gluten-free diet that’s prescribed to celiac patients.

8. It's not a wheat allergy either.

Celiac disease is often associated with wheat because it's one of the more common products containing gluten. While it's true that people with celiac can't eat wheat, the condition isn't a wheat allergy. Rather than reacting to the wheat, patients react to a specific protein that's found in the grain as well as others.

9. It can develop at any age.

Just because you don’t have celiac now doesn’t mean you’re in the clear for life: The disease can develop at any age, even in people who have tested negative for it previously. There are, however, two stages of life when symptoms are most likely to appear: early childhood (8 to 12 months) and middle adulthood (ages 40 to 60). People already genetically predisposed to celiac become more susceptible to it when the composition of their intestinal bacteria changes as they get older, whether as a result of infection, surgery, antibiotics, or stress.

10. Not all grains are off-limits.

A gluten-free diet isn’t necessarily a grain-free diet. While it’s true that the popular grains wheat, barley, and rye contain gluten, there are plenty of grains and seeds that don’t and are safe for people with celiac to eat. These include quinoa, millet, amaranth, buckwheat, sorghum, and rice. Oats are also naturally gluten-free, but they're often contaminated with gluten during processing, so consumers with celiac should be cautious when buying them.

11. Celiac disease can be detected with a blood test.

Screenings for celiac disease used to be an involved process, with doctors monitoring patients’ reactions to their gluten-free diet over time. Today all it takes is a simple test to determine whether someone has celiac. People with the condition will have anti-tissue transglutaminase antibodies in their bloodstream. If a blood test confirms the presence of these proteins in a patient, doctors will then take a biopsy of their intestine to confirm the root cause.

12. The gluten-free diet doesn’t work for all patients.

Avoiding gluten is the most effective way to manage celiac disease, but the treatment doesn’t work 100 percent of the time. In up to a fifth of patients, the damaged intestinal lining does not recover even a year after switching to a gluten-free diet. Most cases of non-responsive celiac disease can be explained by people not following the diet closely enough, or by having other conditions like irritable bowel syndrome, lactose intolerance, or small intestine bacterial overgrowth that impede recovery. Just a small fraction of celiac disease sufferers don’t respond to a strict gluten-free diet and have no related conditions. These patients are usually prescribed steroids and immunosuppressants as alternative treatments.

13. If you don’t have celiac, gluten probably won’t hurt you.

The gluten-free diet trend has exploded in popularity in recent years, and most people who follow it have no medical reason to do so. Going gluten-free has been purported to do everything from helping you lose weight to treating autism—but according to doctors, there’s no science behind these claims. Avoiding gluten may help some people feel better and more energetic because it forces them to cut heavily processed junk foods out of their diet. In such cases it’s the sugar and carbs that are making people feel sluggish—not the gluten protein. If you don’t have celiac or a gluten sensitivity, most experts recommend saving yourself the trouble by eating healthier in general rather than abstaining from gluten.

14. The numbers are growing.

A 2009 study found that four times as many people have celiac today as they did in the 1950s, and the spike can’t be explained by increased awareness alone. Researchers tested blood samples collected at the Warren Air Force Base between 1948 and 1954 and compared them to fresh samples from people living in one Minnesota county. The results supported the theory that celiac has become more prevalent in the last half-century. While experts aren’t exactly sure why the condition is more common today, it may have something to do with changes in how wheat is handled or the spread of gluten into medications and processed foods.
