Did William Henry Harrison Really Die of Pneumonia?

James Lambdin, The White House Historical Association, Public Domain, Wikimedia Commons

Whether you learned it in school or through a jaunty musical number on The Simpsons, the sad tale of William Henry Harrison is one of the most unusual in American history. Before being elected the ninth President of the United States in 1840, Harrison was known as a military hero who repelled an attack by a Native American confederacy in 1811 in what became known as the Battle of Tippecanoe. His heroics extended into the War of 1812, when he recovered Detroit from the British and won the Battle of the Thames.

Military fame has often paved a road into politics, especially in the 19th century. Harrison was elected a U.S. senator from Ohio and eventually became president after defeating incumbent Martin Van Buren in 1840. At 67, Harrison took office as the oldest president yet elected, a record that stood until Ronald Reagan's election in 1980 at age 69. Despite the cold, rainy weather in Washington, D.C. on inauguration day, Harrison stood before the crowd without an overcoat, hat, or gloves and delivered an 8445-word speech that lasted almost two hours. Three weeks later, Harrison complained of fatigue and a cold, which developed into what his doctors called pneumonia. On April 4, 1841, exactly one month after taking office, Harrison was dead.

The historical narrative virtually wrote itself: Harrison, improperly dressed for the weather, caught pneumonia and went down in history as a cautionary tale (or a punch line) with the shortest presidency on record. But was it really pneumonia that killed him? Harrison's own doctor, Thomas Miller, was skeptical. He wrote:

“The disease was not viewed as a case of pure pneumonia; but as this was the most palpable affection, the term pneumonia afforded a succinct and intelligible answer to the innumerable questions as to the nature of the attack.”

While revisiting the case a few years ago, writer Jane McHugh and Dr. Philip A. Mackowiak of the University of Maryland School of Medicine arrived at a new diagnosis after examining the evidence through the lens of modern medicine: enteric fever, also known as typhoid fever. They detailed their findings in the journal Clinical Infectious Diseases and in The New York Times.

Before 1850, Washington D.C.'s sewage was dumped in a marsh just seven blocks upstream from the executive mansion's water supply. McHugh and Mackowiak hypothesize that Harrison was exposed to bacteria—namely Salmonella typhi or S. paratyphi—which could cause enteric fever. Harrison also apparently had a history of severe indigestion, which could have made him more susceptible to such intestinal distress. While treating Harrison, Miller also administered opium and enemas, both of which would cause more harm than good to someone in Harrison's condition.

Harrison was not the only president of the era afflicted with a gastrointestinal illness while in office. According to McHugh and Mackowiak, both James K. Polk and Zachary Taylor suffered through severe gastroenteritis, and the duo theorizes it was the same enteric fever that struck Harrison. Polk recovered; Taylor died in office of his illness less than 10 years after Harrison's death.

Though Harrison's insistence on soldiering through his lengthy, bitterly cold inauguration while dressed in his finest spring wear wasn't a high point in presidential common sense, there's plenty of scientific evidence to suggest that it didn't contribute to the shortest presidency in American history.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.

Why Do We Eat Candy on Halloween?

Jupiterimages/iStock via Getty Images

On October 31, hordes of children armed with jack-o'-lantern-shaped buckets and pillowcases will take to the streets in search of sugar. Trick-or-treating for candy is synonymous with Halloween, but the tradition evolved over centuries to become what it is today. So how did the holiday become an opportunity for kids to get free sweets? You can blame pagans, Catholics, and candy companies.

Historians agree that a Celtic autumn festival called Samhain was the precursor to modern Halloween. Samhain was a time to celebrate the last harvest of the year and the approach of the winter season. It was also a festival for honoring the dead. One way the Celts may have appeased the spirits they believed still walked the Earth was by leaving treats on their doorsteps.

When Catholics arrived in Ireland around the 5th century CE, they rebranded many pagan holidays to fit their religion. November 1 became the “feasts of All Saints and All Souls,” and the day before it was dubbed “All-Hallows'-Eve.” The new holidays looked a lot different from the original Celtic festival, but many traditions stuck around, including the practice of honoring the dead with food. The food of choice for Christians became “soul cakes,” small pastries usually baked with expensive ingredients and spices like currants and saffron.

Instead of being left outside for passing ghosts, soul cakes were distributed to beggars who went door to door promising to pray for the souls of the deceased in exchange for something to eat. Sometimes the beggars wore costumes to honor the saints, something pagans had originally done to avoid being harassed by evil spirits. The ritual, known as souling, is believed to have planted the seeds of modern-day trick-or-treating.

Souling didn't survive the holiday's migration from Europe to the United States. In America, the first Halloween celebrations were a way to mark the end-of-year harvest season, and the food that was served mainly consisted of homemade seasonal treats like caramel apples and mixed nuts. There were no soul cakes—or candies, for that matter—to be found.

It wasn't until the 1950s that trick-or-treating gained popularity in the U.S. Following the Great Depression and World War II, the suburbs were booming, and people were looking for excuses to have fun and get to know their neighbors. The old practice of souling was resurrected and made into an excuse for kids to dress up in costumes and roam their neighborhoods. Common trick-or-treat offerings included nuts, coins, and homemade baked goods ("treats" that most kids would turn their noses up at today).

That changed when the candy companies got their hands on the holiday. They had already convinced consumers that they needed candy on Christmas and Easter, and they were looking for an equally lucrative opportunity to market candy in the fall. The new practice of trick-or-treating was almost too good to be true. Manufacturers downsized candies into smaller, bite-sized packages and began marketing them as treats for Halloween. Adults were grateful to have a convenient alternative to baking, kids loved the sweet treats, and the candy companies made billions.

Today, it's hard to imagine Halloween without Skittles, chocolate bars, and the perennial candy corn debates. But when you're digging through a bag or bowl of Halloween candy this October, remember that you could have been eating soul cakes instead.


What's the Difference Between Cement and Concrete?

Vladimir Kokorin/iStock via Getty Images

Picture yourself walking down a city block. The sidewalk you follow may be obscured by shuffling feet and discarded gum, but it’s clearly made from something hard, smooth, and gray. What may be less clear is the proper name for that material: Is it concrete or cement? Is there even a real difference between the two words?

Though they’re often used interchangeably, concrete and cement describe different yet related elements of the blocks, flooring, and walls that make up many everyday structures. In simple terms, concrete is the name of the gray, gritty building material used in construction, and cement is an ingredient used in concrete.

Cement is a dry powder mixture that looks very different from the wet material poured out of so-called cement trucks. It’s made from minerals that have been crushed up and mixed together. Exactly which minerals varies: Limestone and clay are commonly used today, but anything from seashells to volcanic ash is suitable. After the raw ingredients are first mixed together, they’re fired in a kiln at 2642°F (1450°C) to form strong new compounds, then cooled, crushed, and combined again.

Cement
lior2/iStock via Getty Images

This powder is useless on its own. Before it’s ready to be used in construction projects, the cement must be mixed with water and an aggregate, such as sand, to form a moldable paste. That substance is concrete. It fills whatever mold it’s poured into and quickly hardens into a solid, rock-like form, which is partly why it has become the most widely used building material on Earth.

So whether you’re etching your initials into a wet sidewalk slab, power-hosing your back patio, or admiring some Brutalist architecture, you’re dealing with concrete. But if you ever happen to be handling a chalky gray powder that hasn’t been mixed with water, cement is the correct label to use.

