Why Does Humidity Make Us Feel Hotter?

Tomwang112/iStock via Getty Images

With temperatures spiking around the country, we thought it might be a good time to answer some questions about the heat index—and why humidity makes us feel hotter.

Why does humidity make us feel hotter?

To answer that question, we need to talk about getting sweaty.

As you probably remember from your high school biology class, one of the ways our bodies cool themselves is by sweating. The sweat then evaporates from our skin, and it carries heat away from the body as it leaves.

Humidity throws a wrench in that system of evaporative cooling, though. As relative humidity increases, the evaporation of sweat from our skin slows down. Instead, the sweat just drips off of us, which leaves us with all of the stinkiness and none of the cooling effect. Thus, when the humidity spikes, our bodies effectively lose a key tool that could normally be used to cool us down.

What's relative about relative humidity?

We all know that humidity refers to the amount of water vapor in the air. However, as the air's temperature changes, so does the amount of water vapor it can hold. (Warmer air can hold more water vapor.) Relative humidity compares the actual amount of water vapor in the air to the maximum amount the air could hold at that temperature.
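That ratio is easy to compute if you know the air's vapor pressure. A minimal sketch in Python, using the Magnus formula for saturation vapor pressure (one common approximation among several; the function names here are illustrative, not from any standard library):

```python
import math

def saturation_vapor_pressure(temp_c):
    """Approximate saturation vapor pressure (hPa) over water,
    via the Magnus formula -- a common empirical approximation."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def relative_humidity(actual_vapor_pressure_hpa, temp_c):
    """Relative humidity (%): actual vapor pressure as a share of
    the maximum the air can hold at this temperature."""
    return 100.0 * actual_vapor_pressure_hpa / saturation_vapor_pressure(temp_c)
```

Note that the same amount of water vapor yields a higher relative humidity in cooler air, which is why a muggy morning can feel drier by afternoon even though no moisture has left the air.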

Whose idea was the heat index?

While the notion of humidity making days feel warmer is painfully apparent to anyone who has ever been outside on a soupy day, our current system owes a big debt to Robert G. Steadman, an academic textile researcher. In a 1979 research paper called "An Assessment of Sultriness, Parts I and II," Steadman laid out the basic factors that would affect how hot a person felt under a given set of conditions, and meteorologists soon used his work to derive a simplified formula for calculating heat index.

The formula is long and cumbersome, but luckily it can be distilled into easy-to-read charts. Today your local meteorologist just needs to know the air temperature and the relative humidity, and the chart tells them the rest.
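Those charts are generated from a polynomial regression fit to Steadman's table, commonly attributed to Rothfusz (1990). A minimal sketch, assuming temperatures of roughly 80°F and above (the NWS applies extra adjustments at very low and very high humidity, omitted here for brevity):

```python
def heat_index_f(temp_f, rel_humidity):
    """Heat index (degrees F) via the Rothfusz regression, a
    polynomial fit to Steadman's table. Only meaningful for
    temp_f >= ~80; the NWS adds small corrections at humidity
    extremes that this sketch leaves out."""
    t, rh = temp_f, rel_humidity
    return (-42.379
            + 2.04901523 * t
            + 10.14333127 * rh
            - 0.22475541 * t * rh
            - 6.83783e-3 * t * t
            - 5.481717e-2 * rh * rh
            + 1.22874e-3 * t * t * rh
            + 8.5282e-4 * t * rh * rh
            - 1.99e-6 * t * t * rh * rh)
```

For example, 90°F air at 70 percent relative humidity yields a heat index of roughly 106°F, in line with the NWS chart.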

Is the heat index calculation the same for everyone?

Not quite, but it’s close. Steadman’s original research was founded on the idea of a “typical” person who was outdoors under a very precise set of conditions. Specifically, Steadman’s everyman was 5’7” tall, weighed 147 pounds, wore long pants and a short-sleeved shirt, and was walking at just over three miles per hour into a slight breeze in the shade. Any deviations from these conditions will affect how the heat/humidity combo feels to a certain person.

What difference does being in the shade make?

Quite a big one. All of the National Weather Service’s charts for calculating the heat index make the reasonable assumption that folks will look for shade when it’s oppressively hot and muggy out. Direct sunlight can add up to 15 degrees to the calculated heat index.

How does wind affect how dangerous the heat is?

Normally, when we think of wind on a hot day, we think of a nice, cooling breeze. That's the usual state of affairs, but when the weather is really, really hot—think high-90s hot—a dry wind actually heats us up. When it's that hot out, the wind whisks sweat away from our bodies before it can evaporate and cool us down. Thanks to this effect, what might have been a cool breeze acts more like a convection oven.

When should I start worrying about high heat index readings?

The National Weather Service has a handy four-tiered system to tell you how dire the heat situation is. At the most severe level, when the heat index tops 130, conditions are classified as "Extreme Danger": heat stroke is highly likely with continued exposure. Things get less scary as you move down the ladder, but even on "Danger" days, when the heat index ranges from 105 to 130, you probably don't want to be outside. According to the service, that's when prolonged exposure and/or physical activity make sunstroke, heat cramps, and heat exhaustion likely, while heat stroke is possible.
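The tiers map directly onto heat index ranges, so the lookup is trivial to automate. A small sketch; note that the two lower-tier thresholds ("Caution" starting at 80°F and "Extreme Caution" at 90°F) come from published NWS guidance rather than the description above:

```python
def nws_heat_category(heat_index_f):
    """Map a heat index (degrees F) to the NWS four-tier scale.
    The article spells out only the upper two tiers; the lower
    thresholds here follow general NWS guidance."""
    if heat_index_f > 130:
        return "Extreme Danger"
    if heat_index_f >= 105:
        return "Danger"
    if heat_index_f >= 90:
        return "Extreme Caution"
    if heat_index_f >= 80:
        return "Caution"
    return "No alert"
```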

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.

This article has been updated for 2019.

Why Do We Eat Candy on Halloween?

Jupiterimages/iStock via Getty Images

On October 31, hordes of children armed with Jack-o'-lantern-shaped buckets and pillow cases will take to the streets in search of sugar. Trick-or-treating for candy is synonymous with Halloween, but the tradition had to go through a centuries-long evolution to arrive at the place it is today. So how did the holiday become an opportunity for kids to get free sweets? You can blame pagans, Catholics, and candy companies.

Historians agree that a Celtic autumn festival called Samhain was the precursor to modern Halloween. Samhain was a time to celebrate the last harvest of the year and the approach of the winter season. It was also a festival for honoring the dead. One way the Celts may have appeased the spirits they believed still walked the Earth was by leaving treats on their doorsteps.

When Christianity spread to Ireland around the 5th century CE, church leaders rebranded many pagan holidays to fit their religion. November 1 became the "feasts of All Saints and All Souls," and the day before it was dubbed "All-Hallows'-Eve." The new holidays looked a lot different from the original Celtic festival, but many traditions stuck around, including the practice of honoring the dead with food. The food of choice for Christians became "soul cakes," small pastries usually baked with expensive ingredients and spices like currants and saffron.

Instead of being left outside for passing ghosts, soul cakes were distributed to beggars who went door-to-door promising to pray for the souls of the deceased in exchange for something to eat. Sometimes the beggars wore costumes to honor the saints—something pagans originally did to avoid being harassed by evil spirits. The ritual, known as souling, is believed to have planted the seeds for modern-day trick-or-treating.

Souling didn't survive the holiday's migration from Europe to the United States. In America, the first Halloween celebrations were a way to mark the end-of-year harvest season, and the food that was served mainly consisted of homemade seasonal treats like caramel apples and mixed nuts. There were no soul cakes—or candies, for that matter—to be found.

It wasn't until the 1950s that trick-or-treating gained popularity in the U.S. Following the Great Depression and World War II, the suburbs were booming, and people were looking for excuses to have fun and get to know their neighbors. The old practice of souling was resurrected and made into an excuse for kids to dress up in costumes and roam their neighborhoods. Common trick-or-treat offerings included nuts, coins, and homemade baked goods ("treats" that most kids would turn their noses up at today).

That changed when the candy companies got their hands on the holiday. They had already convinced consumers that they needed candy on Christmas and Easter, and they were looking for an equally lucrative opportunity to market candy in the fall. The new practice of trick-or-treating was almost too good to be true. Manufacturers downsized candies into smaller, bite-sized packages and began marketing them as treats for Halloween. Adults were grateful to have a convenient alternative to baking, kids loved the sweet treats, and the candy companies made billions.

Today, it's hard to imagine Halloween without Skittles, chocolate bars, and the perennial candy corn debates. But when you're digging through a bag or bowl of Halloween candy this October, remember that you could have been eating soul cakes instead.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.

What's the Difference Between Cement and Concrete?

Vladimir Kokorin/iStock via Getty Images

Picture yourself walking down a city block. The sidewalk you follow may be obscured by shuffling feet and discarded gum, but it’s clearly made from something hard, smooth, and gray. What may be less clear is the proper name for that material: Is it concrete or cement? Is there even a real difference between the two words?

Though they’re often used interchangeably, concrete and cement describe different yet related elements of the blocks, flooring, and walls that make up many everyday structures. In simple terms, concrete is the name of the gray, gritty building material used in construction, and cement is an ingredient used in concrete.

Cement is a dry powder mixture that looks much different from the wet stuff poured out of so-called cement trucks. It's made from minerals that have been crushed up and mixed together. Exactly what kind of minerals it's made from varies: Limestone and clay are commonly used today, but anything from seashells to volcanic ash is suitable. After the ingredients are mixed together the first time, they're fired in a kiln at 2642°F (1450°C) to form strong new compounds, then cooled, crushed, and combined again.

Cement
lior2/iStock via Getty Images

This mixture is useless on its own. Before it's ready to be used in construction projects, the cement must be mixed with water and an aggregate, such as sand, to form a moldable paste. This substance is known as concrete. It fills whatever mold it's poured into and quickly hardens into a solid, rock-like form, which is partly why it's become the most widely used building material on Earth.

So whether you’re etching your initials into a wet sidewalk slab, power-hosing your back patio, or admiring some Brutalist architecture, you’re dealing with concrete. But if you ever happen to be handling a chalky gray powder that hasn’t been mixed with water, cement is the correct label to use.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.
