Does Having Allergies Mean That You Have A Decreased Immunity?


Tirumalai Kamala:

No, allergy isn't a sign of decreased immunity. It is a specific type of immune dysregulation. Autoimmunity, inflammatory disorders such as IBS and IBD, and even cancer are examples of other types of immune dysregulation.

The core issue in allergy is the quality and target of immune responses, not their strength. Let's see how.

• Allergens—substances known to induce allergy—are common. Some, such as house dust mite and pollen, are even ubiquitous.
• Everyone is exposed to allergens, yet only a relative handful of people are clinically diagnosed with allergy.
• Thus allergens don't inherently trigger allergy. They can, but only in those predisposed to allergy, not in everyone.
• Each allergic person mounts pathological immune responses to only one or a few structurally related allergens, not to all of them, while the non-allergic don't.
• Those diagnosed with allergy aren't necessarily more susceptible to other diseases.

If the immune response of each allergic person is selectively distorted when responding to specific allergens, what makes someone allergic? Obviously a mix of genetic and environmental factors.

The thing is, allergy prevalence has spiked in recent decades, especially in developed countries. That is too short a time period for purely genetic, mutation-based changes to be the sole cause, since those would take multiple generations to have such a population-wide effect. That tilts the balance toward environmental change, but what specifically?

Starting in the 1960s, epidemiologists began reporting a link between infections and allergy: the more infections in childhood, the lower the allergy risk. This is called the hygiene hypothesis. Back then, microbiota weren't even a consideration, but we have since learned better, and the hygiene hypothesis has expanded to include them.

Essentially, the idea is that the Western style of living that developed rapidly over the 20th century fundamentally and dramatically reduced lifetime exposure, and especially early-life exposure, to environmental microorganisms, many of which would normally have become part of an individual's gut microbiota after birth.

How could changes in gut microbiota composition lead to selective allergies in specific individuals? Genetic predisposition should be taken as a given. However, natural history suggests that such predisposition transitioned to a full-fledged clinical condition much more rarely in times past.

Let's briefly consider how that equation might have fundamentally changed in recent times. Consider indoor sanitation, piped chlorinated water, C-sections, milk formula, ultra-processed foods, lack of regular contact with farm animals (as a surrogate for nature) and profligate, ubiquitous, even excessive use of antimicrobial products such as antibiotics, to name just a few important factors.

Though some of these were beneficial in their own way, epidemiological data now suggest that such innovations in living conditions also disrupted the intimate association with the natural world that had been the norm for human societies since time immemorial. In the process, such dramatic changes appear to have profoundly reduced gut microbiota diversity among many people, mostly in developed countries.

Unbeknownst to us, an epidemic of absence*, as Moises Velasquez-Manoff evocatively puts it, has thus been invisibly taking place across many human societies over the 20th century in lock-step with specific changes in living standards.

Such a sudden and profound reduction in gut microbiota diversity thus emerges as the trigger that flips the normally hidden predisposition in some people into clinically overt allergy. The actual mechanics of the process remain the subject of active research.

We (my colleague and I) propose a novel predictive mechanism for how disruption of regulatory T cell** function serves as the decisive and non-negotiable link between loss of specific microbiota and inflammatory disorders such as allergies. Time (and supporting data) will tell if we are right.

* An Epidemic of Absence: A New Way of Understanding Allergies and Autoimmune Diseases, Moises Velasquez-Manoff

** A small, indispensable subset of CD4+ T cells.

This post originally appeared on Quora.

Why Do People Get Ice Cream Headaches?


Reader Susann writes in to ask, "What exactly is the cause of brain freeze?"

You may know the ice cream headache by one of its other names: brain freeze, cold-stimulus headache, or sphenopalatine ganglioneuralgia ("nerve pain of the sphenopalatine ganglion"). But no matter what you call it, it hurts like hell.

Brain freeze is brought on by the speedy consumption of cold beverages or food. According to Dr. Joseph Hulihan, a principal at Paradigm Neuroscience and former associate professor in the Department of Neurology at the Temple University Health Sciences Center, ice cream is a very common cause of head pain, with about one-third of a randomly selected population succumbing to ice cream headaches.

What Causes That Pain?

As far back as the late 1960s, researchers pinned the blame on the same vascular mechanisms—rapid constriction and dilation of blood vessels—that were responsible for the aura and pulsatile pain phases of migraine headaches. When something cold like ice cream touches the roof of your mouth, there is a rapid cooling of the blood vessels there, causing them to constrict. When the blood vessels warm up again, they experience rebound dilation. The dilation is sensed by pain receptors and pain signals are sent to the brain via the trigeminal nerve. This nerve (also called the fifth cranial nerve, the fifth nerve, or just V) is responsible for sensation in the face, so when the pain signals are received, the brain often interprets them as coming from the forehead and we perceive a headache.

With brain freeze, we're perceiving pain in an area of the body that's at a distance from the site of the actual injury or reception of painful stimulus. This is a quirk of the body known as referred pain, and it's the reason people often feel pain in their neck, shoulders, and/or back instead of their chest during a heart attack.

To prevent brain freeze, try the following:

• Slow down. Consuming cold food or drinks slowly allows your mouth to get used to the temperature.

• Hold cold food or drink in the front part of your mouth and allow it to warm up before swallowing.

• Head north. Brain freeze requires a warm ambient temperature to occur, so it's almost impossible for it to happen if you're already cold.


Why Does Humidity Make Us Feel Hotter?


With temperatures spiking around the country, we thought it might be a good time to answer some questions about the heat index—and why humidity makes us feel hotter.

Why does humidity make us feel hotter?

To answer that question, we need to talk about getting sweaty.

As you probably remember from your high school biology class, one of the ways our bodies cool themselves is by sweating. The sweat then evaporates from our skin, and it carries heat away from the body as it leaves.

Humidity throws a wrench in that system of evaporative cooling, though. As relative humidity increases, the evaporation of sweat from our skin slows down. Instead, the sweat just drips off of us, which leaves us with all of the stinkiness and none of the cooling effect. Thus, when the humidity spikes, our bodies effectively lose a key tool that could normally be used to cool us down.

What's relative about relative humidity?

We all know that humidity refers to the amount of water vapor in the air. However, as the air's temperature changes, so does the amount of water vapor it can hold. (Warmer air can hold more water vapor.) Relative humidity compares the actual amount of water vapor in the air to the maximum the air could hold at a given temperature.
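To make that comparison concrete, here's a minimal Python sketch of the definition. It estimates relative humidity from air temperature and dew point; the Magnus approximation used for saturation vapor pressure (and the particular coefficients) is a common parameterization assumed here, not something from the article.

```python
import math

def saturation_vapor_pressure(temp_c: float) -> float:
    """Approximate saturation vapor pressure (hPa) at temp_c degrees Celsius,
    using the Magnus approximation (one common parameterization)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def relative_humidity(temp_c: float, dew_point_c: float) -> float:
    """Relative humidity (%): the vapor actually in the air (set by the dew
    point) divided by the maximum the air could hold at its temperature."""
    actual = saturation_vapor_pressure(dew_point_c)
    maximum = saturation_vapor_pressure(temp_c)
    return 100.0 * actual / maximum

# The same amount of moisture reads differently at different temperatures:
print(round(relative_humidity(30.0, 20.0)))  # ~55% on a 30 degree C day
print(round(relative_humidity(25.0, 20.0)))  # ~74% on a 25 degree C day
```

Note how the same dew point yields a lower relative humidity at the higher temperature, precisely because warmer air can hold more water vapor.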

Whose idea was the heat index?

While the notion of humidity making days feel warmer is painfully apparent to anyone who has ever been outside on a soupy day, our current system owes a big debt to Robert G. Steadman, an academic textile researcher. In a 1979 research paper called “The Assessment of Sultriness, Parts I and II,” Steadman laid out the basic factors that would affect how hot a person felt under a given set of conditions, and meteorologists soon used his work to derive a simplified formula for calculating the heat index.

The formula is long and cumbersome, but luckily it can be transformed into easy-to-read charts. Today your local meteorologist just needs to know the air temperature and the relative humidity, and the chart will tell him or her the rest.
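For the curious, here's a sketch of that long and cumbersome formula in Python: the Rothfusz regression, the simplified equation the National Weather Service derived from Steadman's work. The function name is mine, and the NWS applies further small adjustments near the edges of the chart that are omitted here.

```python
def heat_index_f(temp_f: float, rh_percent: float) -> float:
    """Apparent temperature (deg F) from air temperature (deg F) and
    relative humidity (%), via the NWS Rothfusz regression."""
    t, rh = temp_f, rh_percent
    return (-42.379
            + 2.04901523 * t
            + 10.14333127 * rh
            - 0.22475541 * t * rh
            - 6.83783e-3 * t * t
            - 5.481717e-2 * rh * rh
            + 1.22874e-3 * t * t * rh
            + 8.5282e-4 * t * rh * rh
            - 1.99e-6 * t * t * rh * rh)

# A 90-degree day at 70 percent relative humidity feels like roughly 106:
print(round(heat_index_f(90, 70)))  # ~106
```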

Is the heat index calculation the same for everyone?

Not quite, but it’s close. Steadman’s original research was founded on the idea of a “typical” person who was outdoors under a very precise set of conditions. Specifically, Steadman’s everyman was 5’7” tall, weighed 147 pounds, wore long pants and a short-sleeved shirt, and was walking at just over three miles per hour into a slight breeze in the shade. Any deviations from these conditions will affect how the heat/humidity combo feels to a certain person.

What difference does being in the shade make?

Quite a big one. All of the National Weather Service’s charts for calculating the heat index make the reasonable assumption that folks will look for shade when it’s oppressively hot and muggy out. Direct sunlight can add up to 15 degrees to the calculated heat index.

How does wind affect how dangerous the heat is?

When we think of wind on a hot day, we usually picture a nice, cooling breeze. That's the normal state of affairs, but when the weather is really, really hot—think high-90s hot—a dry wind actually heats us up. When it's that hot out, wind draws sweat away from our bodies before it can evaporate and cool us down. Thanks to this effect, what might have been a cool breeze acts more like a convection oven.

When should I start worrying about high heat index readings?

The National Weather Service has a handy four-tiered system to tell you how dire the heat situation is. At the most severe level, when the heat index is over 130, conditions are classified as "Extreme Danger," and heat stroke is highly likely with continued exposure. Things get less scary as you move down the ladder, but even on "Danger" days, when the heat index ranges from 105 to 130, you probably don't want to be outside. According to the service, that's when prolonged exposure and/or physical activity make sunstroke, heat cramps, and heat exhaustion likely, while heat stroke is possible.
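To illustrate the ladder, here's a small Python sketch of those tiers. The "Danger" (105 to 130) and "Extreme Danger" (over 130) cutoffs come straight from the text above; the two lower tiers and their cutoffs (Caution at 80, Extreme Caution at 90) are the commonly published NWS values, assumed here to complete the picture.

```python
def heat_index_category(hi_f: float) -> str:
    """Classify a heat index reading (deg F) into the NWS four-tier system."""
    if hi_f > 130:
        return "Extreme Danger"   # heat stroke highly likely with exposure
    if hi_f >= 105:
        return "Danger"           # sunstroke, cramps, exhaustion likely
    if hi_f >= 90:                # assumed cutoff (commonly published value)
        return "Extreme Caution"
    if hi_f >= 80:                # assumed cutoff (commonly published value)
        return "Caution"
    return "No alert"

print(heat_index_category(106))  # "Danger"
```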

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.

