Does Having Allergies Mean That You Have Decreased Immunity?

iStock.com/PeopleImages

Tirumalai Kamala:

No, allergy isn't a sign of decreased immunity. It is a specific type of immune dysregulation. Autoimmunity, inflammatory disorders such as IBS (irritable bowel syndrome) and IBD (inflammatory bowel disease), and even cancer are examples of other types of immune dysregulation.

The core issue in allergy is the quality and target of immune responses, not their strength. Let's see how.

- Allergens, substances known to induce allergy, are common. Some, such as house dust mite and pollen, are even ubiquitous.
- Everyone is exposed to allergens, yet only a relative handful of people are clinically diagnosed with allergy.
- Thus allergens don't inherently trigger allergy. They can, but only in those predisposed to allergy, not in everyone.
- Each allergic person makes pathological immune responses not to all allergens but only to one or a few structurally related ones; the non-allergic don't make such responses at all.
- Those diagnosed with allergy aren't necessarily more susceptible to other diseases.

If the immune response of each allergic person is selectively distorted when responding to specific allergens, what makes someone allergic? Obviously a mix of genetic and environmental factors.

The thing is, allergy prevalence has spiked in recent decades, especially in developed countries. That is too short a period for purely genetic changes to be the sole cause, since mutations would take multiple generations to have such a population-wide effect. That tilts the balance toward environmental change, but what specifically?

Starting in the 1960s, epidemiologists began reporting a link between infections and allergy: the more infections in childhood, the lower the allergy risk (this is called the hygiene hypothesis). Back then, microbiota weren't even a consideration, but we have since learned better, so the hygiene hypothesis has expanded to include them.

Essentially, the idea is that the Western style of living that developed rapidly over the 20th century dramatically reduced lifetime exposure, and crucially early-life exposure, to environmental microorganisms, many of which would normally have become part of an individual's gut microbiota after birth.

How could changes in gut microbiota composition lead to selective allergies in specific individuals? Genetic predisposition should be taken as a given. However, natural history suggests that such predisposition transitioned to a full-fledged clinical condition far more rarely in times past.

Let's briefly consider how that equation might have fundamentally changed in recent times: indoor sanitation, piped chlorinated water, C-sections, milk formula, ultra-processed foods, lack of regular contact with farm animals (as a surrogate for nature), and profligate, ubiquitous, even excessive use of antimicrobial products such as antibiotics, to name just a few important factors.

Though some of these were beneficial in their own way, epidemiological data now suggests that such innovations in living conditions also disrupted the intimate association with the natural world that had been the norm for human societies since time immemorial. In the process such dramatic changes appear to have profoundly reduced human gut microbiota diversity among many, mostly in developed countries.

Unbeknownst to us, an epidemic of absence*, as Moises Velasquez-Manoff evocatively puts it, has thus been invisibly taking place across many human societies over the 20th century in lock-step with specific changes in living standards.

Such a sudden and profound reduction in gut microbiota diversity thus emerges as the trigger that flips a normally hidden predisposition into clinically overt allergy in some people. The actual mechanics of the process remain the subject of active research.

We (my colleague and I) propose a novel predictive mechanism for how disruption of regulatory T cell** function serves as the decisive and non-negotiable link between loss of specific microbiota and inflammatory disorders such as allergies. Time (and supporting data) will tell if we are right.

* An Epidemic of Absence: A New Way of Understanding Allergies and Autoimmune Diseases, Moises Velasquez-Manoff

** A small, indispensable subset of CD4+ T cells.

This post originally appeared on Quora.

How Is a Sunscreen's SPF Calculated?

Rawpixel/iStock via Getty Images

I’m a pale person. A very pale person. Which means that during these hot summer months, I carry sunscreen with me at all times and apply it liberally. But I’ve never really understood what those SPF numbers meant, so I asked some sun care experts to break it down for me—and to tell me how best to apply the stuff so that I can make it through the summer without looking like a lobster.

Soaking up the sun ... safely

SPF stands for Sun Protection Factor, and it indicates a sunscreen’s ability to block UVB rays. The concept was pioneered at the Coppertone Solar Research Center in 1972; in 1978, the FDA published an SPF method based on Coppertone’s system, according to Dr. David Leffell, chief of Dermatologic Surgery and Cutaneous Oncology at Yale.

The numbers themselves indicate approximately how many times longer a person who has applied the sunscreen can stay out in the sun without getting burned. Say you get burned after 20 minutes in the sun without sunscreen; if properly applied (and reapplied), SPF 30 will allow you to stay in the sun 30 times longer without burning than if you were wearing no protection at all. So, theoretically, you should have approximately 600 minutes, or 10 hours, in the sun. But it’s not an exact science: the amount of UV light that reaches us depends on a number of factors, including cloud cover, the time of day, and the reflection of UV rays off the ground, so it’s generally recommended that you reapply sunscreen every two hours (or even sooner).
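
To make that arithmetic concrete, here is a minimal sketch in Python of the simple time-multiplier model described above. The function name is illustrative rather than any standard, and the result is theoretical only; as noted, real protection varies with conditions and reapplication.

```python
# Minimal sketch of the "SPF as a time multiplier" idea described above.
# Illustrative only: real-world protection varies with cloud cover, time of
# day, reflected UV, and how well the sunscreen is applied and reapplied.

def theoretical_protection_minutes(unprotected_burn_minutes: float, spf: float) -> float:
    """Theoretical minutes in the sun before burning, per the multiplier model."""
    return unprotected_burn_minutes * spf

# The article's example: burning after 20 minutes unprotected, wearing SPF 30.
minutes = theoretical_protection_minutes(20, 30)
print(f"{minutes:.0f} minutes, or about {minutes / 60:.0f} hours")  # 600 minutes, or about 10 hours
```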

What gives a sunscreen a higher SPF comes down to the product’s formulation. “It’s possible that an SPF 50 might contain slightly more of one or more sunscreen active ingredients to achieve that higher SPF,” Dr. Patricia Agin, president of Agin Suncare Consulting, says. “But it’s also possible that the SPF 50 might contain an additional active ingredient to help boost the SPF performance to SPF 50.”

No matter what SPF your sunscreen is, you’ll still get a burn if it’s not properly applied. So let’s go over how to do that.

How to apply sunscreen

First, make sure you have a water-resistant, broad-spectrum sunscreen—which means that it protects against both UVB and UVA radiation—with an SPF of at least 30. “Typically, you don’t have to buy sunscreen that has an SPF higher than that unless you have very sun-sensitive skin,” Leffell says. “That’s a very small percentage of the population.” (Redheads, people with light eyes, and those who turn pink after just a few minutes in the sun—you’ll want to load up on SPF above 30.)

Twenty minutes before you go out to the beach or the pool, begin to apply your sunscreen in an even coat. “Don’t apply it like icing on a cake,” Leffell says. “I see these patients and they’ve got the tops of their ears covered with thick, unevenly applied sunscreen, and that’s not a good sign.” Sunscreen sprays will easily give you that even coat you need.

Whether you’re using lotion or a spray, when it comes time to apply, Leffell recommends starting with your scalp and face, even if you plan on wearing a hat. “Make sure you’ve covered the ears and nose and under the eyes,” Leffell says. “Then, I would move down to the shoulders, and make sure that someone can apply the sunscreen on your back beyond the reach of your hands.”

Other important areas you may forget to cover, but shouldn’t, are the tops of your feet, the backs of your hands, and your chest. “We see it all the time now—the V of the chest in women has become a socially and aesthetically huge issue when they are 50 and beyond. Because even though they can treat their faces with all sorts of cosmetics and procedures, the chest is much harder, and they are stuck with the face of a 40-year-old and the chest of a 60-year-old. You want to avoid that by using sunscreen.”

Another important thing to keep in mind: Water-resistant doesn’t mean waterproof. “I always tell patients to reapply every couple of hours while you’re active outdoors,” Leffell says, “and always reapply when you come out of the water or if you’ve been sweating a lot, regardless of whether the label says water resistant.”

Determining whether or not you’ve succeeded in properly applying your sunscreen is easy: “You know you’re applying your sunscreen properly if, after the first time you’ve used it, you haven’t gotten a burn,” Leffell says.

Agin has a caveat, though: “It’s not a good idea to think of sunscreens only as a way to extend your time in the sun,” she says. “One must also understand that even before becoming sunburned, your skin is receiving UV exposure that causes other damage to the skin. At the end of the 600 minutes, you will have accrued enough UV to cause a sunburn—one Minimal Erythema Dose, or MED—but there is pre-MED damage done to skin cells’ DNA and to the skin’s supporting structure of collagen and elastin that is not visible and happens even before you sunburn. These types of damage can occur without sunburning. So you can’t measure all the damage done to your skin by only being concerned about sunburn.”

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.

An earlier version of this post ran in 2014.

What's the Difference Between Ice Cream and Gelato?

iStock/Getty Images/zoff-photo

'Tis the season for beach reads, tan lines, and ice-cold desserts. You know it's summer when going to the local ice cream or gelato shop becomes part of your daily routine. But what exactly is the difference between these two frozen treats?

One of the key differences between the two is butterfat. While ice cream's main ingredients include milk, cream, sugar, and egg yolks, the secret to making gelato is to use much less cream and sometimes little to no egg yolk. This leads to a much smaller percentage of butterfat in gelato. The FDA rules say that ice cream cannot contain less than 10 percent milkfat (though it can go as high as 25 percent) while gelato, much like soft serve, stays in the 4- to 9-percent range.
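
To illustrate the milkfat figures above, here is a minimal Python sketch. The function name and the exact classification cutoffs are assumptions for illustration only, not regulatory definitions.

```python
# Illustrative sketch of the milkfat figures cited above: the FDA's 10 percent
# minimum for ice cream and the roughly 4- to 9-percent range typical of gelato
# and soft serve. Cutoffs below are assumptions for illustration.

def classify_by_milkfat(milkfat_percent: float) -> str:
    if milkfat_percent >= 10:
        return "ice cream range (at or above the FDA's 10 percent minimum)"
    if milkfat_percent >= 4:
        return "gelato/soft-serve range (roughly 4 to 9 percent)"
    return "below both ranges described above"

for fat in (25, 10, 7, 2):
    print(f"{fat}% milkfat -> {classify_by_milkfat(fat)}")
```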

The churning method for both also differs, which affects the treat's density. Ice cream is churned at a much faster pace, leading to more air being whipped into the mixture. Ice cream's higher butterfat content comes into play here—due to all of that milkfat, the mix absorbs the air more readily. Gelato, on the other hand, is churned at a slower pace and absorbs far less air, creating a much denser dessert.

You also might have noticed that the serving styles for the two treats aren't the same, either. In order to get those perfectly stacked ice cream scoops on a cone, buckets of ice cream must be stored at around 0°F to maintain their consistency, while the softer gelato is stored at a warmer 10°F to 22°F. Ice cream is then scooped into fairly uniform balls with a round ice cream scooper, whereas a spade or paddle is best for molding gelato into a mound in a cup or a cone.

You can't really go wrong with either gelato or ice cream on a sweltering summer day, but there is one more difference to keep in mind while you debate which to get: taste. If you want a bolder flavor, you'll want to go with gelato. Because of its density and because there's less butterfat to coat your taste buds, gelato can seem to have more intense flavors.

Have you got a Big Question you'd like us to answer? If so, send it to bigquestions@mentalfloss.com.
