Sorry, Barflies: Dry January Isn't a Fix For Heavy Drinking

iStock

Briefly cutting back on alcohol can save money, and it might even temporarily improve your sleep or help you lose weight. But as Inside Science reports, Dry January—the increasingly popular practice of giving up booze for the entire first month of the year—might not confer any lasting health benefits if you're planning on hitting the bars again come February.

Researchers say there's just not enough data to gauge whether short-term abstinence pays off in the long run. In fact, studies have indicated that people who are forced to stop drinking for periods of time (such as military recruits) end up overdoing it after they're allowed to imbibe again. And going booze-free affects people differently based on age, gender, genetics, and drinking habits.

That said, volunteer abstinence—with plenty of social support—could prompt positive change. Richard de Visser, a psychologist at the University of Sussex in England, published a study in 2016 in the journal Health Psychology based on follow-up questionnaires answered by Dry January participants. The surveys revealed that many people actually ended up drinking less overall, even after their alcohol hiatus was over.

"Even if participants took part but didn't successfully complete the 31 days, it generally led to a significant decrease across all the measures of alcohol intake," de Visser told BBC News. (Critics of the study pointed out that its participants belonged to a self-selecting group that had successfully scaled down their alcohol consumption.)

Meanwhile, a mini-experiment conducted by New Scientist journalists in 2013 challenged the notion that short-term sobriety doesn't pay off. After ultrasounds and blood tests, 10 staffers gave up booze for five weeks, while four continued drinking as they always had. Tests done after the experiment showed that the abstainers' liver fat, a precursor to liver disease, had fallen by an average of 15 percent, and their total cholesterol and blood glucose levels had also dropped. They also lost weight and reported better sleep quality.

Experts said they didn't know how long these physical benefits would last, and cautioned against viewing a month's sobriety as a quick fix. But they did conclude that the results were promising—and that they might be even more pronounced if people reduced their overall booze intake year-round.

[h/t Inside Science]

Now Ear This: A New App Can Detect a Child's Ear Infection

iStock.com/Techin24

Generally speaking, using an internet connection to diagnose a medical condition is rarely recommended. But skepticism about handheld devices guiding health care decisions and suggesting treatment is steadily giving way as the technology improves. The most recent example is an app that promises to identify one of the key symptoms of ear infections in kids.

The Associated Press reports that researchers at the University of Washington are close to finalizing an app that would allow a parent to assess whether their child has an ear infection using their phone, some paper, and some soft noises. A small piece of paper is folded into a funnel shape and inserted into the ear canal to focus the app's sounds (which resemble bird chirps) toward the eardrum. The app then measures the sound waves bouncing back off the eardrum: if pus or fluid is present behind it, the reflected waves are altered, indicating a possible infection. The parent receives a text from the app notifying them of any buildup in the middle ear.
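The underlying signal-processing idea can be sketched in a few lines of Python. The toy model below is purely illustrative, not the researchers' actual algorithm: the simulated echo, the reflectance values, and the FLUID_THRESHOLD cutoff are all assumptions chosen only to show the principle that a fluid-backed eardrum reflects more of the chirp's energy back at the microphone.

```python
# A minimal sketch of the acoustic idea behind the app, NOT the researchers'
# actual algorithm: emit a chirp, capture the echo off the eardrum, and
# compare how much extra energy came back. An air-backed (healthy) eardrum
# is modeled as a weak reflector, a fluid-backed eardrum as a strong one.
# All numbers here are illustrative assumptions, not clinical values.
import numpy as np
from scipy.signal import chirp

FS = 44_100                      # sample rate (Hz)
DURATION = 0.15                  # chirp length (s)

t = np.arange(int(FS * DURATION)) / FS
probe = chirp(t, f0=1_800, f1=4_400, t1=DURATION)   # bird-like sweep

def simulate_echo(signal: np.ndarray, reflectance: float) -> np.ndarray:
    """Toy ear model: the echo is a delayed, scaled copy of the probe.
    reflectance ~0.2 for an air-backed eardrum, ~0.8 with fluid behind it
    (assumed values for illustration)."""
    delay = int(0.0005 * FS)     # ~0.5 ms round trip in the ear canal
    echo = np.zeros_like(signal)
    echo[delay:] = reflectance * signal[:-delay]
    return signal + echo         # the microphone hears probe + echo

def echo_energy_ratio(recorded: np.ndarray, probe: np.ndarray) -> float:
    """Fraction of recorded energy not explained by the probe alone."""
    residual = recorded - probe
    return float(np.sum(residual**2) / np.sum(probe**2))

FLUID_THRESHOLD = 0.3            # assumed decision cutoff for this sketch

for label, reflectance in [("healthy ear", 0.2), ("fluid-filled ear", 0.8)]:
    recorded = simulate_echo(probe, reflectance)
    ratio = echo_energy_ratio(recorded, probe)
    verdict = "possible fluid" if ratio > FLUID_THRESHOLD else "likely clear"
    print(f"{label}: echo energy ratio {ratio:.2f} -> {verdict}")
```

A real detector would of course work from recorded microphone audio and a clinically validated decision rule rather than a single simulated energy ratio, but the contrast between a weak and a strong echo is the signal the app is listening for.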

The University of Washington tested the efficacy of the app by evaluating roughly 50 patients scheduled to undergo ear surgery at Seattle Children's Hospital. The app was able to identify fluid in patients' ears about 85 percent of the time, roughly on par with traditional exams, which involve visual inspection as well as specialized acoustic devices.

While the system looks promising, not all cases of fluid in the ear are the result of infection or require medical attention. Parents would need to weigh other symptoms, such as fever, if they intend to use the app to decide whether to see a doctor. It may prove most beneficial for children with persistent fluid accumulation, a condition that must be monitored over a period of months when deciding whether a drain tube needs to be placed. Checking for fluid at home would save both time and money compared to repeated visits to a physician.

The app does not yet have Food and Drug Administration (FDA) approval, and there is no timetable for when it might be commercially available. If it passes muster, it would join a number of FDA-approved "smart" medical diagnostic tools, including the AliveCor KardiaBand for the Apple Watch, which conducts EKG monitoring for heart irregularities.

[h/t WGRZ]

Does Having Allergies Mean That You Have Decreased Immunity?

iStock.com/PeopleImages

Tirumalai Kamala:

No, allergy isn't a sign of decreased immunity. It is a specific type of immune dysregulation. Autoimmunity, inflammatory disorders such as IBS and IBD, and even cancer are examples of other types of immune dysregulation.

The core issue in allergy is the quality and target of immune responses, not their strength. Let's see how.

—Allergens, substances known to induce allergy, are common. Some, such as house dust mite and pollen, are even ubiquitous.
—Everyone is exposed to allergens, yet only a relative handful of people are clinically diagnosed with allergy.
—Thus allergens don't inherently trigger allergy. They can, but only in those predisposed to allergy, not in everyone.
—Each allergic person mounts pathological immune responses to only one or a few structurally related allergens, not to all of them, while the non-allergic don't.
—Those diagnosed with allergy aren't necessarily more susceptible to other diseases.

If each allergic person's immune response is selectively distorted only for specific allergens, what makes someone allergic? Obviously, a mix of genetic and environmental factors.

The thing is, allergy prevalence has spiked in recent decades, especially in developed countries. That's too short a period for purely genetic, mutation-based changes to be the sole cause, since those would take multiple generations to have such a population-wide effect. That tilts the balance toward environmental change, but what specifically?

Starting in the 1960s, epidemiologists began reporting a link between infections and allergy: the more infections in childhood, the lower the allergy risk. This is called the hygiene hypothesis. Back then, microbiota weren't even a consideration, but now we know better, so the hygiene hypothesis has expanded to include them.

Essentially, the idea is that the Western style of living that developed rapidly over the 20th century fundamentally and dramatically reduced lifetime, and crucially, early-life exposure to environmental microorganisms, many of which would normally have become part of an individual's gut microbiota soon after birth.

How could changes in gut microbiota composition lead to selective allergies in specific individuals? Genetic predisposition should be taken as a given. However, natural history suggests that such predisposition transitioned into full-fledged clinical disease much more rarely in times past.

Let's briefly consider how that equation might have fundamentally changed in recent times. Consider indoor sanitation, piped chlorinated water, C-sections, milk formula, ultra-processed foods, lack of regular contact with farm animals (as a surrogate for nature), and the profligate, ubiquitous, even excessive use of antimicrobial products such as antibiotics, to name just a few important factors.

Though some of these were beneficial in their own way, epidemiological data now suggest that such innovations in living conditions also disrupted the intimate association with the natural world that had been the norm for human societies since time immemorial. In the process, these dramatic changes appear to have profoundly reduced gut microbiota diversity for many people, mostly in developed countries.

Unbeknownst to us, an "epidemic of absence,"* as Moises Velasquez-Manoff evocatively puts it, has been invisibly taking place across many human societies over the 20th century, in lockstep with specific changes in living standards.

Such a sudden and profound reduction in gut microbiota diversity thus emerges as the trigger that flips a normally hidden predisposition into clinically overt allergy in some people. The actual mechanics of the process remain the subject of active research.

We (my colleague and I) propose a novel predictive mechanism for how disruption of regulatory T cell** function serves as the decisive and non-negotiable link between loss of specific microbiota and inflammatory disorders such as allergies. Time (and supporting data) will tell if we are right.

* An Epidemic of Absence: A New Way of Understanding Allergies and Autoimmune Diseases, Moises Velasquez-Manoff

** A small, indispensable subset of CD4+ T cells.

This post originally appeared on Quora. Click here to view.
