Why Not Brushing Your Teeth Is So Bad For Your Health

iStock

Even if you lie about flossing, consume a steady diet of sugar, and haven't been to the dentist in years, the very least you can do for your mouth is brush your teeth twice a day. Regular brushing is a basic hygiene practice that's observed across the globe, but just how necessary is it to human survival? In their new video, Life Noggin explains why the benefits of brushing extend beyond your pearly whites.

If you stopped brushing your teeth, all the bacteria and bits of food normally cleared out by your twice-daily cleanings would thrive unchecked. Cavities would start to form, your gums would become inflamed, and your breath would reach room-clearing levels of stench. But the sorry state of your mouth would be just a fraction of the problem. With your mouth acting as a haven for germs, harmful bacteria like Staphylococcus aureus, including drug-resistant MRSA strains, would have a greater likelihood of surviving there long enough to enter your bloodstream. A dirty mouth can also nourish the bacterium Porphyromonas gingivalis, a bug that causes periodontitis, an advanced stage of gum disease, and that may promote plaque buildup inside the arteries if it reaches the heart. So when your dentist harps on you about brushing regularly, they're not being overly dramatic.

With the potential to prevent life-threatening conditions, you may wonder how the rest of the animal kingdom gets along without oral hygiene. While some creatures have developed teeth that replace themselves quickly or teeth that resist erosion, others make their own toothbrushes. Elephants, for example, wipe bacteria off their tusks whenever they scrape bark from a tree or dig a hole. Humans in ancient times used a similar method: Before the first toothbrush was invented, they would chew on bark and scrub their teeth with the tattered ends to get that refreshing clean feeling.

Check out the full story from Life Noggin in the video below.

[h/t Life Noggin]

Now Ear This: A New App Can Detect a Child's Ear Infection

iStock.com/Techin24

Generally speaking, relying on an internet connection to diagnose a medical condition is rarely recommended. But as the technology improves, handheld devices that guide health care decisions and suggest treatment are steadily outpacing that skepticism. The most recent example is an app that promises to identify one of the key symptoms of ear infections in kids.

The Associated Press reports that researchers at the University of Washington are close to finalizing an app that would allow a parent to assess whether or not their child has an ear infection using their phone, some paper, and some soft noises. A small piece of paper is folded into a funnel shape and inserted into the ear canal to focus the app's sounds (which resemble bird chirps) toward the child’s ear. The app measures sound waves bouncing off the eardrum. If pus or fluid is present, the sound waves will be altered, indicating a possible infection. The parent would then receive a text from the app notifying them of the presence of buildup in the middle ear.

The University of Washington tested the efficacy of the app by evaluating roughly 50 patients scheduled to undergo ear surgery at Seattle Children’s Hospital. The app was able to identify fluid in patients' ears about 85 percent of the time, roughly on par with traditional exams, which involve visual identification as well as specialized acoustic devices.

While the system looks promising, not all cases of fluid in the ear are the result of infections or require medical attention. Parents would need to evaluate other symptoms, such as fever, if they intend to use the app to decide whether or not to seek medical attention. It may prove most beneficial in children with persistent fluid accumulation, a condition that needs to be monitored over the course of months when deciding whether a drain tube needs to be placed. Checking for fluid at home would save both time and money compared to repeated visits to a physician.

The app does not yet have Food and Drug Administration (FDA) approval and there is no timetable for when it might be commercially available. If it passes muster, it would join a number of FDA-approved “smart” medical diagnostic tools, including the AliveCor KardiaBand for the Apple Watch, which conducts EKG monitoring for heart irregularities.

[h/t WGRZ]

Does Having Allergies Mean That You Have A Decreased Immunity?

iStock.com/PeopleImages

Tirumalai Kamala:

No, allergy isn't a sign of decreased immunity. It is a specific type of immune dysregulation. Autoimmunity, inflammatory disorders such as inflammatory bowel disease (IBD), and even cancer are examples of other types of immune dysregulation.

The core issue in allergy is the quality and target of immune responses, not their strength. Let's see how.

—Allergens—substances known to induce allergy—are common. Some such as house dust mite and pollen are even ubiquitous.
—Everyone is exposed to allergens yet only a relative handful are clinically diagnosed with allergy.
—Thus allergens don't inherently trigger allergy. They can but only in those predisposed to allergy, not in everyone.
—Each allergic person mounts pathological immune responses to only one or a few structurally related allergens, not to all of them, while the non-allergic don't.
—Those diagnosed with allergy aren't necessarily more susceptible to other diseases.

If the immune response of each allergic person is selectively distorted when responding to specific allergens, what makes someone allergic? Obviously a mix of genetic and environmental factors.

The thing is, allergy prevalence has spiked in recent decades, especially in developed countries, too short a time period for purely genetic mutation-based changes to be the sole cause, since those would take multiple generations to have such a population-wide effect. That tilts the balance toward environmental change, but what specifically?

Starting in the 1960s, epidemiologists began reporting a link between infections and allergy: the more infections in childhood, the lower the allergy risk (the so-called hygiene hypothesis). Back then, microbiota weren't even a consideration, but now we know better, so the hygiene hypothesis has expanded to include them.

Essentially, the idea is that the current Western style of living that rapidly developed over the 20th century fundamentally and dramatically reduced lifetime, and, crucially, early life exposure to environmental microorganisms, many of which would have normally become part of an individual's gut microbiota after they were born.

How could gut microbiota composition changes lead to selective allergies in specific individuals? Genetic predisposition should be taken as a given. However, natural history suggests that such predisposition transitioned to a full fledged clinical condition much more rarely in times past.

Let's briefly consider how that equation might have fundamentally changed in recent times. Consider indoor sanitation, piped chlorinated water, C-sections, milk formula, ultra-processed foods, lack of regular contact with farm animals (as a surrogate for nature) and profligate, ubiquitous, even excessive use of antimicrobial products such as antibiotics, to name just a few important factors.

Though some of these were beneficial in their own way, epidemiological data now suggests that such innovations in living conditions also disrupted the intimate association with the natural world that had been the norm for human societies since time immemorial. In the process such dramatic changes appear to have profoundly reduced human gut microbiota diversity among many, mostly in developed countries.

Unbeknownst to us, an epidemic of absence*, as Moises Velasquez-Manoff evocatively puts it, has thus been invisibly taking place across many human societies over the 20th century in lock-step with specific changes in living standards.

Such a sudden and profound reduction in gut microbiota diversity thus emerges as the trigger that flips the normally hidden predisposition in some into clinically overt allergy. The actual mechanics of the process remain the subject of active research.

We (my colleague and I) propose a novel predictive mechanism for how disruption of regulatory T cell** function serves as the decisive and non-negotiable link between loss of specific microbiota and inflammatory disorders such as allergies. Time (and supporting data) will tell if we are right.

* An Epidemic of Absence: A New Way of Understanding Allergies and Autoimmune Diseases Reprint, Moises Velasquez-Manoff

** A small, indispensable subset of CD4+ T cells.

This post originally appeared on Quora. Click here to view.
