The 98.6℉ Myth: Why Everything You Think You Know About Body Temperature Is a Lie

When you were a kid, you probably knew that to score a magical sick day home from school, you needed to have a fever. When the thermometer came out of your mouth, it had to read higher than 98.6℉—the long-accepted "normal" human body temperature. (If you wanted to really seal the deal, you may have hoped to hit 100℉.) Since then, you may have used a temperature above 98.6℉ as a metric to work from home (or call out sick entirely).

But here's the thing: The average body temperature isn't actually 98.6℉—a fact that we've known for more than 25 years. The myth originated in the 19th century with a single doctor, and despite evidence to the contrary, it's persisted ever since.

THE GIANT—AND FAULTY—ARMPIT THERMOMETER

In 1851, Carl Wunderlich, the director of the hospital at Leipzig University, began going from room to room with a comically large thermometer in tow. He wanted to understand how body temperature is affected by different diseases, so in each room, he would hold the foot-long device in patients' armpits for a full 20 minutes, waiting for a temperature to register. Once it did, he'd note the temperature on the patient's chart (Wunderlich is thought to be the first physician to do so). He and his staff did this for years, repeatedly taking the temperatures of some 25,000 patients and logging them on their charts, until he had millions of readings. In 1868, he finally published this data in Das Verhalten der Eigenwärme in Krankheiten (On the Temperature in Diseases: A Manual of Medical Thermometry). He concluded that the average human body temperature was 98.6℉, underscoring the idea that fever is a symptom of illness, not a cause.

No one questioned Wunderlich's methods, or his average, for about 140 years. Then, in the early 1990s, internist Philip Mackowiak—a professor of medicine at the University of Maryland, a medical historian, and, apparently, a clinical thermometer junkie—saw one of the physician's instruments at the Mütter Museum in Philadelphia. He told the Freakonomics podcast that he'd always had doubts about the 98.6℉ standard. "I am by nature a skeptic," he said. "And it occurred to me very early in my career that this idea that 98.6 was normal, and then if you didn't have a temperature of 98.6, you were somehow abnormal, just didn't sit right."

Getting his hands on Wunderlich's thermometer—which the museum let him borrow—only deepened his doubts. The huge thermometer was unwieldy and non-registering, meaning, Mackowiak explained, "that it has to be read while it's in place." Not only that, but Wunderlich had used the device to measure temperatures in the armpit, which is less reliable than taking temperatures in the mouth or rectum. The instrument itself also wasn't terribly precise: It registered up to 2 degrees Celsius higher than both ancient and modern instruments.

In 1992, Mackowiak decided to test Wunderlich's average. Using normal-sized oral thermometers and a group of volunteers, he determined that the average human body temperature actually hovers around 98.2℉. Mackowiak found that body temperature tends to vary over the course of the day, with its lowest point around 6 a.m. and its highest in the early evening. Body temperature can also fluctuate monthly (with the menstrual cycle) and over a lifetime (declining decade by decade with age), and may even differ by sex and race. He concluded that normal body temperature is so unique to each person that it's almost like a fingerprint and, given that wide variation, not actually a very reliable indicator of illness.

As a result of his study, Mackowiak proposed raising the threshold for fever to 98.9℉ for temperatures taken in the morning (and 99.9℉ at other times). While that's a relatively minor change in terms of degrees, this fever threshold is lower than the CDC's definition of a fever: a temperature of 100.4℉ or higher.

There are potential real-life consequences in this gap, for everyone from students (who'd have to attend school with what would be considered a low-grade fever by Wunderlich's 98.6℉ standard) to employers and daycares (who use temperature to set attendance policies). What's more, anyone who is actually sick but ignores a low-grade fever—one that meets Mackowiak's threshold but still falls under the CDC's—could pose a risk to people with compromised immune systems trying to avoid unnecessary exposure to illness in public places.
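If it helps to see the two fever definitions side by side, here's a minimal sketch in Python. The thresholds come straight from the article; treating "morning" as anything before noon is our assumption for illustration, not Mackowiak's.

```python
def is_fever(temp_f, hour, standard="mackowiak"):
    """Classify an oral temperature (in deg F) as fever under two standards.

    Mackowiak proposed 98.9 F for morning readings and 99.9 F otherwise;
    the CDC uses a flat 100.4 F. Treating "morning" as before noon is an
    assumption made here purely for illustration.
    """
    if standard == "mackowiak":
        threshold = 98.9 if hour < 12 else 99.9
    else:  # "cdc"
        threshold = 100.4
    return temp_f >= threshold

# A 100.0 F evening reading falls squarely in the gap described above:
print(is_fever(100.0, hour=18))                  # True  (Mackowiak: fever)
print(is_fever(100.0, hour=18, standard="cdc"))  # False (CDC: no fever)
```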

THE BALANCING POINT

There's a reason the average trends near 98℉ instead of 92℉ or 106℉. As endotherms, mammals expend a great deal of energy maintaining body temperature when compared with cold-blooded creatures. To find and conserve a just-right body temperature, central nervous system sensors gather data (too warm? too cold? just right, Goldilocks?) and send that information to the pebble-sized hypothalamus near the base of the brain. There, the data is converted into action: releasing sweat and widening the blood vessels if too warm; raising metabolism, constricting the blood vessels, and inducing shivering if too cold.
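For the programmatically inclined, the loop works something like a thermostat. Below is a loose Python sketch of that negative-feedback idea; the 98.2℉ set point is Mackowiak's average from earlier in the article, while the half-degree tolerance band is an invented parameter, not physiology.

```python
def hypothalamus_response(body_temp_f, set_point_f=98.2, tolerance=0.5):
    """Toy negative-feedback controller standing in for the hypothalamus.

    Purely an analogy: compare the sensed temperature with a set point
    and "dispatch" the corrective actions the article describes. The
    0.5 F tolerance band is an invented value for illustration.
    """
    if body_temp_f > set_point_f + tolerance:    # too warm
        return ["release sweat", "widen blood vessels"]
    if body_temp_f < set_point_f - tolerance:    # too cold
        return ["raise metabolism", "constrict blood vessels", "shiver"]
    return []  # just right, Goldilocks

print(hypothalamus_response(101.3))  # ['release sweat', 'widen blood vessels']
print(hypothalamus_response(96.8))   # warming countermeasures kick in
```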

According to a study by Aviv Bergman and Arturo Casadevall in the journal mBio, the precise balancing point for ideal body temperature is the sweet spot where the metabolic cost for all this thermoregulation balances with the evolutionary advantage of warding off fungal disease. (While warm-blooded animals are prone to bacterial or viral infections, they rarely experience fungal infections because most fungi can't withstand temperatures above 86℉. Cold-blooded animals, on the other hand, are prone to all three.) For Bergman and Casadevall, this benefit even explains what tipped Darwin's scales in favor of mammals, allowing them to edge out other vertebrates for dominance after the Cretaceous-Tertiary mass extinction wiped out the dinosaurs.

Of course, every rule has its exceptions, and the one place where human body temperature demonstrates sustained elevation is outer space. Astronauts on prolonged missions clock significantly higher average body temperatures than they do on Earth—even up to 104℉. This so-called "space fever" is probably a product of some combination of radiation exposure, psychological stress, and immune response to weightlessness. Researchers believe the phenomenon could yield crucial information about thermoregulation—and may even offer insight into how humans might adapt to climate change.

WHY THE MYTH PERSISTS

It's been 26 years since Mackowiak's study, yet the newer data has not taken hold among medical professionals or the public. What gives?

Mackowiak tells Mental Floss that he finds it a bit mystifying that the myth persists, especially since many people, when pressed, know that the so-called "average" temperature varies. Part of the problem may be psychological: We cling to beliefs despite evidence to the contrary—a phenomenon called belief perseverance [PDF]. It's a significant force upholding a surprising number of medical myths. The idea that humans should drink eight glasses of water a day? Not science. Sugar causes hyperactive behavior? Nope. Reading in dim light harms eyesight? Not really.

Unlearning persistent myths—especially ones loaded with the weight of medical authority—is difficult. "Deep down, under it all," Mackowiak says, "people want simple answers for things."

12 Facts About Diabetes Mellitus

Thirty million Americans—about 9 percent of the country's population—are living with diabetes mellitus, or simply diabetes. This chronic condition is characterized by sustained high blood sugar levels. In many patients, symptoms can be managed with insulin injections and lifestyle changes, but in others, the complications can be deadly. Here's what you need to know about diabetes mellitus.

1. There are three types of diabetes.

In healthy people, digestion breaks dietary sugars down into glucose, and the pancreas produces enough of the hormone insulin to move that glucose into cells, where it's used for energy.

But people with type 2 diabetes—the most common form of the disease, accounting for about 95 percent of cases—either can't produce enough insulin to transport the sugars, or their cells have become insulin-resistant. The result is a buildup of glucose in the blood (a.k.a. high blood sugar or hyperglycemia). Type 2 diabetes typically develops in adults.

Type 1 diabetes, also known as juvenile diabetes, makes up the remaining 5 percent of chronic cases and most often develops in children and young adults. With this condition, the initial problem isn’t blood sugar levels, but insulin production: The pancreas can’t make enough insulin to process even normal amounts of glucose. The sugar builds up as a result, leading to dangerous concentrations in the bloodstream.

The third form, gestational diabetes, only afflicts pregnant people who weren't diabetic before their pregnancy. The mother's blood glucose levels usually spike around the 24th week of pregnancy, but with a healthy diet, exercise, and, in some cases, insulin shots, the symptoms can usually be managed. Blood sugar levels tend to return to normal after the pregnancy.

2. The mellitus in diabetes mellitus means "honey sweet."

Around 3000 years ago, ancient Egyptians described a condition with diabetes-like symptoms, though it wasn't called diabetes yet. More than a thousand years passed before the Greek physician Aretaeus of Cappadocia came up with the name diabetes, based on the Greek word for "passing through" (as in passing a lot of urine, a common diabetes symptom). English doctor Thomas Willis tacked on the word mellitus, meaning "honey sweet," in 1675, building on previous physicians' observations that diabetic patients had sweet urine. Finally, in 1776, another English physician, Matthew Dobson, confirmed that the sweetness of diabetic patients' blood and urine came from high levels of glucose.

3. The cause of one type of diabetes is well understood; the other, not so much.

A person's lifestyle is a key predictor of developing type 2 diabetes. Factors like being overweight or obese, consuming a high-calorie diet, smoking, and seldom exercising contribute to the risk. Foods and drinks that are high in sugar—soda, candy, ice cream, dessert—may contribute to hyperglycemia, but any food that's high in calories, even if it's not sweet, can raise blood sugar levels.

In contrast to these well-established factors, medical experts aren’t entirely sure what causes type 1 diabetes. We do know that type 1 is an autoimmune disease that develops when the body attacks and damages insulin-producing cells in the pancreas. Some scientists think that environmental factors, like viruses, may trigger this immune response.

4. Family history also plays a role in diabetes risk.

If a parent or sibling has type 2 diabetes, you are predisposed to developing pre-diabetes and type 2 diabetes. Lifestyle habits explain some of these incidences, since family members may share similar diets and exercise habits. Genetics also play a role, but just because one close relative has diabetes does not mean you're destined to develop it. Research on identical twins, who share identical genes, shows that the risk is far from absolute: Among twins in which one has type 1 diabetes, the other has only a 50 percent chance of developing it; for type 2, the risk for the second twin is 75 percent at most.

5. Racial minorities are at a higher risk for developing diabetes.

Many racial minority groups in the U.S. have a higher chance of developing type 2 diabetes. Black Americans, Latino Americans, Native Americans, Pacific Islanders, and some groups of Asian Americans are more likely to have pre-diabetes and type 2 diabetes than white Americans. This can be partly explained by the fact that some of these groups also have higher rates of obesity, which is one of the primary risk factors for type 2 diabetes. Socioeconomics may also play a role: One study shows that people with diabetes living in poverty are less likely to visit diabetes clinics and receive proper testing than their middle-income counterparts. According to another study, diabetic people without health insurance have higher blood sugar, blood pressure, and cholesterol levels than insured diabetics. Genetics, on the other hand, don't appear to contribute to these trends.

6. Diabetes is one of the world's deadliest diseases.

With proper management, people with diabetes can live long, comfortable lives. But if the disease isn’t treated, it can have dire consequences. Diabetics make up the majority of people who develop chronic kidney disease, have adult-onset blindness, and need lower-limb amputations. In the most serious cases, diabetes leads to death. The condition is one of the deadliest diseases in the world, killing more people than breast cancer and AIDS combined.

7. Millions of Americans are pre-diabetic.

According to the CDC, 84 million adults living in the U.S. are pre-diabetic: Their blood sugar is higher than what's considered safe, but hasn't yet reached diabetic levels. In pre-diabetic patients, blood glucose levels after eight hours of fasting fall between 100 and 125 milligrams per deciliter; diabetic levels are anything above that. People with pre-diabetes are not just at a greater risk for type 2 diabetes, but also for heart disease and stroke. Fortunately, people who are diagnosed with pre-diabetes can take steps to eat a healthier diet, increase physical activity, and test their blood glucose level several times a day to control the condition. In some cases, doctors will prescribe drugs like metformin that make the body more receptive to the insulin it produces.
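Since those cutoffs reduce to a pair of comparisons, here's a quick sketch of them in Python. This is purely an illustration of the thresholds cited above; real diagnosis requires repeat testing and a clinician.

```python
def classify_fasting_glucose(mg_dl):
    """Bucket an 8-hour fasting glucose reading (mg/dL) using the
    cutoffs cited above: under 100 is normal, 100-125 pre-diabetic,
    and anything above that diabetic. Illustration only, not a
    diagnostic tool.
    """
    if mg_dl < 100:
        return "normal"
    if mg_dl <= 125:
        return "pre-diabetic"
    return "diabetic"

print(classify_fasting_glucose(92))   # normal
print(classify_fasting_glucose(110))  # pre-diabetic
print(classify_fasting_glucose(140))  # diabetic
```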

8. After climbing for decades, the rate of new diabetes diagnoses is declining.

In the U.S., the rate of new diagnoses skyrocketed 382 percent between 1988 and 2014. Globally, 108 million people had diabetes in 1980, but by 2014 that number was 422 million.

But thanks to nationwide education and prevention efforts, the trend has reversed in the U.S., according to the CDC. Since peaking in 2009, the number of new diabetes cases in America has dropped by 35 percent. In that same timeframe, the number of people living with diagnosed diabetes in the U.S. has plateaued, suggesting people with the condition are living longer.

9. The first successful treatment for type 1 diabetes occurred in 1922.

Prior to the 20th century, type 1 diabetes was usually fatal. Diabetic ketoacidosis—a toxic buildup of chemicals called ketones, which arise when the body can no longer use glucose and instead breaks down other tissues for energy—killed most patients within a year or two of diagnosis. In searching for a way to save children with juvenile (type 1) diabetes, Canadian physician Frederick Banting and medical student Charles Best built on the work of earlier researchers, who had demonstrated that removing the pancreas from a dog immediately caused diabetes symptoms in the animal. Banting and Best extracted insulin from dog pancreases in University of Toronto professor J.J.R. Macleod's lab. After injecting the insulin back into dogs whose pancreases had been removed, they realized the hormone regulated blood sugar levels. On January 11, 1922, they administered insulin to a human patient, and further refined the extract to reduce side effects. In 1923, Banting and Macleod received the Nobel Prize in Medicine for their work.

10. A pioneering physicist discovered the difference between type 1 and type 2 diabetes.

In the 1950s, physicist Rosalyn Yalow and her research partner Solomon Berson developed a method for measuring minute amounts of substances in blood. Inspired by Yalow's husband's struggle with diabetes, Yalow focused her research on insulin. Their "radioimmunoassay" technology revealed that some diabetes patients were still able to produce their own insulin, leading them to create two separate categories for the disease: "insulin-dependent" (type 1) and "non-insulin-dependent" (type 2). Prior to that discovery in 1959, there was no distinction between the two types. In 1977, Yalow won the Nobel Prize in Medicine for the radioimmunoassay, making her one of only 12 female Nobel laureates in medicine.

11. Making insulin once required tons of pig parts.

Insulin is relatively easy to make today. Most of what's used in injections comes from a special non-disease-producing laboratory strain of E. coli bacteria that's been genetically modified to produce insulin, but that wasn't always the case. Until about 40 years ago, 2 tons of pig pancreases were required to produce just 8 ounces of pure insulin. The pig parts were typically recycled from pork farms.

12. A quarter of diabetes patients don’t know they have it.

The symptoms of type 2 diabetes can develop for years before patients think to ask their doctor about them. These include frequent urination, unexplained thirst, numbness in the extremities, dry skin, blurry vision, fatigue, and sores that are slow to heal—signs that may not be a cause for concern on their own, but together can indicate a more serious problem. Patients with type 1 diabetes may also experience nausea, vomiting, and stomach pain.

While serious, the symptoms of diabetes are sometimes easy to overlook. That’s why 25 percent of people with the illness, 7.2 million in the U.S., are undiagnosed. And that number doesn’t even cover the majority of people with pre-diabetes who aren’t aware they’re on their way to becoming diabetic.

FDA Is Warning Against Fecal Transplants After Person Dies From E. coli Infection

Though it may sound gross, the benefits of a fecal transplant—taking the feces of one person and introducing it into the gastrointestinal tract of another—are promising for those suffering from a Clostridioides difficile infection. The tenacious infections are often the result of sustained antibiotic use, which can kill the patient's "good" gut bacteria and allow C. difficile to proliferate. As the theory goes, the “good” bacteria in feces transplanted from a healthy person may restore the infected person's microbiome and alleviate symptoms like life-threatening diarrhea.

The treatment, however, is not FDA-approved, and it carries risks: The FDA has announced that two people involved in a clinical trial recently received fecal transplants that contained drug-resistant bacteria, and one of them died as a result.

According to The New York Times, the FDA did not offer details of either case, noting only that both patients were immunocompromised, which is a contraindication for receiving the transplant. The stool they received was believed to contain antibiotic-resistant E. coli bacteria.

As a result, the FDA is suspending a number of fecal transplant clinical trials until it can be determined how stool is being tested for contamination with potentially deadly bacteria and why the E. coli was not detected. The stool that infected both patients came from the same donor.

Fecal transplants are considered an experimental treatment for C. difficile infection when first-line treatments like antibiotics are ineffective. The transplant material is usually introduced into the digestive tract via pills or an infusion.

[h/t The New York Times]
