4 Diseases Caused by a Lack of Essential Vitamins and Minerals

Companies pushing products with added vitamins and minerals can fool people into thinking that they’re eating a “healthy” food when they’re not—but it’s not like those vitamins and minerals are there for no reason. For much of human history, diseases of nutrient deficiency were the norm, and in some parts of the world, they still persist. Even into the 20th century, conditions caused by a lack of certain vitamins or minerals were endemic to North America and Europe. Artificially added nutrients may not make a food “healthy,” but they do stave off several debilitating, and sometimes fatal, diseases of malnutrition. Here are a few of those maladies.

1. Scurvy


[Image courtesy of The Diseases of Infancy and Childhood]

The disease of pirates: the grey death. Scurvy is caused by a lack of vitamin C, whose chemical name, ascorbic acid, is derived from the Latin term for scurvy, scorbutus. Although the disease had been known since ancient times (Hippocrates described it around 400 BCE), it was not a scourge for people who were largely land-bound. Even though its causes were unknown, many cultures realized that eating certain herbs could reverse the symptoms, and as long as there was access to fresh food, scurvy was generally kept under control.

Scurvy didn’t become a significant problem until the Age of Discovery (beginning in the 15th century), when people at sea went without that much-needed fresh food for months at a time. Preserved meats and carbohydrates contain no vitamin C, and unlike most other animals, humans cannot make vitamin C on their own.

The early symptoms of scurvy include spongy gums, joint pain, and blood spots appearing under the skin. As the disease progresses, the teeth become loose, extreme halitosis (bad breath) develops, and the afflicted become too weak to walk or work and too pained to eat; death often comes “mid-sentence,” frequently from a burst blood vessel. Many of the early explorers lost great numbers of men to scurvy: Vasco da Gama lost 116 of 170 men in 1499, and Magellan lost 208 of 230 in 1520. A few of those deaths were attributable to other causes, but the vast majority were due to scurvy.

[Image courtesy of TaussMarine.com]

Despite not being able to pinpoint the exact cause of scurvy, the 18th-century naval physician James Lind proved, in what is often considered the first controlled clinical trial, that scurvy could be prevented (and cured) by adding citrus fruits such as oranges and lemons to sailors’ diets. Although his findings weren’t widely accepted at first, the British Navy eventually began issuing standard rations of lemon juice, and later lime juice, to its sailors, a practice that gave rise to the term “limey” in reference to the British.

These days, scurvy is an extremely rare condition, almost exclusively caused by someone eating a completely unvaried diet. In most cases, high levels of oral supplementation of vitamin C are enough to reverse the condition in a matter of weeks, and death by scurvy is almost unheard of.

2. Rickets

[Image courtesy of Blatner.com]

This condition is brought on by a lack of vitamin D, which leaves the body unable to absorb or deposit calcium. Less commonly, it can also be caused by a lack of calcium or phosphorus, but vitamin D deficiency is by far the most common cause. Unlike vitamin C, vitamin D can be produced by the human body, but only if the metabolic precursors are available.

When the skin is exposed to ultraviolet light (such as from the sun), cholesterol in the skin reacts to form cholecalciferol, which is then processed in the liver and kidneys to create the active form of vitamin D. Even with a nominally healthy diet, the body can’t produce those vitamin D precursors without enough sun exposure. Vitamin D deficiency is actually re-emerging as a health concern among some increasingly indoor groups of people, and it is one of the few hypovitaminosis (vitamin deficiency) conditions not considered a “disease of the past.” Luckily, once the deficiency is recognized, cholecalciferol can be taken directly as a supplement or acquired from organ meats and oils, such as cod liver oil, allowing the body to resume producing active vitamin D.

Rickets is a condition of children, as the deficiency’s most severe effects are on developing bones; in adults, the same deficiency causes “bone-softening,” or osteomalacia. In adults, though, the condition takes significantly longer to develop and tends to produce tip-off signs that something is wrong, such as extreme bone pain and unexplained muscle weakness, before bone warping sets in. In children, especially those who don’t or can’t receive regular check-ups, the deformity and debilitation are often noticed only after significant damage has been done to their developing skeletons.

The most telling symptoms of rickets appear at the epiphyses (growth plates) of the bones: unable to lengthen bones by depositing calcium, the body ends up with bones that flare outward in a “cupping” appearance. This leads to costochondral swelling, known as the “rachitic rosary,” along the child’s ribcage, as well as widened wrists and “thick” joints. Before the widened wrists or rachitic rosary appear, softening of the skull bones can lead to caput quadratum, a square-headed appearance that is often the first sign of skeletal growth problems. If left untreated, rickets can also cause an extremely curved back, stunted growth, and frequent fractures, all of which can lead to permanent and debilitating deformity.

3. Beriberi

[Image courtesy of Wikimedia Commons]

This condition was historically largely confined to Asia, especially countries where rice is the staple food. The Sinhalese term “beri-beri” means “I cannot, I cannot,” a reference to the disease’s end stage, when the polyneuritis (nerve inflammation) caused by a deficiency of vitamin B1 (thiamine) has permanently damaged the nerves and left sufferers unable to perform even the simplest tasks.

Although beriberi had been known in rice-eating countries for centuries, its prevalence boomed with the introduction of steam-driven rice-polishing mills from Europe. The superior taste of milled white rice led many people to abandon traditional (unpolished) brown rice, and in doing so, to abandon their primary source of thiamine. From the 1860s to the turn of the 20th century, people whose plant consumption was limited to polished white rice would often come down with weakness, pain, weight loss, difficulty walking, and emotional disturbances. Beriberi became one of the leading causes of death in the region.

In the 1880s, a doctor named Christiaan Eijkman began researching the causes of the epidemic at a laboratory in Batavia, in the Dutch East Indies (now Jakarta, Indonesia). He initially believed that the condition was caused by a bacterial infection, but after years of study he came to the conclusion that “white rice is poisonous.” He showed this by feeding one group of chickens solely white rice and another group unpolished brown rice: the chickens that ate the white rice came down with beriberi-like symptoms, while the others stayed healthy. Eijkman also found that when the chickens fed white rice were switched to brown rice, they recovered from their illness. Later dietary testing on prisoners confirmed his results. Even though he didn’t know the underlying cause of the condition, Eijkman proved that white rice was the culprit, and he shared the 1929 Nobel Prize in Medicine for his discovery.

Beriberi is occasionally seen in the modern world, but its primary cause today is chronic alcoholism: the poor diets of some chronic alcoholics, combined with decreased absorption of what thiamine they do consume, lead to symptoms that are sometimes left undiagnosed until it’s too late. Beriberi was also seen recently in Haitian prisons, after the prison system began buying imported polished rice from the United States and stopped feeding inmates the local brown rice.

4. Pellagra

[Image courtesy of Open Library]

What causes blistering of the skin in the sun, pale skin, a craving for raw meat, blood dripping from the mouth, aggression, and insanity? If you answered “vampirism,” you’re close—the myth of the vampire may have its roots in the condition known as “pellagra.”

Pellagra is caused by a lack of vitamin B3 (niacin). First identified and commonly diagnosed in Asturias, in what is now northern Spain, it was originally called “Asturian leprosy.” However, the condition was seen throughout Europe, the Middle East, and North Africa, wherever a large percentage of food energy was derived from corn and fresh meat was not available. The area of highest prevalence was northern Italy, where Francesco Frapolli of Milan called it “pelle agra,” meaning “sour skin.”

It was initially believed that either the corn itself or some insect associated with corn was causing pellagra. This belief was reinforced when much of France eliminated corn as a food staple and virtually eradicated the condition. Between the time corn was introduced to Europe (the early 16th century) and the late 19th century, pellagra was found almost everywhere that poor people subsisted on cornmeal and little else.

Around the turn of the 20th century, people began to notice that poor Mesoamerican natives, despite subsisting on just as much corn as poor Europeans, didn’t come down with the condition. It was eventually discovered that this was because the traditional processing of corn in the Americas involved “nixtamalization,” in which the kernels are soaked in limewater before being hulled. The alkaline solution frees up the niacin that is present in the grain but otherwise nutritionally unavailable.

Despite the extensive work of Dr. Joseph Goldberger in the 1910s and 1920s, which proved that pellagra was caused not by a germ but by a dietary deficiency, the condition continued to occur in epidemic proportions in the rural southern US until the 1940s.

Today, pellagra is most common in the poorest regions of the world, especially places that rely on food aid programs. Some countries still ship unfortified cornmeal, rather than corn masa (nixtamalized corn) or fortified cornmeal, to developing countries or to their own impoverished populations. China, parts of Africa, Indonesia, and North Korea all have endemic pellagra among their poorest populations.

*******

The discovery of important vitamins, and of how to produce them, has been so significant to human health that many of the scientists integral to those discoveries have been awarded the Nobel Prize in Medicine; more than 10 Nobel Prizes have been divided among almost 20 eminent scientists for the discovery or isolation of vitamins A, B1, B12, C, D, E, and K. Over the second half of the 20th century, after widespread fortification of everyday foods began, the incidence of the conditions covered here dropped dramatically across much of the world.

Of course, the minerals essential to the human body play similarly important roles in maintaining health. Historically, however, humans have not had a widespread problem acquiring these nutrients, since most plants absorb minerals from the soil. With the increased processing of our food throughout the 20th century, some of these minerals have been lost and have had to be re-added to the average Western diet through supplementation. In the rest of the world, displacement due to war, along with unfortified food from aid programs, has left survivors with enough calories but not enough nutrients. Fortifying assistance food, along with local fortification of salt and flour, is beginning to give displaced people (especially displaced children) a new chance at life without these and other nutritional diseases.

In the developed world, you won’t be the healthiest bloke on the block if you eat nothing but breakfast cereal and cartons of juice—but the food industry has ensured that you at least won’t die of malnutrition. Even people with healthy diets benefit from the supplementation of vitamins and minerals in common foodstuffs, and adding the nutrients costs next to nothing. Doctors and nutritionists still agree that the healthiest way to acquire your necessary vitamins and minerals is by eating a balanced diet and spending time outdoors each day, but in the course of modern life, that’s not always possible, and if people are going to eat poorly either way, we may as well keep them from dropping dead of scurvy!

Scientists Think They Know What Causes Trypophobia

Picture a boat hull covered with barnacles, a dried lotus seed pod, milk bubbles on a latte, or a honeycomb. Images of these objects are harmless—unless you're one of the millions of people suffering from trypophobia. Then they're likely to induce intense disgust, nausea, and fear, and make your skin crawl.

Coined fairly recently, the term trypophobia describes the fear of clusters of holes. The phobia isn’t recognized by the Diagnostic and Statistical Manual of Mental Disorders, but its visibility on the internet suggests that for many, it’s very real. Now, scientists in the UK think they've pinpointed the evolutionary mechanism behind the reaction.

Tom Kupfer of the University of Kent and An T. D. Le of the University of Essex shared their findings in the journal Cognition and Emotion. According to their research, trypophobia evolved as a way to avoid infectious disease. Thousands of years ago, if you saw a person covered in boils or a body covered in flies, a natural aversion to the sight would have helped you avoid catching whatever they had.

Being disgusted by skin riddled with pathogens or parasites doesn’t by itself mean you’re trypophobic; after all, keeping your distance from potential infection is smart. Trypophobia, though, seems to misplace that reaction, as the authors write: "Trypophobia may be an exaggerated and overgeneralized version of this normally adaptive response."

[Image: Lotus seed pods are a common trigger of trypophobia.]

This explanation is not entirely new, but until now little research has been done into whether it's accurate. To test their hypothesis, the scientists recruited 376 self-described trypophobes from online forums, and another 304 college students who didn't claim to have the affliction. Both groups were shown two sets of images: The first depicted clusters of circle-shaped marks on animals and human body parts (the "disease-relevant cluster images"); the second showed clusters of holes on inanimate objects like bricks and flower pods ("disease-irrelevant cluster images"). While both groups reported feeling repulsed by the first collection of photographs, only the trypophobes felt the same about the pictures that had nothing to do with infection.

Another takeaway from the study is that trypophobia is more related to sensations of disgust than fear. This sets it apart from more common phobias like arachnophobia (fear of spiders) or acrophobia (fear of heights). And you don't have to be trypophobic to be disgusted by a video of Suriname toadlets being born through holes in their mother's back. We can all be grossed out by that.

Researchers Say You’re Exercising More Than You Think

They say a journey of a thousand miles starts with a single step. If the thought of a thousand-mile journey makes you tired, we've got some great news for you: You've probably already completed one.* A new study published in the journal Health Psychology [PDF] finds that people underestimate the amount of exercise they're getting—and that this underestimation could be harmful.

Psychologists at Stanford University pulled data on 61,141 American adults from two huge studies conducted in the 1990s and the early 2000s: the National Health Interview Survey and the National Health and Nutrition Examination Survey. Participants answered questionnaires about their lifestyles, health, and exercise habits, and some wore accelerometers to track their movement. Everybody was asked one key question: "Would you say that you are physically more active, less active, or about as active as other persons your age?"

The researchers then tapped into the National Death Index through 2011 to find out which of the participants were still alive 10 to 20 years later.

Combining these data sources yielded two interesting facts. First, many participants believed themselves to be less active than they actually were. Second, and more surprisingly, people who rated themselves as "less active" were more likely to die, even when their actual activity rates told a different story. The reverse was also true: people who overestimated their exercise had lower mortality rates.

There are many reasons this could be the case. Depression and other mental illnesses can certainly influence both our self-perception and our overall health. The researchers attempted to control for this variable by checking participants' stress levels and asking if they'd seen a mental health professional in the last year. But not everybody who needs help can get it, and many people could have slipped through the cracks.

Paper authors Octavia Zahrt and Alia Crum have a different hypothesis. They say our beliefs about exercise could actually affect our risk of death. "Placebo effects are very robust in medicine," Crum said in a statement. "It is only logical to expect that they would play a role in shaping the benefits of behavioral health as well."

The data suggest that our ideas about exercise and exercise itself are two very different things. If all your friends are marathoners and mountain climbers, you might feel like a sloth—even if you regularly spend your lunch hour in yoga class.

Crum and Zahrt say we could all benefit from relaxing our definition of "exercise."

"Many people think that the only healthy physical activity is vigorous exercise in a gym or on a track," Zahrt told Mental Floss in an email. "They underestimate the importance of just walking to the store, taking the stairs, cleaning the house, or carrying the kids."
 
*The average American takes about 5000 steps per day, or roughly 2.5 miles. At that pace, it would take just a little over a year to walk 1000 miles.
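For readers who want to see the arithmetic behind that footnote, here is a quick back-of-the-envelope check, taking the 5000-steps-per-day and 2.5-miles-per-day figures above as given:

$$\frac{1000\ \text{miles}}{2.5\ \text{miles per day}} = 400\ \text{days} \approx 1.1\ \text{years}$$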
