4 Diseases Caused by a Lack of Essential Vitamins and Minerals

Courtesy of Retronaut.com

Companies pushing products with added vitamins and minerals can fool people into thinking that they’re eating a “healthy” food when they’re not—but it’s not like those vitamins and minerals are there for no reason. For much of human history, diseases of nutrient deficiency were the norm, and in some parts of the world, they still persist. Even into the 20th century, conditions caused by a lack of certain vitamins or minerals were endemic to North America and Europe. Artificially added nutrients may not make a food “healthy,” but they do stave off several debilitating, and sometimes fatal, diseases of malnutrition. Here are a few of those maladies.

1. Scurvy

Courtesy of The Diseases of Infancy and Childhood

The disease of pirates: the grey death. Scurvy is caused by a lack of vitamin C, whose chemical name, ascorbic acid, derives from scorbutus, the Latin word for scurvy. Although the disease had been known since ancient times (Hippocrates described it around 400 BCE), it was not a scourge to those who were largely land-bound. Its causes were unknown, but many cultures realized that eating certain herbs could reverse the symptoms, and as long as people had access to fresh food, the disease was generally kept under control.

Scurvy didn’t become a significant problem until the Age of Discovery (beginning in the 15th century), when people at sea were not able to access that much-needed fresh food for months at a time. Preserved meats and carbohydrates contained no vitamin C, and unlike most animals, the human body is not able to create vitamin C on its own.

The early symptoms of scurvy include spongy gums, joint pain, and blood spots appearing under the skin. As the disease progresses, the teeth loosen, extreme halitosis (bad breath) develops, and the afflicted become too weak to walk or work and in too much pain to eat; victims often died "mid-sentence," frequently from a burst blood vessel. Many of the early explorers lost great numbers of men to scurvy: Vasco da Gama lost 116 of 170 men in 1499, and Magellan lost 208 of 230 in 1520. A few of those deaths were attributable to other causes, but the vast majority were due to scurvy.

Courtesy of TaussMarine.com

Though he could not pinpoint the exact cause of scurvy, 18th-century naval physician James Lind proved, in what's considered one of the first controlled clinical trials, that the disease could be prevented (and cured) by incorporating citrus fruits such as oranges and lemons into sailors' diets. His findings weren't widely accepted at first, but the British Navy eventually began issuing standard rations of lemon juice, and later lime juice, to its sailors, a practice that gave rise to the term "limey" for the British.

These days, scurvy is an extremely rare condition, almost exclusively caused by someone eating a completely unvaried diet. In most cases, high levels of oral supplementation of vitamin C are enough to reverse the condition in a matter of weeks, and death by scurvy is almost unheard of.

2. Rickets

Courtesy of Blatner.com

This condition is brought on by a lack of vitamin D, which leaves the body unable to absorb or deposit calcium. Less commonly, it can also be caused by a lack of calcium or phosphorus, but vitamin D deficiency is by far the most common cause. Unlike vitamin C, vitamin D can be produced by the human body, but only if the metabolic precursors are available to it.

When skin is exposed to ultraviolet light (such as sunlight), a cholesterol derivative in the skin reacts to form cholecalciferol (vitamin D3), which is then processed in the liver and kidneys to create the active form of vitamin D. Even with a nominally healthy diet, the body cannot complete this process without enough sun exposure. Vitamin D deficiency is actually re-emerging as a health concern among some increasingly indoor groups of people, and it is one of the few hypovitaminosis (vitamin-lack) conditions not considered a "disease of the past." Luckily, once the deficiency is recognized, cholecalciferol can be taken directly as a supplement or acquired from eating organ meats and oils, such as cod liver oil, allowing the body to resume producing active vitamin D.

Rickets is a condition of children, as the deficiency's most severe effects are on developing bones; in adults, the same deficiency causes "bone-softening," or osteomalacia. In adults, though, the condition takes significantly longer to develop and tends to produce tip-off signs that something is wrong before bone warping sets in, such as extreme bone pain and unexplained muscle weakness. In children, especially those who don't or can't receive regular check-ups, the deformity and debilitation are often noticed only after significant damage has been done to the developing skeleton.

The most telling symptoms of rickets appear at the epiphyses (growth plates) of the bones: Unable to deposit calcium to lengthen bones normally, the body produces bones that flare outward in a "cupping" appearance. This leads to costochondral swelling (the "rachitic rosary" along the child's ribcage) as well as widened wrists and "thick" joints. Even before these appear, softening of the skull bones can produce caput quadratum, a square-headed appearance that is often the first sign of skeletal growth problems. Left untreated, rickets can also cause an extremely curved back, stunted growth, and frequent fractures, all of which can lead to permanent and debilitating deformity.

3. Beriberi

Courtesy of Wikimedia Commons

This condition is largely confined to Asia, especially countries where polished rice is a staple. The Sinhalese term "beri-beri" means "I cannot, I cannot," a reference to the inability to perform even the simplest tasks once the polyneuritis (nerve inflammation) caused by vitamin B1 (thiamine) deficiency has progressed to its end stage and permanently damaged the neurons.

Although beriberi had been known in rice-eating countries for centuries, its prevalence boomed with the introduction of steam-driven rice-polishing mills from Europe. The superior taste of milled white rice led many people to abandon local unpolished brown rice, and, in doing so, to abandon their primary source of thiamine. From the 1860s to the turn of the 20th century, people whose plant consumption was limited to polished white rice would often come down with weakness, pain, weight loss, difficulty walking, and emotional disturbances. Beriberi became one of the leading causes of mortality in the region.

In the 1880s, a doctor named Christiaan Eijkman began researching the causes of this epidemic at a laboratory in Batavia, Dutch East Indies (now Jakarta, Indonesia), initially believing that the condition was caused by a bacterial infection. After years of study, however, he came to the conclusion that "white rice is poisonous." He discovered this by feeding one group of chickens solely white rice and another group unpolished brown rice. The chickens that ate the white rice came down with beriberi-like symptoms, while the others stayed healthy. Eijkman also discovered that when the chickens fed white rice were subsequently fed brown rice, they recovered from their illness! Later dietary testing on prisoners confirmed his results. Even though he didn't know the underlying cause of the condition, Eijkman proved that white rice was the culprit, and he shared the 1929 Nobel Prize in Medicine for his discovery.

Beriberi is occasionally seen in the modern world, but its primary cause is chronic alcoholism—the poor diets of some chronic alcoholics, combined with the decreased absorption of what thiamine is consumed, leads to symptoms that unfortunately are sometimes left undiagnosed until it’s too late. Recently, beriberi was also seen in Haitian prisons, when the prison system began buying imported polished rice from the United States, and stopped feeding their inmates the local brown rice.

4. Pellagra

Courtesy of Open Library

What causes blistering of the skin in the sun, pale skin, a craving for raw meat, blood dripping from the mouth, aggression, and insanity? If you answered “vampirism,” you’re close—the myth of the vampire may have its roots in the condition known as “pellagra.”

Pellagra is caused by a lack of vitamin B3 (niacin). First identified and commonly diagnosed in the Spanish region of Asturias, it was originally called "Asturian leprosy." However, the condition was seen throughout Europe, the Middle East, and North Africa, appearing wherever a large percentage of food energy was derived from corn and fresh meat was unavailable. Prevalence was highest in Northern Italy, where Francesco Frapolli of Milan called it "pelle agra," meaning "sour skin."

It was initially believed that either the corn itself, or some insect associated with corn, was causing pellagra. This belief was reinforced when much of France eliminated corn as a food staple and virtually eradicated the condition. Between the era that corn was introduced to Europe (the early 16th century) and the late 19th century, pellagra was found almost everywhere that poor people subsisted on cornmeal and little else.

Around the turn of the 20th century, people began to notice that despite subsisting on just as much corn as poor Europeans, poor Mesoamerican natives didn't come down with the condition. It was eventually discovered that this was because the traditional processing of corn in the Americas involved "nixtamalization": soaking the kernels in an alkaline limewater solution before hulling. The alkaline treatment freed up the niacin that was present in the grain but otherwise nutritionally inaccessible.

Despite the extensive work of Dr. Joseph Goldberger in the 1910s and 1920s, which proved that pellagra wasn't caused by a germ but by a dietary deficiency, the condition continued to occur in epidemic proportions in the rural Southern US until the 1940s.

Today, pellagra is most common in the poorest regions of the world, especially places that rely upon food aid programmes. Some countries still ship unfortified cornmeal rather than corn masa (nixtamalized corn) or fortified cornmeal to developing countries or to their own impoverished populations. China, parts of Africa, Indonesia, and North Korea all have endemic pellagra among their lowest classes.

*******

The discovery of important vitamins and how to produce them has been so significant to human health that many of those who were integral to the discoveries have been awarded the Nobel Prize in Medicine; more than 10 Nobel Prizes have been divided among almost 20 eminent scientists for the discovery or isolation of vitamins A, B1, B12, C, D, E, and K. Over the second half of the 20th century, after the beginning of widespread supplementation to everyday food items, the incidences of the conditions covered here went down dramatically across much of the world.

Of course, the minerals essential to the human body play similarly important roles in maintaining health. However, humans have not historically had a widespread problem acquiring these nutrients, as most plants absorb many minerals from the soil. With the increased processing of our food throughout the 20th century, however, some of these minerals have been lost and have had to be re-added to the average Western diet through supplementation. In the rest of the world, displacement due to war, and unfortified food from aid programmes, has left survivors with enough calories but not enough nutrients. Supplementation of assistance food and local fortification of salt and flour are beginning to give displaced people (especially displaced children) a new chance at life without these and other nutritional diseases.

In the developed world, you won’t be the healthiest bloke on the block if you eat nothing but breakfast cereal and cartons of juice—but the food industry has ensured that you at least won’t die of malnutrition. Even people with healthy diets benefit from the supplementation of vitamins and minerals in common foodstuffs, and adding the nutrients costs next to nothing. Doctors and nutritionists still agree that the healthiest way to acquire your necessary vitamins and minerals is by eating a balanced diet and spending time outdoors each day, but in the course of modern life, that’s not always possible, and if people are going to eat poorly either way, we may as well keep them from dropping dead of scurvy!


