4 Diseases Caused by a Lack of Essential Vitamins and Minerals

Companies pushing products with added vitamins and minerals can fool people into thinking that they’re eating a “healthy” food when they’re not—but it’s not like those vitamins and minerals are there for no reason. For much of human history, diseases of nutrient deficiency were the norm, and in some parts of the world, they still persist. Even into the 20th century, conditions caused by a lack of certain vitamins or minerals were endemic to North America and Europe. Artificially added nutrients may not make a food “healthy,” but they do stave off several debilitating, and sometimes fatal, diseases of malnutrition. Here are a few of those maladies.

1. Scurvy

The disease of pirates: the grey death. Scurvy is caused by a lack of vitamin C, whose chemical name, ascorbic acid, is derived from the Latin term for scurvy, scorbutus. The disease had been known since ancient times (Hippocrates described it around 400 BCE), but it was never a major scourge for people who remained largely land-bound. Although its cause was unknown, many cultures realized that eating certain herbs could reverse the symptoms, and as long as fresh food was available, the condition was generally kept under control.

Scurvy didn’t become a significant problem until the Age of Discovery (beginning in the 15th century), when people at sea were not able to access that much-needed fresh food for months at a time. Preserved meats and carbohydrates contained no vitamin C, and unlike most animals, the human body is not able to create vitamin C on its own.

The early symptoms of scurvy include spongy gums, joint pain, and blood spots appearing under the skin. As the disease progressed, the teeth would become loose, extreme halitosis (bad breath) would develop, and the afflicted would grow too weak to walk or work, be in too much pain to eat, and die “mid-sentence,” often from a burst blood vessel. Many of the early explorers lost great numbers of men to scurvy: Vasco da Gama lost 116 of 170 men in 1499, and in 1520, Magellan lost 208 of 230. A few of those deaths were attributable to other causes, but the vast majority were due to scurvy.

Though he couldn't pinpoint the exact cause of scurvy, 18th-century naval physician James Lind proved, in what's often considered the first controlled clinical trial, that scurvy could be prevented (and cured) by adding citrus fruits such as oranges and lemons to sailors' diets. Although his findings weren't widely accepted at first, the British Navy eventually began issuing standard rations of lemon juice, and later lime juice, to its sailors—which gave rise to the term “limey” for the British.

These days, scurvy is an extremely rare condition, almost exclusively caused by someone eating a completely unvaried diet. In most cases, high levels of oral supplementation of vitamin C are enough to reverse the condition in a matter of weeks, and death by scurvy is almost unheard of.

2. Rickets

This condition is brought on by a lack of vitamin D, which causes the body to be unable to absorb or deposit calcium. Less commonly, it can also be caused by a lack of calcium or phosphorus, but vitamin D deficiency is by far the most common cause. Unlike vitamin C, the human body is able to produce vitamin D, but only if it has the metabolic precursors available to it.

When the skin is exposed to ultraviolet light (such as from the sun), a cholesterol derivative in the skin reacts to form cholecalciferol, which is then processed in the liver and kidneys to create the active form of vitamin D. Even with a nominally healthy diet, without enough sun exposure the body can't carry out that first step on its own. Vitamin D deficiency is actually re-emerging as a health concern among some increasingly indoor groups of people, and it is one of the few hypovitaminosis (vitamin deficiency) conditions not considered a “disease of the past.” Luckily, once the deficiency is recognized, cholecalciferol can be taken directly as a supplement or acquired by eating organ meats and oils, such as cod liver oil, allowing the body to resume producing active vitamin D.

Rickets is a condition of children, since the deficiency's most severe effects are on developing bones; in adults, the same deficiency can cause “bone-softening,” or osteomalacia. But in adults it takes significantly longer to develop, and it tends to produce tip-off signs that something is wrong, such as severe bone pain and unexplained muscle weakness, before any bone warping sets in. In children, especially those who don't or can't receive regular check-ups, the deformity and debilitation are often noticed only after significant damage has been done to their developing skeletons.

The most telling symptoms of rickets appear at the epiphyses (growth plates) of the bones: Unable to lengthen bones by depositing calcium, the body ends up with bones that flare outward in a “cupping” appearance. This leads to costochondral swelling, the “rachitic rosary” of bumps along the child's ribcage, as well as widened wrists and “thick” joints. Even before the widened wrists or rachitic rosary appear, softening of the skull bones can produce caput quadratum, a square-headed appearance that is often the first sign of skeletal growth problems. Left untreated, rickets can also cause an extremely curved back, stunted growth, and frequent fractures, all of which can lead to permanent and debilitating deformity.

3. Beriberi

This condition is largely confined to Asia, especially countries where boiled rice is a staple. The Sinhalese term “beri-beri” means “I cannot, I cannot,” a reference to the sufferer's inability to perform even the simplest tasks once the polyneuritis (nerve inflammation) caused by a deficiency of vitamin B1 (thiamine) has permanently damaged the nerves in the disease's end stage.

Although beriberi had been known in rice-eating countries for centuries, its prevalence boomed with the introduction of steam-driven rice-polishing mills from Europe. The superior taste of milled white rice led many locals to abandon the local unpolished brown rice, and in doing so, to abandon their primary source of thiamine. From the 1860s to the turn of the 20th century, people whose plant consumption was limited to polished white rice would often come down with weakness, pain, weight loss, difficulty walking, and emotional disturbances. Beriberi became one of the leading causes of death in the region.

In the 1880s, a doctor named Christiaan Eijkman began researching the causes of this epidemic at a laboratory in Batavia, Dutch East Indies (now Jakarta, Indonesia), initially believing that the condition was caused by a bacterial infection. After years of study, however, he came to the conclusion that “white rice is poisonous.” He demonstrated this by feeding one group of chickens nothing but white rice and another group unpolished brown rice: The chickens that ate the white rice came down with beriberi-like symptoms, while the others stayed healthy. Eijkman also found that when the white-rice chickens were switched to brown rice, they recovered from their illness. Later dietary testing on prisoners confirmed his results. Even though he didn't know the underlying cause of the condition, Eijkman proved that white rice was the culprit, and he shared the 1929 Nobel Prize in Medicine for the discovery.

Beriberi is occasionally seen in the modern world, but its primary cause now is chronic alcoholism: the poor diets of some chronic alcoholics, combined with decreased absorption of what little thiamine they do consume, lead to symptoms that are sometimes left undiagnosed until it's too late. More recently, beriberi also appeared in Haitian prisons when the prison system began buying imported polished rice from the United States and stopped feeding inmates the local brown rice.

4. Pellagra

What causes blistering of the skin in the sun, pale skin, a craving for raw meat, blood dripping from the mouth, aggression, and insanity? If you answered “vampirism,” you’re close—the myth of the vampire may have its roots in the condition known as “pellagra.”

Pellagra is caused by a lack of vitamin B3 (niacin). First identified and commonly diagnosed in Asturias (in northern Spain), it was originally called “Asturian leprosy.” However, the condition was seen throughout Europe, the Middle East, and North Africa, wherever a large percentage of food energy was derived from corn and fresh meat was not available. The area of highest prevalence was northern Italy, where Francesco Frapoli of Milan called it “pelle agra,” meaning “sour skin.”

It was initially believed that either the corn itself or some insect associated with corn was causing pellagra, a belief reinforced when much of France eliminated corn as a food staple and virtually eradicated the condition. From the time corn was introduced to Europe in the early 16th century until the late 19th century, pellagra was found almost everywhere that poor people subsisted on cornmeal and little else.

Around the turn of the 20th century, people began to notice that although poor Mesoamerican natives subsisted on just as much corn as poor Europeans, they didn't come down with the condition. It was eventually discovered that this was because the traditional processing of corn in the Americas involved “nixtamalization,” in which the kernels are soaked in limewater before being hulled. The alkaline solution frees up the niacin that is present in the grain but otherwise inaccessible to the body.

Despite the extensive work of Dr. Joseph Goldberger in the 1910s and 1920s, which proved that pellagra was caused not by a germ but by a dietary deficiency, the condition continued to occur in epidemic proportions in the rural southern United States until the 1940s.

Today, pellagra is most common in the poorest regions of the world, especially places that rely upon food aid programs. Some countries still ship unfortified cornmeal rather than corn masa (nixtamalized corn) or fortified cornmeal to developing countries or to their own impoverished populations. China, parts of Africa, Indonesia, and North Korea all have endemic pellagra among their lowest classes.

*******

The discovery of important vitamins and how to produce them has been so significant to human health that many of the scientists integral to those discoveries have been awarded the Nobel Prize in Medicine; more than 10 Nobel Prizes have been divided among almost 20 eminent scientists for the discovery or isolation of vitamins A, B1, B12, C, D, E, and K. Over the second half of the 20th century, after widespread supplementation of everyday foods began, the incidence of the conditions covered here dropped dramatically across much of the world.

Of course, the minerals essential to the human body play similarly important roles in maintaining health. Historically, however, humans have not had widespread trouble acquiring these nutrients, since most plants absorb minerals from the soil. With the increased processing of food throughout the 20th century, some of these minerals have been lost and have had to be added back to the average Western diet through supplementation. Elsewhere in the world, displacement due to war, and unfortified food from aid programs, have left survivors with enough calories but not enough nutrients. Supplementation of assistance food and local fortification of salt and flour are beginning to give displaced people (especially displaced children) a new chance at life without these and other nutritional diseases.

In the developed world, you won’t be the healthiest bloke on the block if you eat nothing but breakfast cereal and cartons of juice—but the food industry has ensured that you at least won’t die of malnutrition. Even people with healthy diets benefit from the supplementation of vitamins and minerals in common foodstuffs, and adding the nutrients costs next to nothing. Doctors and nutritionists still agree that the healthiest way to acquire your necessary vitamins and minerals is by eating a balanced diet and spending time outdoors each day, but in the course of modern life, that’s not always possible, and if people are going to eat poorly either way, we may as well keep them from dropping dead of scurvy!

7 Ways Victorian Fashion Could Kill You

An 1862 engraving showing a skeleton gentleman at a ball asking a skeleton lady to dance, meant to represent the effect of arsenic dyes and pigments in clothing and accessories.

While getting dressed in the morning can seem like a hassle (pajamas are so much more comfortable), few of us worry about our clothes leading to our death. That wasn’t the case during the Victorian era, when fashionable fabrics and accessories sometimes came at a great price for both makers and wearers. In Fashion Victims: The Dangers of Dress Past and Present, Alison Matthews David, a professor in the School of Fashion at Ryerson University in Toronto, outlines the many toxic, flammable, and otherwise highly hazardous components of high style during the 19th century. Here are a few of the worst offenders.

1. Poisonous Dyes

A drawing of Victorian fashions likely made with arsenic dyes

Before the 1780s, green was a tricky color to create on clothes, and dressmakers depended on a combination of yellow and blue dyes to produce the hue. But in the late 1770s, a Swedish-German chemist named Carl Wilhelm Scheele invented a new green pigment by mixing potassium and white arsenic in a solution of copper vitriol. The pigment was dubbed Scheele’s Green, and later Paris Green, among other names, and it became a huge sensation, used to color walls, paintings, and fabrics as well as candles, candies, food wrappers, and even children’s toys. Not surprisingly, it also caused sores, scabs, and damaged tissue, as well as nausea, colic, diarrhea, and constant headaches.

Although fashionable women wore arsenic-dyed fabrics—even Queen Victoria was depicted in one—its health effects were worst among the textile and other workers who created the clothes and often labored in warm, arsenic-impregnated rooms day after day. (Some scholars have even theorized that Napoleon might have been poisoned by the arsenic-laced wallpaper hung in his St. Helena home.)

Arsenical dyes were also a popular addition to artificial flowers and leaves, which meant they were frequently pinned to clothes or fastened to heads. In the 1860s, a report commissioned by the Ladies’ Sanitary Association found that the average headdress contained enough arsenic to poison 20 people. The British Medical Journal wrote of the green-clad Victorian woman: “She actually carries in her skirts poison enough to slay the whole of the admirers she may meet with in half a dozen ball-rooms.” Despite repeated warnings in the press and from doctors and scientists, the Victorians seemed to be in love with emerald green arsenic dyes; ironically, David says, the dyes acted as a reminder of the natural world then swiftly being lost to industrialization.

2. Pestilential Fabrics

Soldiers of the Victorian era (and earlier) were plagued by lice and other body parasites that carried deadly diseases such as typhus and trench fever. But soldiers weren’t the only victims of disease carried via fabric: Even the wealthy sometimes wore clothing that had been made or cleaned by the sick in sweatshops or tenements, and which spread disease as a result. According to David, the daughter of Victorian Prime Minister Sir Robert Peel contracted typhus from a riding habit, a gift from her father, that had been finished in the house of a poor seamstress who used it to cover her husband as he lay shivering with typhus-induced chills. Peel’s daughter died on the eve of her wedding.

Women also worried about their skirts sweeping through the muck and excrement of city streets, where bacteria were rife, and some wore special skirt-fasteners to keep their hems up out of the gunk. The poor, who often wore secondhand clothes, suffered from smallpox and other diseases spread by fabric that was recycled without being properly washed.

3. Flowing Skirts

Giant, ruffled, crinoline-supported skirts may have been fine for ladies of leisure, but they weren’t a great combination with industrial machinery. According to David, one mill in Lancashire posted a sign in 1860 forbidding the “present ugly fashion of HOOPS, or CRINOLINE, as it is called” as being “quite unfitted for the work of our Factories.” The warning was a wise one: In at least one printing office, a girl was caught by her crinoline and dragged under the mechanical printing press. The girl was reportedly “very slim” and escaped unharmed, but the foreman banned the skirts anyway. Long, large, or draped skirts were also an unfortunate combination with carriages and animals.

4. Flammable Fabrics

A woman with her crinoline on fire

The flowing white cotton so popular in the late 18th and 19th centuries posed dangers to both maker and wearer: It was produced with often-brutal slave labor on plantations, and it was more flammable than the heavy silks and wools favored by the wealthy in previous centuries. One type of cotton lace was particularly problematic: In 1809, John Heathcoat patented a machine that made the first machine-woven silk and cotton pillow “lace,” or bobbinet, now better known as tulle, which could catch fire in an instant. The tulle was frequently layered to add volume and compensate for its sheerness, and it was stiffened with highly combustible starch. Ballerinas were particularly at risk: British ballerina Clara Webster died in 1844 after her skirt came too close to the sunken stage lights at London’s Drury Lane theatre and her dress caught fire.

But performers weren’t the only ones in peril: Even the average woman wearing the then-popular voluminous crinolines was at risk of setting herself ablaze. And the “flannelette” (plain cotton brushed to create a nap and resemble wool flannel) so popular for nightshirts and undergarments was particularly combustible if hit by a stray spark or the flame of a household candle. So many children burned in household accidents that one company came out with a specially treated flannelette called Non-Flam, advertised as being “strongly recommended by Coroners.”

5. Arsenic-Ridden Taxidermy

Dead birds were a popular addition to ladies’ hats in the 19th century. According to David, “fashions in millinery killed millions of small songbirds and introduced dangers that may still make some historic women’s hats harmful to humans today.”

But it wasn’t the birds that were the problem—it was the arsenic used on them. Taxidermists of the day used arsenic-laced soaps and other products to preserve birds and other creatures. In some cases, entire birds—one or several—were mounted on hats. Some Victorian fashion commentators decried the practice, though not because of the arsenic involved. One Mrs. Haweis, a writer on dress and beauty, began an 1887 diatribe against “smashed birds” with the sentence: “A corpse is never a really pleasant ornament.”

6. Mercury

No upper-class man of the Victorian era was complete without his hat, but many of those hats were made with mercury. As David explains, “Although its noxious effects were known, it was the cheapest and most efficient way to turn stiff, low-grade fur from rabbits and hares into malleable felt.” Mercury gave animal fur its smooth, glossy, matted texture, but that velvety look came at a high cost—mercury is an extremely dangerous substance.

Mercury can rapidly enter the body through the skin or the air, and it causes a range of horrible health effects. Hatters were known to suffer from convulsions, abdominal cramps, trembling, paralysis, reproductive problems, and more. (Karen Wetterhahn, a chemistry professor studying toxic exposure at Dartmouth College, died in 1997 after spilling just a few drops of dimethylmercury, a supertoxic form of the metal, on her glove the previous year.) To make matters worse, hatters who drank while they worked (not an uncommon practice) only hastened mercury’s effects by hampering the liver’s ability to eliminate it. While scholars still debate whether Lewis Carroll’s “mad hatter” was meant to show the effects of mercury poisoning, his trembling limbs and erratic speech seem to fit the bill.

7. Lead

A Victorian facial cream containing lead

Pallor was definitely in during the Victorian era, and fashionable women long favored faces spackled with lead white paint. Lead had been a popular ingredient in cosmetics for centuries, David writes, because it “made colors even and opaque and created a desirable ‘whiteness’ that bespoke both freedom from hard outdoor labor and racial purity.” One of the most popular lead-laced cosmetics was called Laird’s Bloom of Youth; in 1869, one of the founders of the American Medical Association treated three young women who, after using the product, had temporarily lost full use of their hands and wrists. (The doctor described the condition as “lead palsy”; today it’s known as wrist drop or radial nerve palsy, which can be caused by lead poisoning.) One of the women’s hands was said to be “wasted to a skeleton.”

The 25 Highest-Paying Entry-Level Jobs for New Graduates

When they finish their final exams, college seniors can look forward to job hunting. Roughly 1.9 million students in the U.S. will receive their bachelor's degrees this school year, and while some new graduates may be happy to take the first job they're offered, others will be looking for something that pays well—even at the entry level. According to Glassdoor, recent grads qualified for the 25 jobs below will have the best luck.

To compile this list of the highest-paying entry-level jobs in the U.S., the job search website identified positions with the highest median base salaries reported by users aged 25 or younger. Jobs in the tech industry dominate the list. Aspiring data scientists can expect to make $95,000 a year at their first job out of college, while software engineers have a median annual base salary of $90,000. Other entry-level tech jobs, such as UX designer, Java developer, and systems engineer, all start at salaries of $70,000 or more.

Banking and business positions, including investment banking analyst ($85,000), actuarial analyst ($66,250), and business analyst ($63,000), appear on the list as well. The only listed position that doesn't fall under the tech, engineering, finance, or business categories is physical therapist, with a reported median starting salary of $63,918.

You can check out the full list of the 25 highest-paying entry-level jobs below.

  1. Data Scientist // $95,000
  2. Software Engineer // $90,000
  3. Product Manager // $89,000
  4. Investment Banking Analyst // $85,000
  5. Product Designer // $85,000
  6. UX Designer // $73,000
  7. Implementation Consultant // $72,000
  8. Java Developer // $72,000
  9. Systems Engineer // $70,000
  10. Software Developer // $68,600
  11. Process Engineer // $68,258
  12. Front End Developer // $67,500
  13. Product Engineer // $66,750
  14. Actuarial Analyst // $66,250
  15. Electrical Engineer // $66,000
  16. Mechanical Engineer // $65,000
  17. Design Engineer // $65,000
  18. Applications Developer // $65,000
  19. Test Engineer // $65,000
  20. Programmer Analyst // $65,000
  21. Quality Engineer // $64,750
  22. Physical Therapist // $63,918
  23. Field Engineer // $63,750
  24. Project Engineer // $63,000
  25. Business Analyst // $63,000
