11 Items You'll No Longer Find in Medicine Cabinets

Medicines and medical practices have come a long way in a relatively short time. Here are some items that have vanished from medicine cabinets in the last few decades.

1. Mercurochrome

Most folks under the age of 30 have never heard of this topical antiseptic. But many of us Boomers begged Mom to daub our cuts and scrapes with the relatively painless Mercurochrome in lieu of that nasty stinging iodine. Sure, it stained your flesh pinkish-red, but you could wear that stain as a temporary battle scar. The U.S. Food and Drug Administration put very strict limitations on the sale of Mercurochrome in 1998 and stated that it was no longer considered a GRAS (Generally Recognized As Safe) over-the-counter product. Many grandmothers scoffed, "Since when?! I used that stuff for years and none of my kids died!" But more scientific minds agreed that the ban was a wise and "about time!" decision, since the active ingredient in Mercurochrome is a mercury compound.

2. Iodine

Iodine burned like fire when applied to an open wound, mainly because the tincture sold for home use had an alcohol base. Many doctors today use a water-based iodine as an antiseptic, as it has one of the broadest germ-killing spectrums. The skull and crossbones on the label, along with the word POISON in capital letters, probably give a clue as to why this old-school remedy is rarely found in home first aid kits anymore.

3. Mercury Thermometer


Before those convenient in-your-ear digital thermometers hit the market, we had to struggle to keep these heavy glass models under our tongues long enough for the column of mercury inside to register whether or not we were sick enough to stay home from school. My younger brother, Iron Jaws, bit through enough of these that Mom managed to collect a nice-sized blob of mercury that she kept in a bottle for our amusement. Mercury thermometers are still available in the US (they've been banned throughout much of Europe and Asia), but the American Medical Association and the Environmental Protection Agency "strongly recommend" that alternative thermometers be used in the home.

4. Castor Oil

Once upon a time a bottle of vile-tasting castor oil was a staple in every medicine cabinet. For some reason, mothers in the 1920s and '30s used it as a cure-all for any sort of tummy ailment. In reality, the only condition castor oil is suitable to treat is constipation, and even in that case doctors tend to discourage its use, as the results are often unpredictable and can include severe cramping and involuntary explosive bowel movements that last for hours.

5. TB Test


The skin test for tuberculosis was a common annual procedure for all elementary school children in the US during the 1940s, '50s, and '60s. The rate of infection decreased dramatically by the late 1970s, and universal TB testing gradually ceased. By the early 1990s, the American Academy of Pediatrics recommended testing only for at-risk children (immigrants from Mexico, the Philippines, Vietnam, India, and China; kids exposed to IV drug users or adults with HIV). The benefits of targeted testing are proven, but implementing such a procedure without stigmatizing the affected children is difficult, so in some school districts the program is currently in political limbo.

6. Disclosing Tablets


It used to be that once a year the school nurse, usually accompanied by a representative from Colgate or Crest, presented everyone in class with a packet containing a free toothbrush, a tiny tube of toothpaste, and two small red pills. The pills were disclosing tablets, and their purpose was to reveal the disgusting areas of your mouth where plaque was collecting and where you needed to step up your brushing routine lest you end up with dentures in high school. The dental kits are rarely given away as a matter of routine today; thanks to our litigious society, you usually have to ask your dentist or pharmacist for the tablets. That way they can ask all the proper questions ahead of time to make sure you (or your child) aren't allergic to anything in them and that they don't violate your dietary restrictions (guess there weren't that many vegan kids back in the 1960s).

7. Fluoride


Anyone who endured one of these treatments as a kid is probably gagging at the memory. For kids whose families couldn't afford to go to the dentist, public schools often offered a free fluoride treatment once a year. And even though we saw our dentist regularly, my mom couldn't pass up a freebie and always signed us up for the torture procedure. The fluoride was thick and syrupy and tasted terrible no matter what fun new flavor (like "bubblegum") they attempted to disguise it with. Luckily, fluoridated water, toothpastes, rinses, and the like have virtually eliminated the need for additional special fluoride treatments.

8. Eye Patch for Amblyopia


Years ago the most popular treatment for "lazy eye" was a pirate-style eye patch worn over the good eye. Thanks to new treatments like specialized lenses and eye drops, patching is used only in a small percentage of cases these days. And, when patching the amblyopic eye is deemed necessary, doctors have discovered that an adhesive patch worn for a few hours daily is far more effective than the Moshe Dayan model.

9. Nurse's Cap

Remember how the mere sight of the nurse entering the examining room in her starched white uniform with the cap perched atop her head was enough to make you break into a flop sweat as a kid? Forget about "white coat syndrome," that severe uniform made every woman look like Nurse Ratched and sent many a patient into panic mode. Nurses ditched the white dresses and pantyhose in the 1980s in favor of colorful and whimsical scrubs, which were both more practical and comfortable for the wearer and more relaxing for the patient. And while the cap was the iconic symbol of nursing (nursing students were presented their caps with great ceremony upon graduation), it was also extremely unhygienic; even with multiple bobby pins, the hat rarely stayed in place, forcing the wearer to constantly fuss with it, touching her hair and contaminating her hands. Today's nursing school graduates receive pins instead of caps.

10. Head Mirror

Old-time movie and TV doctors always wore their head mirrors over their foreheads, like a shiny bullseye. In practice, however, the mirror (which was invented in the mid-1800s) was worn over one eye so that the doctor could peep through the tiny hole in the middle. The rest of the disc reflected an overhead light (or even the sunlight) onto the area of the patient the doctor was examining. Positioning the mirror just so literally took hours of practice, and most doctors today use a battery-operated head light instead. Some otorhinolaryngologists still prefer the mirror, though, believing that it provides the best light for indirect laryngeal examination.

11. Iron Lung


Dr. Philip Drinker of the Harvard School of Public Health developed the first "thoracic cage" that used vacuum cleaner blowers to alternate between atmospheric and sub-atmospheric pressure to force a patient to breathe. The machine, known as a Drinker Respirator, was originally intended as a pediatric-ward device to assist premature babies born with underdeveloped lungs. But when the dreaded disease known as polio began to spread in the United States, doctors found a second use for the device. Polio frequently paralyzed patients' diaphragms, rendering them unable to breathe on their own. The Warren Collins Corporation fine-tuned Drinker's design and mass-produced a similar device at a more affordable price; it was dubbed the Iron Lung. In the early 1950s, most hospitals had wards filled with iron lungs, and many homes had a polio patient encased in one as well. Today's patients who are unable to breathe on their own are intubated and placed on positive-pressure ventilators, as opposed to the negative pressure utilized by the iron lung of yesteryear.
***
What home remedies do you remember grandma or mom using on you when you were a kid? What medical device that scared you to death has been replaced by a kinder, gentler gadget? Share both your horror stories and warm fuzzies!

The First Shot to Stop Chronic Migraines Just Secured FDA Approval

Migraine sufferers unhappy with current treatments will soon have a new option to consider. Aimovig, a monthly shot, just received approval from the Food and Drug Administration and is now eligible for sale, CBS News reports. The shot is the first FDA-approved drug of its kind designed to stop migraines before they start and prevent them over the long term.

As Mental Floss reported back in February before the drug was cleared, the new therapy is designed to tackle a key component of migraine pain. Past studies have shown that levels of a protein called calcitonin gene–related peptide (CGRP) spike in chronic sufferers when they're experiencing the splitting headaches. In clinical trials, patients injected with the CGRP-blocking medicine in Aimovig saw their monthly migraine episodes cut in half (from eight a month to just four). Some subjects reported no migraines at all in the month after receiving the shot.

Researchers have only recently begun to untangle the mysteries of chronic migraine treatment. Until this point, some of the best options patients had were medications that weren't even developed to treat the condition, like antidepressants, epilepsy drugs, and Botox. In addition to yielding spotty results, many of these treatments also come with severe side effects. The most serious side effects observed in the Aimovig studies were colds and respiratory infections.

Monthly Aimovig shots will cost $6,900 a year without insurance. Now that the drug has been approved, a flood of competitors will likely follow: This year alone, three similar shots are expected to receive FDA clearance.

[h/t CBS News]

The 98.6℉ Myth: Why Everything You Think You Know About Body Temperature Is a Lie

When you were a kid, you probably knew that to score a magical sick day home from school, you needed to have a fever. When the thermometer came out of your mouth, it had to read higher than 98.6℉—the long-accepted "normal" human body temperature. (If you wanted to really seal the deal, you may have hoped to hit 100℉.) Since then, you may have used a temperature above 98.6℉ as a metric to work from home (or call out sick entirely).

But here's the thing: The average body temperature isn't actually 98.6℉—a fact that we've known for more than 25 years. The myth originated in the 19th century with a single doctor, and despite evidence to the contrary, it's persisted ever since.

THE GIANT—AND FAULTY—ARMPIT THERMOMETER

In 1851, Carl Wunderlich, the director of the hospital at Leipzig University, began going from room to room with a comically large thermometer in tow. He wanted to understand how body temperature is affected by different diseases, so in each room, he would hold the foot-long device in patients' armpits for a full 20 minutes, waiting for a temperature to register. Once it did, he'd note the temperature on the patient's chart (Wunderlich is thought to be the first physician to do so). He and his staff did this for years, repeatedly taking the temperatures of some 25,000 patients and logging them on their charts, until he had millions of readings. In 1868, he finally published this data in Das Verhalten der Eigenwärme in Krankheiten (On the Temperature in Diseases: A Manual of Medical Thermometry). He concluded that the average human body temperature was 98.6℉, underscoring the idea that fever is a symptom of illness, not a cause.

No one questioned Wunderlich's methods, or his average, for about 140 years. Then, in the early 1990s, internist Philip Mackowiak—a professor of medicine at the University of Maryland, a medical historian, and, apparently, a clinical thermometer junkie—saw one of the physician's instruments at the Mütter Museum in Philadelphia. He told the Freakonomics podcast that he'd always had doubts about the 98.6℉ standard. "I am by nature a skeptic," he said. "And it occurred to me very early in my career that this idea that 98.6 was normal, and then if you didn't have a temperature of 98.6, you were somehow abnormal, just didn't sit right."

Getting his hands on Wunderlich's thermometer—which the museum let him borrow—only deepened his doubts. The huge thermometer was unwieldy and non-registering, meaning, Mackowiak explained, "that it has to be read while it's in place." Not only that, but Wunderlich had used the device to measure temperatures in the armpit, which is less reliable than temperatures taken in the mouth or rectum. The instrument itself also wasn't terribly precise: It measured up to 2 degrees Centigrade (about 3.6℉) higher than both ancient and modern instruments.

In 1992, Mackowiak decided to test Wunderlich's average. Using normal-sized oral thermometers and a group of volunteers, he determined that the average human body temperature actually hovers around 98.2℉. Mackowiak found that body temperature tends to vary over the course of the day, with its lowest point around 6 a.m. and its highest in the early evening. Body temperature can also fluctuate monthly (with the menstrual cycle) and over a lifetime (declining decade by decade with age), and may even differ by sex and race. He concluded that normal body temperature is so unique to each person that it's almost like a fingerprint and, given that wide variation, not actually a very reliable indicator of illness.

As a result of his study, Mackowiak proposed raising the threshold for fever to 98.9℉ for temperatures taken in the morning (and 99.9℉ at other times). While it's a relatively minor change in terms of actual degrees, this fever threshold is actually lower than the CDC's, which is a temperature of 100.4℉ or higher.
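
If you want to see how those cutoffs stack up against a given reading, here is a minimal Python sketch of the comparison. The function name and the "greater than or equal to" logic are illustrative choices of mine, not anything taken from Mackowiak's paper or from the CDC.

# An illustrative comparison of one oral reading (in Fahrenheit)
# against Mackowiak's proposed thresholds and the CDC's fever definition.

def classify_fever(temp_f, morning=False):
    """Report which standards would call this reading a fever."""
    mackowiak_cutoff = 98.9 if morning else 99.9  # Mackowiak's proposed thresholds
    cdc_cutoff = 100.4                            # the CDC's definition of fever
    return {
        "mackowiak_fever": temp_f >= mackowiak_cutoff,
        "cdc_fever": temp_f >= cdc_cutoff,
    }

print(classify_fever(100.0))  # {'mackowiak_fever': True, 'cdc_fever': False}
print(classify_fever(100.6))  # {'mackowiak_fever': True, 'cdc_fever': True}

A 100.0℉ evening reading illustrates the gap discussed below: a fever by Mackowiak's standard, but not by the CDC's.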

There are potential real-life consequences in this gap, for everyone from students (who'd have to attend school with what would be considered a low-grade fever by Wunderlich's 98.6℉ standard) to employers and daycares (who use temperature to set attendance policies). What's more, anyone who is actually sick but ignores a low-grade fever—one that meets Mackowiak's threshold but still falls under the CDC's—could pose a risk to people with compromised immune systems trying to avoid unnecessary exposure to illness in public places.

THE BALANCING POINT

There's a reason the average trends near 98℉ instead of 92℉ or 106℉. As endotherms, mammals expend a great deal of energy maintaining body temperature when compared with cold-blooded creatures. To find and conserve a just-right body temperature, central nervous system sensors gather data (too warm? too cold? just right, Goldilocks?) and send that information to the pebble-sized hypothalamus near the base of the brain. There, the data is converted into action: releasing sweat and widening the blood vessels if too warm; raising metabolism, constricting the blood vessels, and inducing shivering if too cold.

According to a study by Aviv Bergman and Arturo Casadevall in the journal mBio, the precise balancing point for ideal body temperature is the sweet spot where the metabolic cost for all this thermoregulation balances with the evolutionary advantage of warding off fungal disease. (While warm-blooded animals are prone to bacterial or viral infections, they rarely experience fungal infections because most fungi can't withstand temperatures above 86℉. Cold-blooded animals, on the other hand, are prone to all three.) For Bergman and Casadevall, this benefit even explains what tipped Darwin's scales in favor of mammals, allowing them to edge out other vertebrates for dominance after the Cretaceous-Tertiary mass extinction wiped out the dinosaurs.

Of course, rules call for exceptions, and the one place where human body temperature demonstrates sustained elevation is outer space. Astronauts on prolonged missions clock significantly higher average body temperatures than they do when terrestrial—even up to 104℉. This so-called "space fever" is probably a product of some combination of radiation exposure, psychological stress, and immune response to weightlessness. Researchers believe this phenomenon could yield crucial information about thermoregulation—and may even offer insight into how humans might adapt to climate change.

WHY THE MYTH PERSISTS

It's been 26 years since Mackowiak's study, yet the newer data has not taken hold among medical professionals or the public. What gives?

Mackowiak tells Mental Floss that he finds it a bit mystifying that the myth persists, especially since many people, when pressed, know that the so-called "average" temperature varies. Part of the problem may be psychological: We cling to beliefs despite evidence to the contrary—a phenomenon called belief perseverance [PDF]. It's a significant force upholding a surprising number of medical myths. The idea humans should drink eight glasses of water a day? Not science. Sugar causes hyperactive behavior? Nope. Reading in dim light harms eyesight? Not really.

Unlearning persistent myths—especially ones loaded with the weight of medical authority—is difficult. "Deep down, under it all," Mackowiak says, "people want simple answers for things."
