10 Mind-Boggling Psychiatric Treatments

iStock

by Dan Greenberg

Nobody ever claimed a visit to the doctor was a pleasant way to pass the time. But if you're timid about diving onto a psychiatrist's couch or paranoid about popping pills, remember: It could be worse. Like getting-a-hole-drilled-into-your-skull worse.

1. INSULIN COMA-THERAPY

The coma-therapy trend began in 1927. Viennese physician Manfred Sakel accidentally gave one of his diabetic patients an insulin overdose, and it sent her into a coma. But what could have been a major medical faux pas turned into a triumph. The woman, a drug addict, woke up and declared her morphine craving gone. Later, Sakel (who really isn't earning our trust here) made the same mistake with another patient—who also woke up claiming to be cured. Before long, Sakel was intentionally testing the therapy with other patients and reporting a 90 percent recovery rate, particularly among schizophrenics. Strangely, however, Sakel's treatment successes remain a mystery.

Presumably, a big dose of insulin causes blood sugar levels to plummet, which starves the brain of food and sends the patient into a coma. But why this unconscious state would help psychiatric patients is anyone's guess. Regardless, the popularity of insulin therapy faded, mainly because it was dangerous. Slipping into a coma is no walk in the park, and between one and two percent of treated patients died as a result.

2. TREPANATION

Ancient life was not without its hazards. Between wars, drunken duels, and the occasional run-in with an inadequately domesticated pig, it's no surprise that archaic skulls tend to have big holes in them. But not all holes are created with equal abandon. Through the years, archaeologists have uncovered skulls marked by a carefully cut circular gap, which shows signs of being made long before the owner of the head passed away. These openings were no accident; they were the result of trepanation, one of the earliest forms of psychiatric treatment. The basic theory behind this "therapy" holds that insanity is caused by demons lurking inside the skull. As such, boring a hole into the patient's head creates a door through which the demons can escape, and—voila!—out goes the crazy.

Despite the peculiarity of the theory and lack of major-league anesthetics, trepanation was by no means a limited phenomenon. From the Neolithic era to the early 20th century, cultures all over the world used it as a way to cure patients of their ills. Doctors eventually phased out the practice as less invasive procedures were developed. Average Joes, on the other hand, didn't all follow suit. Trepanation patrons still exist. In fact, they even have their very own organizations, like the International Trepanation Advocacy Group.

3. ROTATIONAL THERAPY

Charles Darwin's grandfather Erasmus Darwin was a physician, philosopher, and scientist, but he wasn't particularly adept at any of the three. Consequently, his ideas weren't always taken seriously. Of course, this could be because he liked to record them in bad poetic verse (sample: "By immutable immortal laws / Impress'd in Nature by the great first cause, / Say, Muse! How rose from elemental strife / Organic forms, and kindled into life"). It could also be because his theories were a bit far-fetched, such as his spinning-couch treatment. Darwin's logic was that sleep could cure disease and that spinning around really fast was a great way to induce the slumber.

Nobody paid much attention to Darwin's idea at first, but later, American physician Benjamin Rush adapted the treatment for psychiatric purposes. He believed that spinning would reduce brain congestion and, in turn, cure mental illness. He was wrong. Instead, Rush just ended up with dizzy patients. These days, rotating chairs are limited to the study of vertigo and space sickness.

4. HYDROTHERAPY

If the word "hydrotherapy" conjures up images of Hollywood stars lazily soaking in rich, scented baths, then you probably weren't an early 20th-century psychiatric patient. Building off the idea that a dip in the water is often calming, psychiatrists of yore attempted to remedy various symptoms with corresponding liquid treatments. For instance, hyperactive patients got warm, tiring baths, while lethargic patients received stimulating sprays.

Some doctors, however, got a bit too zealous about the idea, prescribing therapies that sounded more like punishment than panacea. One treatment involved mummifying the patient in towels soaked in ice-cold water. Another required the patient to remain continuously submerged in a bath for hours or even days—which might not sound so bad, except they were strapped in and only allowed out to use the restroom. Finally, some doctors ordered the use of high-pressure jets. Sources indicate that at least one patient was strapped to the wall in the crucifixion position (never a good sign) and blasted with water from a fire hose. Like many extreme treatments, hydrotherapy was eventually replaced with psychiatric drugs, which tended to be more effective.

5. MESMERISM

Much like Yoda, Austrian physician Franz Mesmer (1734-1815) believed that an invisible force pervaded everything in existence, and that disruptions in this force caused pain and suffering. But Mesmer's ideas would have been of little use to Luke Skywalker. His basic theory was that the gravity of the moon affected the body's fluids in much the same way it caused ocean tides, and that some diseases accordingly waxed and waned with the phases of the moon. The dilemma, then, was to uncover what could be done about gravity's pernicious effects. Mesmer's solution: use magnets. After all, gravity and magnetism were both about objects being attracted to each other. Thus, placing magnets on certain areas of a patient's body might be able to counteract the disruptive influence of the moon's gravity and restore the normal flow of bodily fluids.

Surprisingly, many patients praised the treatment as a miracle cure, but the medical community dismissed it as superstitious hooey and chalked up his treatment successes to the placebo effect. Mesmer and his theories were ultimately discredited, but he still left his mark. Today, he's considered the father of modern hypnosis because of his inadvertent discovery of the power of suggestion, and his name lives on in the English word mesmerize.

6. MALARIA THERAPY

Ah, if only we were talking about a therapy for malaria. Instead, this is malaria as therapy—specifically, as a treatment for syphilis. There was no cure for the STD until the early 1900s, when Viennese neurologist Julius Wagner-Jauregg got the idea to treat syphilis sufferers with malaria-infected blood. Predictably, these patients would develop malaria, which caused an extremely high fever that killed the syphilis bacteria. Once that happened, they were given the malaria drug quinine, cured, and sent home happy and healthy. The treatment did have its share of side effects—that nasty sustained high fever, for one—but it worked, and it was a whole lot better than dying. In fact, Wagner-Jauregg won the Nobel Prize for malaria therapy, and the treatment remained in use until penicillin gave doctors a better, safer way to cure the STD.

7. CHEMICALLY INDUCED SEIZURES

Nobody ever said doctors had flawless logic. A good example: seizure therapy. Hungarian pathologist Ladislas von Meduna pioneered the idea. He reasoned that, because schizophrenia was rare in epileptics, and because epileptics seemed blissfully happy after seizures, then giving schizophrenics seizures would make them calmer. In order to do this, von Meduna tested numerous seizure-inducing drugs (including such fun candidates as strychnine, caffeine, and absinthe) before settling on metrazol, a chemical that stimulates the circulatory and respiratory systems. And although he claimed the treatment cured the majority of his patients, opponents argued that the method was dangerous and poorly understood.

To this day, no one is quite clear on why seizures can help ease some schizophrenic symptoms, but many scientists believe the convulsions release chemicals otherwise lacking in patients' brains. Ultimately, the side effects (including fractured bones and memory loss) turned away both doctors and patients.

8. PHRENOLOGY

Around the turn of the 19th century, German physician Franz Gall developed phrenology, a practice based on the idea that people's personalities are depicted in the bumps and depressions of their skulls. Basically, Gall believed that the parts of the brain a person used more often would get bigger, like muscles. Consequently, these pumped-up areas would take up more skull space, leaving visible bumps in those places on your head. Gall then tried to determine which parts of the skull corresponded to which traits. For instance, bumps over the ears meant you were destructive; a ridge at the top of the head indicated benevolence; and thick folds on the back of the neck were sure signs of a sexually oriented personality. In the end, phrenologists did little to make their mark in the medical field, as they couldn't treat personality issues, only diagnose them (and inaccurately, at that). By the early 1900s, the fad had waned, and modern neuroscience had garnered dominion over the brain.

9. HYSTERIA THERAPY

Once upon a time, women suffering from pretty much any type of mental illness were lumped together as victims of hysteria. The Greek physician Hippocrates popularized the term, believing hysteria encompassed conditions ranging from nervousness to fainting fits to spontaneous muteness. The root cause, according to him, was a wandering womb. So, whither does it wander? Curious about Hippocrates's theory, Plato asked himself that very question. He claimed that if the uterus "remains unfruitful long beyond its proper time, it gets discontented and angry and wanders in every direction through the body, closes up the passages of the breath, and, by obstructing respiration, drives women to extremity." Consequently, cures for hysteria involved finding a way to "calm down" the uterus. And while there was no dearth of methods for doing this (including holding foul-smelling substances under the patient's nose to drive the uterus away from the chest), Plato believed the only surefire way to solve the problem was to get married and have babies. After all, the uterus always ended up in the right place when it came time to bear a child. Although "womb-calming" as a psychiatric treatment died out long ago, hysteria as a diagnosis hung around until the 20th century, when doctors began identifying conditions such as depression, post-traumatic stress disorder, and phobias.

10. LOBOTOMY

Dr. Walter Freeman, left, and Dr. James W. Watts study an X ray before a psychosurgical operation
Harris A Ewing, Saturday Evening Post, Public Domain, Wikimedia Commons

Everybody's favorite psychiatric treatment, the modern lobotomy was the brainchild of António Egas Moniz, a Portuguese doctor. Moniz believed that mental illnesses were generally caused by problems in the neurons of the frontal lobe, the part of the brain just behind the forehead. So when he heard about a monkey whose violent, feces-throwing urges had been curbed by cuts to the frontal lobe, Moniz was moved to try out the same thing with some of his patients. (The lobe-cutting, not the feces-throwing.) He believed the technique could cure insanity while leaving the rest of the patient's mental function relatively normal, and his (admittedly fuzzy) research seemed to support that. The accolades flooded in, and (in one of the lower points in the Karolinska Institute's history) Moniz was awarded the Nobel Prize in 1949.

After the lobotomy rage hit American shores, Dr. Walter Freeman took to traveling the country in his "lobotomobile" (no, really), performing the technique on everyone from catatonic schizophrenics to disaffected housewives. His road-ready procedure involved inserting a small ice pick into the brain through the eye socket and wiggling it around a bit. While some doctors thought he'd found a way to save hopeless cases from the horrors of life-long institutionalization, others noted that Freeman didn't bother with sterile techniques, had no surgical training whatsoever, and tended to be a bit imprecise when describing his patients' recovery.

As the number of lobotomies increased, a major problem became apparent: The patients weren't just calm—they were virtual zombies who scarcely responded to the world around them. Between that and the bad press lobotomies received in films and novels such as One Flew Over the Cuckoo's Nest, the treatment soon fell out of favor.

A Dracula Ant's Jaws Snap at 200 Mph—Making It the Fastest Animal Appendage on the Planet

Ant Lab, YouTube

As if Florida’s “skull-collecting” ants weren’t terrifying enough, we’re now going to be having nightmares about Dracula ants. A new study in the journal Royal Society Open Science reveals that a species of Dracula ant (Mystrium camillae), which is found in Australia and Southeast Asia, can snap its jaws shut at speeds of 90 meters per second—or the rough equivalent of 200 mph. This makes their jaws the fastest part of any animal on the planet, researchers said in a statement.
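For the curious, the article's rough conversion of 90 meters per second into "about 200 mph" checks out; here's a minimal sketch of the arithmetic (the 90 m/s figure comes from the study cited above, the helper name is ours):

```python
# Convert the Dracula ant's reported jaw speed from meters per second to miles per hour.
METERS_PER_MILE = 1609.344  # international mile
SECONDS_PER_HOUR = 3600

def mps_to_mph(speed_mps):
    """Convert a speed in m/s to mph."""
    return speed_mps * SECONDS_PER_HOUR / METERS_PER_MILE

print(round(mps_to_mph(90)))  # ~201, i.e. roughly 200 mph
```

The same helper shows why the trap-jaw ants mentioned below, at "just over 100 miles per hour," work out to roughly 45-50 m/s, about half the Dracula ant's speed.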

These findings come from a team of three researchers that includes Adrian Smith, who has also studied the gruesome ways that the skull-collecting ants (Formica archboldi) dismember trap-jaw ants, which were previously considered to be the fastest ants on record. But with jaw speeds of just over 100 miles per hour, they’re no match for this Dracula ant. (Fun fact: The Dracula ant subfamily is named after their habit of drinking the blood of their young through a process called "nondestructive cannibalism." Yikes.)

Senior author Andrew Suarez, of the University of Illinois, said the anatomy of this Dracula ant’s jaw is unusual. Instead of closing their jaws from an open position, which is what trap-jaw ants do, they use a spring-loading technique. The ants “press the tips of their mandibles together to build potential energy that is released when one mandible slides across the other, similar to a human finger snap,” researchers write.

They use this maneuver to smack other arthropods or push them away. Once they’re stunned, they can be dragged back to the Dracula ant’s nest, where the unlucky victims will be fed to Dracula ant larvae, Suarez said.

Researchers used X-ray imaging to observe the ants’ anatomy in three dimensions. High-speed cameras were also used to record the jaws snapping shut—a movement roughly 5000 times faster than the blink of a human eye. Check out the ants in slow-motion in the video below.

14 Facts About Celiac Disease

iStock.com/fcafotodigital

Going gluten-free may be a modern diet trend, but people have been suffering from celiac disease—a chronic condition characterized by gluten intolerance—for centuries. Patients with celiac are ill-equipped to digest products made from certain grains containing gluten; wheat is the most common. In the short term this can cause gastrointestinal distress, and over the long term it can lead to complications associated with early death.

Celiac diagnoses are more common than ever, which also means awareness of how to live with the condition is at an all-time high. Here are some things you might not know about celiac disease symptoms and treatments.

1. Celiac is an autoimmune disease.

The bodies of people with celiac have a hostile reaction to gluten. When the protein moves through the digestive tract, the immune system responds by attacking the small intestine, causing inflammation that damages the lining of the organ. As this continues over time, the small intestine has trouble absorbing nutrients from other foods, which can lead to additional complications like anemia and osteoporosis.

2. You can get celiac disease from your parents.

Nearly all cases of celiac disease arise from certain variants of the genes HLA-DQA1 and HLA-DQB1. These genes help produce proteins in the body that allow the immune system to identify potentially dangerous foreign substances. Normally the immune system wouldn't label gliadin, a segment of the gluten protein, a threat, but due to mutations in these genes, the bodies of people with celiac treat gliadin as a hostile invader.

Because it's a genetic disorder, people with a first-degree relative (a sibling, parent, or child) with celiac have a 4 to 15 percent chance of having it themselves. And while almost all patients with celiac have these specific HLA-DQA1 and HLA-DQB1 variations, not everyone with the mutations will develop celiac. About 30 percent of the population has these gene variants, and only 3 percent of that group goes on to develop celiac disease.

3. Makeup might contribute to celiac disease symptoms.

People with celiac disease can’t properly process gluten, a protein naturally found in grains like wheat, rye, and barley. Patients have to follow strict dietary guidelines and avoid most bread, pasta, and cereal in order to manage their symptoms. But gluten isn’t limited to food products: It can also be found in some cosmetics. While makeup containing gluten causes no issues for many people with celiac, it can provoke rashes in others or lead to more problems if ingested. For those folks, gluten-free makeup is an option.

4. The name comes from 1st-century Greece.

A 1st-century Greek physician named Aretaeus of Cappadocia may have been the first person to describe celiac disease symptoms in writing [PDF]. He named it koiliakos after the Greek word koelia for abdomen, and he referred to people with the condition as coeliacs. In his description he wrote, “If the stomach be irretentive of the food and if it pass through undigested and crude, and nothing ascends into the body, we call such persons coeliacs.”

5. There are nearly 300 celiac disease symptoms.

Celiac disease may start in the gut, but it can be felt throughout the whole body. In children, the condition usually manifests as bloating, diarrhea, and abdominal discomfort, but as patients get older they start to experience more “non-classical” symptoms like anemia, arthritis, and fatigue. There are at least 281 symptoms associated with celiac disease, many of which overlap with other conditions and make celiac hard to diagnose. Other common symptoms of the disease include tooth discoloration, anxiety and depression, loss of fertility, and liver disorders. Celiac patients also have a greater chance of developing an additional autoimmune disorder, with the risk increasing the later in life the initial condition is diagnosed.

6. Some patients show no symptoms at all.

It’s not uncommon for celiac disease to be wrecking a patient’s digestive tract while showing no apparent symptoms. This form of the condition, sometimes called asymptomatic or “silent celiac disease,” likely contributes to part of the large number of people with celiac who are undiagnosed. People who are at high risk for the disease (the children of celiac sufferers, for example), or who have related conditions like type 1 diabetes and Down syndrome (both conditions that put patients at a greater risk for developing new autoimmune diseases) are encouraged to get tested for it even if they aren’t showing any signs.

7. It’s not the same as wheat sensitivity.

Celiac is often confused with wheat sensitivity, a separate condition that shares many symptoms with celiac, including gastrointestinal issues, depression, and fatigue. It’s often called gluten sensitivity or gluten intolerance, but because doctors still aren’t sure if gluten is the cause, many refer to it as non-celiac wheat sensitivity. There’s no test for it, but patients are often treated with the same gluten-free diet that’s prescribed to celiac patients.

8. It's not a wheat allergy either.

Celiac disease is often associated with wheat because it's one of the more common products containing gluten. While it's true that people with celiac can't eat wheat, the condition isn't a wheat allergy. Rather than reacting to the wheat, patients react to a specific protein that's found in the grain as well as others.

9. It can develop at any age.

Just because you don’t have celiac now doesn’t mean you’re in the clear for life: The disease can develop at any age, even in people who have tested negative for it previously. There are, however, two stages of life when symptoms are most likely to appear: early childhood (8 to 12 months) and middle adulthood (ages 40 to 60). People already genetically predisposed to celiac become more susceptible to it when the composition of their intestinal bacteria changes as they get older, either as a result of infection, surgery, antibiotics, or stress.

10. Not all grains are off-limits.

A gluten-free diet isn’t necessarily a grain-free diet. While it’s true that the popular grains wheat, barley, and rye contain gluten, there are plenty of grains and seeds that don’t and are safe for people with celiac to eat. These include quinoa, millet, amaranth, buckwheat, sorghum, and rice. Oats are also naturally gluten-free, but they're often contaminated with gluten during processing, so consumers with celiac should be cautious when buying them.

11. Celiac disease can be detected with a blood test.

Screenings for celiac disease used to be an involved process, with doctors monitoring patients’ reactions to their gluten-free diet over time. Today all it takes is a simple test to determine whether someone has celiac. People with the condition will have anti-tissue transglutaminase antibodies in their bloodstream. If a blood test confirms the presence of these proteins in a patient, doctors will then take a biopsy of their intestine to confirm the root cause.

12. The gluten-free diet doesn’t work for all patients.

Avoiding gluten is the most effective way to manage celiac disease, but the treatment doesn’t work 100 percent of the time. In up to a fifth of patients, the damaged intestinal lining does not recover even a year after switching to a gluten-free diet. Most cases of non-responsive celiac disease can be explained by people not following the diet closely enough, or by having other conditions like irritable bowel syndrome, lactose intolerance, or small intestine bacterial overgrowth that impede recovery. Just a small fraction of celiac disease sufferers don’t respond to a strict gluten-free diet and have no related conditions. These patients are usually prescribed steroids and immunosuppressants as alternative treatments.

13. If you don’t have celiac, gluten probably won’t hurt you.

The gluten-free diet trend has exploded in popularity in recent years, and most people who follow it have no medical reason to do so. Going gluten-free has been purported to do everything from help you lose weight to treat autism—but according to doctors, there’s no science behind these claims. Avoiding gluten may help some people feel better and more energetic because it forces them to cut heavily processed junk foods out of their diet. In such cases it’s the sugar and carbs that are making people feel sluggish—not the gluten protein. If you don’t have celiac or a gluten sensitivity, most experts recommend saving yourself the trouble by eating healthier in general rather than abstaining from gluten.

14. The numbers are growing.

A 2009 study found that four times as many people have celiac today than in the 1950s, and the spike can’t be explained by increased awareness alone. Researchers tested blood samples collected at the Warren Air Force Base between 1948 and 1954 and compared them to fresh samples from people living in one Minnesota county. The results supported the theory that celiac has become more prevalent in the last half-century. While experts aren’t exactly sure why the condition is more common today, it may have something to do with changes in how wheat is handled or the spread of gluten into medications and processed foods.
