
Dyslexia Doesn't Work the Way We Thought It Did


Dyslexia is not just about reading, or even language. It’s about something more fundamental: How much can the brain adapt to what it has just observed? People with dyslexia typically have less brain plasticity than those without dyslexia, two recent studies have found.

Though the two studies measured brain activity in different ways and during different tasks, researchers at the Hebrew University of Jerusalem, reporting in eLife, and researchers from MIT, reporting in Neuron, both found that dyslexics’ brains did not adapt as much to repeated stimuli, including spoken words, musical notes, and faces.

Both sets of researchers found that people with dyslexia more quickly forget recent events. This type of memory is called incidental or implicit memory, and includes anything you didn't know you needed to remember when it happened. Because of how quickly their implicit memory fades, dyslexics' brains don't adapt as much after reading or hearing something repeatedly—which is perhaps why it is harder for their brains to process the words they read.

Your brain generally benefits from repetition because it relates a stimulus to what you've already experienced—like a note you have heard before or a face you’ve seen. Researchers can see this by measuring brain response with electroencephalography (EEG), a noninvasive way of measuring electrical activity in the brain by attaching electrodes to your scalp. Measured by EEG, people’s brain responses decrease when they’ve heard a repeated note. The brain gets more efficient with repetition: It knows something about the note already, so it doesn’t have to work as hard to capture all of its details. It’s a bit like when you see an animal and recognize right away that it’s a dog without having to catalogue all of the things that make it a dog. Your brain is efficient at recognizing dogs because you’ve seen them before.

SHORTER MEMORIES AND LESS ADAPTABILITY

In the Hebrew University study, led by Merav Ahissar, researchers gave subjects a musical task: The researchers played two different notes and asked which was higher. Previous research has found that people do better on this task when one of the notes is a repeat of a note they’ve heard recently. But Ahissar found that people with dyslexia did not benefit as much from the repetition. When a tone was repeated only three seconds after the "anchor" tone, they got some benefit, but not after nine seconds had elapsed. And when Ahissar’s team measured dyslexic people’s brain responses with EEG, their brain responses didn’t decrease. Their brains didn’t get any more efficient—they were less adaptable.

The MIT study, led by John Gabrieli, found similar results through a different experiment. Gabrieli used functional magnetic resonance imaging (fMRI) to measure people’s brain activity by measuring changes in blood flow in their brains. Instead of asking people to discriminate between musical notes, Gabrieli's team simply presented people with repeated things, including spoken words, written words, faces, and common objects like tables or chairs. During this task, dyslexic people's neural activity demonstrated less adaptation.

“It was a big surprise for us,” Gabrieli tells mental_floss, “because people with reading disorders don't typically have any problems with faces or objects.” Next, Gabrieli is curious to look into whether the effects of dyslexia on brain plasticity are limited to hearing and vision, or whether they also extend to other senses like touch and smell.

Together, these studies build a better understanding of how dyslexia works, and because the two studies found the same result with different methods, their results are more convincing than a single study alone. But they also raise a new question: Why is dyslexia mainly noticeable in reading if it affects other types of memories as well?

READING IS NEW—AND HARD, FROM THE BRAIN'S PERSPECTIVE

One theory is that reading is simply a difficult task. “We have a long evolutionary history in our brains for recognizing objects, recognizing faces," Gabrieli points out. That's not the case for reading. “There’s hardly a bigger challenge for brain plasticity than learning to read." More evolutionary time has allowed the brain to evolve redundant ways of accomplishing the same thing. Perhaps people with dyslexia are better at compensating for the memory gap for recognizing faces and spoken words because the brain has more alternate pathways for these processes than it does for reading.

Both Ahissar and Gabrieli are most excited that this research opens up new ways of studying—and perhaps someday treating—dyslexia. If dyslexia is a condition of reading and language only, as previously believed, “we cannot study it in animals,” Ahissar tells mental_floss. On the other hand, if it’s a condition of brain plasticity, we can—in fact, plasticity has been extensively studied in animals, and neuroscientists know a lot about it.

Someday, Gabrieli says, it may even be possible to develop drugs that would treat dyslexia by promoting brain plasticity, although researchers would have to be careful both practically and ethically.

“We can’t imagine developing a drug that would enhance language directly—that's too complicated," he notes. "But brain plasticity is something that neuroscientists are making amazing progress on.”

Heatwaves Can Affect Your Ability to Think Clearly and Make Decisions

Dehydration and body odor aren't the only things to hate about oppressive heat. According to new research reported by The Guardian, living through a heatwave without relief hampers your ability to think quickly and clearly.

For their study, published recently in PLOS Medicine, researchers at the Harvard T.H. Chan School of Public Health tested the mental performance of 44 students during a heatwave in Boston in 2016. Roughly half the students were living in newer dorm buildings with central AC, while the other half lived in older dorms without it.

Over 12 days, the researchers had participants take cognition tests on their phones immediately after waking up. The students living without AC took about 13 percent longer to respond to the questions, and their answers were about 13 percent less accurate.

The results indicate that even if high temperatures don't pose an immediate threat to someone's health, they can impair them in other ways. “Most of the research on the health effects of heat has been done in vulnerable populations, such as the elderly, creating the perception that the general population is not at risk from heat waves,” Jose Guillermo Cedeño-Laurent, research fellow at Harvard Chan School and lead author of the study, said in a statement. “Knowing what the risks are across different populations is critical considering that in many cities, such as Boston, the number of heat waves is projected to increase due to climate change.”

Summers are gradually becoming hotter and longer in Boston—a trend that can be observed throughout most of the rest of the world thanks to the rising temperatures caused by human activity. In regions with historically cold winters, like New England, many buildings, including Harvard's oldest dorms, are built to retain heat, which can extend the negative effects of a heat wave even as the weather outside starts to cool. If temperatures continue to rise, we'll have to make a greater effort to keep people cool indoors, where American adults spend 90 percent of their time.

Our thinking isn't the only thing that suffers in the stifling heat. A study published last year found that hot weather does indeed make you crankier—which may not be as bad as bombing a test, but it's not exactly fun for the people around you.

[h/t The Guardian]

10 Mind-Boggling Psychiatric Treatments

by Dan Greenberg

Nobody ever claimed a visit to the doctor was a pleasant way to pass the time. But if you're timid about diving onto a psychiatrist's couch or paranoid about popping pills, remember: It could be worse. Like getting-a-hole-drilled-into-your-skull worse.

1. INSULIN COMA THERAPY

The coma-therapy trend began in 1927. Viennese physician Manfred Sakel accidentally gave one of his diabetic patients an insulin overdose, and it sent her into a coma. But what could have been a major medical faux pas turned into a triumph. The woman, a drug addict, woke up and declared her morphine craving gone. Later, Sakel (who really isn't earning our trust here) made the same mistake with another patient—who also woke up claiming to be cured. Before long, Sakel was intentionally testing the therapy with other patients and reporting a 90 percent recovery rate, particularly among schizophrenics. Strangely, however, Sakel's treatment successes remain a mystery.

Presumably, a big dose of insulin causes blood sugar levels to plummet, which starves the brain of food and sends the patient into a coma. But why this unconscious state would help psychiatric patients is anyone's guess. Regardless, the popularity of insulin therapy faded, mainly because it was dangerous. Slipping into a coma is no walk in the park, and between one and two percent of treated patients died as a result.

2. TREPANATION

Ancient life was not without its hazards. Between wars, drunken duels, and the occasional run-in with an inadequately domesticated pig, it's no surprise that archaic skulls tend to have big holes in them. But not all holes are created with equal abandon. Through the years, archaeologists have uncovered skulls marked by a carefully cut circular gap, which shows signs of being made long before the owner of the head passed away. These openings were no accident; they were the result of one of the earliest forms of psychiatric treatment: trepanation. The basic theory behind this "therapy" holds that insanity is caused by demons lurking inside the skull. As such, boring a hole into the patient's head creates a door through which the demons can escape, and—voila!—out goes the crazy.

Despite the peculiarity of the theory and lack of major-league anesthetics, trepanation was by no means a limited phenomenon. From the Neolithic era to the early 20th century, cultures all over the world used it as a way to cure patients of their ills. Doctors eventually phased out the practice as less invasive procedures were developed. Average Joes, on the other hand, didn't all follow suit. Trepanation patrons still exist. In fact, they even have their very own organizations, like the International Trepanation Advocacy Group.

3. ROTATIONAL THERAPY

Charles Darwin's grandfather Erasmus Darwin was a physician, philosopher, and scientist, but he wasn't particularly adept at any of the three. Consequently, his ideas weren't always taken seriously. Of course, this could be because he liked to record them in bad poetic verse (sample: "By immutable immortal laws / Impress'd in Nature by the great first cause, / Say, Muse! How rose from elemental strife / Organic forms, and kindled into life"). It could also be because his theories were a bit far-fetched, such as his spinning-couch treatment. Darwin's logic was that sleep could cure disease and that spinning around really fast was a great way to induce the slumber.

Nobody paid much attention to Darwin's idea at first, but later, American physician Benjamin Rush adapted the treatment for psychiatric purposes. He believed that spinning would reduce brain congestion and, in turn, cure mental illness. He was wrong. Instead, Rush just ended up with dizzy patients. These days, rotating chairs are limited to the study of vertigo and space sickness.

4. HYDROTHERAPY

If the word "hydrotherapy" conjures up images of Hollywood stars lazily soaking in rich, scented baths, then you probably weren't an early 20th-century psychiatric patient. Building off the idea that a dip in the water is often calming, psychiatrists of yore attempted to remedy various symptoms with corresponding liquid treatments. For instance, hyperactive patients got warm, tiring baths, while lethargic patients received stimulating sprays.

Some doctors, however, got a bit too zealous about the idea, prescribing therapies that sounded more like punishment than panacea. One treatment involved mummifying the patient in towels soaked in ice-cold water. Another required the patient to remain continuously submerged in a bath for hours or even days—which might not sound so bad, except they were strapped in and only allowed out to use the restroom. Finally, some doctors ordered the use of high-pressure jets. Sources indicate that at least one patient was strapped to the wall in the crucifixion position (never a good sign) and blasted with water from a fire hose. Like many extreme treatments, hydrotherapy was eventually replaced with psychiatric drugs, which tended to be more effective.

5. MESMERISM

Much like Yoda, Austrian physician Franz Mesmer (1734-1815) believed that an invisible force pervaded everything in existence, and that disruptions in this force caused pain and suffering. But Mesmer's ideas would have been of little use to Luke Skywalker. His basic theory was that the gravity of the moon affected the body's fluids in much the same way it caused ocean tides, and that some diseases accordingly waxed and waned with the phases of the moon. The dilemma, then, was to uncover what could be done about gravity's pernicious effects. Mesmer's solution: use magnets. After all, gravity and magnetism were both about objects being attracted to each other. Thus, placing magnets on certain areas of a patient's body might be able to counteract the disruptive influence of the moon's gravity and restore the normal flow of bodily fluids.

Surprisingly, many patients praised the treatment as a miracle cure, but the medical community dismissed it as superstitious hooey and chalked up his treatment successes to the placebo effect. Mesmer and his theories were ultimately discredited, but he still left his mark. Today, he's considered the father of modern hypnosis because of his inadvertent discovery of the power of suggestion, and his name lives on in the English word mesmerize.

6. MALARIA THERAPY

Ah, if only we were talking about a therapy for malaria. Instead, this is malaria as therapy—specifically, as a treatment for syphilis. There was no cure for the STD until the early 1900s, when Viennese neurologist Julius Wagner-Jauregg got the idea to treat syphilis sufferers with malaria-infected blood. Predictably, these patients would develop the disease, which would cause an extremely high fever that would kill the syphilis bacteria. Once that happened, they were given the malaria drug quinine, cured, and sent home happy and healthy. The treatment did have its share of side effects—that nasty sustained high fever, for one—but it worked, and it was a whole lot better than dying. In fact, Wagner-Jauregg won the Nobel Prize for malaria therapy, and the treatment remained in use until the development of penicillin gave doctors a better, safer way to cure the STD.

7. CHEMICALLY INDUCED SEIZURES

Nobody ever said doctors had flawless logic. A good example: seizure therapy. Hungarian pathologist Ladislas von Meduna pioneered the idea. He reasoned that, because schizophrenia was rare in epileptics, and because epileptics seemed blissfully happy after seizures, then giving schizophrenics seizures would make them calmer. In order to do this, von Meduna tested numerous seizure-inducing drugs (including such fun candidates as strychnine, caffeine, and absinthe) before settling on metrazol, a chemical that stimulates the circulatory and respiratory systems. And although he claimed the treatment cured the majority of his patients, opponents argued that the method was dangerous and poorly understood.

To this day, no one is quite clear on why seizures can help ease some schizophrenic symptoms, but many scientists believe the convulsions release chemicals otherwise lacking in patients' brains. Ultimately, the side effects (including fractured bones and memory loss) turned away both doctors and patients.

8. PHRENOLOGY

Around the turn of the 19th century, German physician Franz Gall developed phrenology, a practice based on the idea that people's personalities are depicted in the bumps and depressions of their skulls. Basically, Gall believed that the parts of the brain a person used more often would get bigger, like muscles. Consequently, these pumped-up areas would take up more skull space, leaving visible bumps in those places on your head. Gall then tried to determine which parts of the skull corresponded to which traits. For instance, bumps over the ears meant you were destructive; a ridge at the top of the head indicated benevolence; and thick folds on the back of the neck were sure signs of a sexually oriented personality. In the end, phrenologists did little to make their mark in the medical field, as they couldn't treat personality issues, only diagnose them (and inaccurately, at that). By the early 1900s, the fad had waned, and modern neuroscience had garnered dominion over the brain.

9. HYSTERIA THERAPY

Once upon a time, women suffering from pretty much any type of mental illness were lumped together as victims of hysteria. The Greek physician Hippocrates popularized the term, believing hysteria encompassed conditions ranging from nervousness to fainting fits to spontaneous muteness. The root cause, according to him, was a wandering womb. So, whither does it wander? Curious about Hippocrates's theory, Plato asked himself that very question. He claimed that if the uterus "remains unfruitful long beyond its proper time, it gets discontented and angry and wanders in every direction through the body, closes up the passages of the breath, and, by obstructing respiration, drives women to extremity." Consequently, cures for hysteria involved finding a way to "calm down" the uterus. And while there was no dearth of methods for doing this (including holding foul-smelling substances under the patient's nose to drive the uterus away from the chest), Plato believed the only surefire way to solve the problem was to get married and have babies. After all, the uterus always ended up in the right place when it came time to bear a child. Although "womb-calming" as a psychiatric treatment died out long ago, hysteria as a diagnosis hung around until the 20th century, when doctors began identifying conditions such as depression, post-traumatic stress disorder, and phobias.

10. LOBOTOMY

Dr. Walter Freeman, left, and Dr. James W. Watts study an X-ray before a psychosurgical operation. Harris & Ewing, Saturday Evening Post, Public Domain, Wikimedia Commons

Everybody's favorite psychiatric treatment, the modern lobotomy was the brainchild of António Egas Moniz, a Portuguese doctor. Moniz believed that mental illnesses were generally caused by problems in the neurons of the frontal lobe, the part of the brain just behind the forehead. So when he heard about a monkey whose violent, feces-throwing urges had been curbed by cuts to the frontal lobe, Moniz was moved to try out the same thing with some of his patients. (The lobe-cutting, not the feces-throwing.) He believed the technique could cure insanity while leaving the rest of the patient's mental function relatively normal, and his (admittedly fuzzy) research seemed to support that. The accolades flooded in, and (in one of the lower points in the Karolinska Institute's history) Moniz was awarded the Nobel Prize in 1949.

After the lobotomy rage hit American shores, Dr. Walter Freeman took to traveling the country in his "lobotomobile" (no, really), performing the technique on everyone from catatonic schizophrenics to disaffected housewives. His road-ready procedure involved inserting a small ice pick into the brain through the eye socket and wiggling it around a bit. While some doctors thought he'd found a way to save hopeless cases from the horrors of life-long institutionalization, others noted that Freeman didn't bother with sterile techniques, had no surgical training whatsoever, and tended to be a bit imprecise when describing his patients' recovery.

As the number of lobotomies increased, a major problem became apparent: The patients weren't just calm—they were virtual zombies who scarcely responded to the world around them. Between that and the bad press lobotomies received in films and novels such as One Flew Over the Cuckoo's Nest, the treatment soon fell out of favor.
