Being Infected With Malaria Helped Ebola Victims to Survive

Ebola survivor James Harris, 29, stands for a portrait before a shift as a nurse's assistant at the Doctors Without Borders (MSF) Ebola treatment center on October 12, 2014, in Paynesville, Liberia. Image credit: John Moore/Getty Images

A recent study revealed a surprising finding: Of those infected in the West African Ebola epidemic in 2014, patients who had an active malaria parasite infection were actually more likely to survive the Ebola virus, and by a significant degree. While just over half (52 percent) of Ebola patients not infected with malaria survived, those co-infected with malaria had a survival rate of 72 to 83 percent, depending on their ages and the amount of Ebola virus in their blood.

What gives? Shouldn’t having a second, potentially deadly infection make you more likely to die of Ebola? 

Maybe not. Though researchers aren’t yet sure of the mechanism by which malaria co-infection in Ebola patients might be protective, they have some ideas. The prevailing thought is that malaria is somehow modifying the immune response to Ebola, making it less deadly than in people who aren’t co-infected with the malaria parasite.

The authors of the study, published in the journal Clinical Infectious Diseases, note that malaria can make other infections less deadly. For example, in a group of children from Tanzania, those who had respiratory infections along with malaria were less likely to have those infections develop into pneumonia than kids who had respiratory infections without it.

It may be that malaria is able to tone down a phenomenon called the “cytokine storm”—the body’s own response to an Ebola infection that inadvertently kills the host while attempting to eliminate the pathogen. If malaria can turn this host response down, patients may have a better chance of surviving the virus’s assault.

This wouldn’t be the first time that malaria infection has been hailed as a hero, rather than an enemy. In 1927, the Nobel Prize in Physiology or Medicine was awarded to Julius Wagner-Jauregg “for his discovery of the therapeutic value of malaria inoculation in the treatment of dementia paralytica.” As far back as 1887, Wagner-Jauregg and others had observed that syphilis sometimes seemed to be cured following “febrile infectious diseases.” He also noted in his Nobel speech that he had “singled out as a particular advantage of malaria that there is the possibility of interrupting the disease at will by the use of quinine, but I did not then anticipate to what degree these expectations from induced malaria would be fulfilled.” While there was no “cure” for syphilis at the time, and no cure for the other infection he had considered (erysipelas, usually caused by the same bacterium that causes strep throat and scarlet fever), malaria could be treated with quinine, a compound that we still use today.

Before Wagner-Jauregg’s “malariotherapy,” treatments for syphilis included mercury, Salvarsan (an arsenic-containing drug), and bismuth—all of which had serious side effects, including death. Wagner-Jauregg’s methods seemed to carry no more risk than the conventional treatments of the era, and in 1917, he injected nine individuals suffering from advanced syphilis with malaria parasites. He reported three of them to be cured, and three more to have “extensive remission.” Soon, malariotherapy spread across the U.S. and into Europe, with tens of thousands of syphilis patients treated with the malaria parasite.

However, the degree to which malariotherapy worked is still a matter of controversy. And it was not without its own serious side effects: up to 15 percent of those treated died. With the introduction of penicillin as a treatment for syphilis in the 1940s, malariotherapy was replaced, but the decades of use of malaria as a treatment significantly advanced our knowledge of the malaria parasite.

Today, scientists may be able to use this natural experiment to create drugs that could mimic malaria’s effect without actively infecting individuals. (Malaria is a devastating disease, causing hundreds of thousands of deaths every year, primarily in Africa.) Animal models could potentially be used to tease apart the host’s response to Ebola infection and determine how malaria alters the usual response to the Ebola virus to make it less deadly. These alterations could be used to create new drugs or other interventions to treat Ebola infection.

More importantly, further study of the phenomenon of malaria co-infection with other pathogens could lead to changes in patient care. The current standard operating procedure is to treat malaria infection when it is found in an Ebola case. But might it actually improve a patient's outcome to delay treatment for malaria? The authors of the current study note that a mouse model of malaria-Ebola co-infection found that treatment for malaria led to death from Ebola infection in all animals. And yet during the 2014 Ebola outbreak, work carried out at one Ebola treatment center in Liberia showed that Ebola fatality rates decreased with effective malaria treatment. Complicating the matter, the malarial drug used in that case (artesunate-amodiaquine, or ASAQ) may have been responsible for the anti-Ebola activity.

While it’s unlikely that a malaria treatment for Ebola would be as popular (or legal or ethical) as the “malariotherapy” of the early 1900s, it’s certainly worth closely examining the clues this co-infection has provided scientists about the nature of both Ebola and malaria infections—and how we could harness them to fight against one of nature’s most frightening diseases.

The 'Alien' Mummy Is of Course Human—And Yet, Still Unusual
Emery Smith

Ata has never been an alien, but she's always been an enigma. Discovered in 2003 in a leather pouch near an abandoned mining town in Chile's Atacama Desert, the tiny, 6-inch mummy's unusual features—including a narrow, sloped head, angled eyes, missing ribs, and oddly dense bones—had both the “It's aliens!” crowd and paleopathologists intrigued. Now, a team of researchers from Stanford University School of Medicine and UC-San Francisco has completed a deep genomic analysis that reveals why Ata looks as she does.

As they lay out in a paper published this week in Genome Research, the researchers found a host of genetic mutations that doomed the fetus—some of which have never been seen before.

Stanford professor of microbiology and immunology Garry Nolan first analyzed Ata back in 2012; the mummy had been purchased by a Spanish businessman and studied by a doctor named Steven Greer, who made her a star of his UFO/ET conspiracy movie Sirius. Nolan was also given a sample of her bone marrow; his DNA analysis confirmed she was, of course, human. But Nolan's study, published in the journal Science, also found something very odd: Though she was just 6 inches long when she died—a typical size for a midterm fetus—her bones appeared to be 6 to 8 years old. This did not lead Nolan to hypothesize an alien origin for Ata, but to infer that she may have had a rare bone disorder.

The current analysis confirmed that interpretation. The researchers found 40 mutations in several genes that govern bone development; these mutations have been linked to "diseases of small stature, rib anomalies, cranial malformations, premature joint fusion, and osteochondrodysplasia (also known as skeletal dysplasia)," they write. The latter is commonly known as dwarfism. Some of these mutations are linked to conditions including Ehlers-Danlos syndrome, which affects connective tissue, and Kabuki syndrome, which causes a range of physical deformities and cognitive issues. Other mutations found in Ata were known to cause disease but had never before been associated with bone growth or developmental disorders.

A scientist measures the 6-inch-long mummy called Ata, which is not an alien. Image credit: Emery Smith

"Given the size of the specimen and the severity of the mutations … it seems likely the specimen was a pre-term birth," they write. "While we can only speculate as to the cause for multiple mutations in Ata's genome, the specimen was found in La Noria, one of the Atacama Desert's many abandoned nitrate mining towns, which suggests a possible role for prenatal nitrate exposure leading to DNA damage."

Though the researchers haven't identified the exact age of Ata's remains, they're estimated to be less than 500 years old (and potentially as young as 40 years old). Genomic analysis also confirms that Ata is not only very much an Earthling, but a local; her DNA most closely matches that of three individuals from the Chilote people of Chile.

In a press statement, study co-lead Atul Butte, director of the Institute for Computational Health Sciences at UC-San Francisco, stressed the potential applications of the study to genetic disorders. "For me, what really came of this study was the idea that we shouldn't stop investigating when we find one gene that might explain a symptom. It could be multiple things going wrong, and it's worth getting a full explanation, especially as we head closer and closer to gene therapy," Butte said. "We could presumably one day fix some of these disorders."

Just Two Cans of Soda a Day May Double Your Risk of Death From Heart Disease

If you've been stocking your refrigerator full of carbonated corn syrup in anticipation of warmer weather, the American Heart Association has some bad news. The advocacy group on Wednesday released results of research that demonstrate a link between consumption of sugary drinks—including soda, fruit juices, and other sweetened beverages—and an increased risk of dying from heart disease.

Study participants who reported consuming 24 ounces or more of sugary drinks per day had twice the risk of death from coronary artery disease compared with those who averaged less than 1 ounce daily. There was also an increased risk of death overall, including from other cardiovascular conditions.

The study, led by Emory University professor Jean Welsh, examined data taken from a longitudinal study of 17,930 adults over the age of 45 with no previous history of heart disease, stroke, or diabetes. Researchers followed participants for six years, and examined death records to determine causes. They observed a greater risk of death associated with sugary drinks even when they controlled for other factors, including race, income, education, smoking habits, and physical activity. The study does not show cause and effect, the researchers said, but does illuminate a trend.

The study also noted that while it showed an increased risk of death from heart disease, consumption of sugary foods was not shown to carry similar risk. One possible explanation is that the body metabolizes the sugars differently: Solid foods carry other nutrients, like fat and protein, that slow metabolism, while sugary drinks provide an undiluted influx of carbohydrates that the body must process.

The news will likely prove troublesome for the beverage industry, which has long contended with concerns that sugary drinks contribute to type 2 diabetes and tooth decay. Some cities, including Seattle, have introduced controversial "soda tax" plans that raise the sales tax on the drinks in an effort to discourage consumption.
