Being Infected With Malaria Helped Ebola Patients Survive
Ebola survivor James Harris, 29, stands for a portrait before a shift as a nurse's assistant at the Doctors Without Borders (MSF) Ebola treatment center on October 12, 2014, in Paynesville, Liberia. Image credit: John Moore/Getty Images
A recent study revealed a surprising finding: Among patients infected during the 2014 West African Ebola epidemic, those who also had an active malaria parasite infection were significantly more likely to survive the Ebola virus. While just over half (52 percent) of Ebola patients not infected with malaria survived, those co-infected with malaria had a survival rate of 72 to 83 percent, depending on their age and the amount of Ebola virus in their blood.
What gives? Shouldn’t having a second, potentially deadly infection make you more likely to die of Ebola?
Maybe not. Though researchers aren’t yet sure of the mechanism by which malaria co-infection in Ebola patients might be protective, they have some ideas. The prevailing thought is that malaria is somehow modifying the immune response to Ebola, making it less deadly than in people who aren’t co-infected with the malaria parasite.
The authors of the study, published in the journal Clinical Infectious Diseases, note that malaria can make other infections less deadly. For example, in a group of children from Tanzania, those who had respiratory infections along with malaria were less likely to have those infections progress to pneumonia than kids who had respiratory infections alone.
It may be that malaria is able to tone down a phenomenon called the “cytokine storm”—the body’s own response to an Ebola infection that inadvertently kills the host while attempting to eliminate the pathogen. If malaria can turn this host response down, patients may have a better chance of surviving the virus’s assault.
This wouldn't be the first time that malaria infection has been hailed as a hero rather than an enemy. In 1927, the Nobel Prize in Physiology or Medicine was awarded to Julius Wagner-Jauregg "for his discovery of the therapeutic value of malaria inoculation in the treatment of dementia paralytica." As far back as 1887, Wagner-Jauregg and others had observed that syphilis sometimes seemed to be cured following "febrile infectious diseases." He also noted in his Nobel speech that he had "singled out as a particular advantage of malaria that there is the possibility of interrupting the disease at will by the use of quinine, but I did not then anticipate to what degree these expectations from induced malaria would be fulfilled." While there was no "cure" for syphilis at the time, and no cure for the other infection he had considered (erysipelas, usually caused by the same bacterium that causes strep throat and scarlet fever), malaria could be treated with quinine, a compound we still use today.
Before Wagner-Jauregg's "malariotherapy," treatments for syphilis included mercury, Salvarsan (an arsenic-containing drug), and bismuth, all of which had serious side effects, including death. Wagner-Jauregg's methods seemed to carry no more risk than the conventional treatments of the era, and in 1917, he injected nine individuals suffering from advanced syphilis with malaria parasites. He reported three of them to be cured and three more to have "extensive remission." Soon, malariotherapy spread across the U.S. and into Europe, with tens of thousands of syphilis patients treated with the malaria parasite.
However, the degree to which malariotherapy actually worked is still a matter of controversy. And it was not without its own serious side effects: death resulted in up to 15 percent of those treated. With the introduction of penicillin as a treatment for syphilis in the 1940s, malariotherapy was replaced, but decades of using malaria as a treatment significantly advanced our knowledge of the malaria parasite.
Today, scientists may be able to use this natural experiment to create drugs that mimic malaria's effect without actively infecting individuals. (Malaria is a devastating disease, causing hundreds of thousands of deaths every year, primarily in Africa.) Animal models could be used to tease apart the host's response to Ebola infection and determine how malaria alters that response to make it less deadly. Those insights could then guide new drugs or other interventions to treat Ebola infection.
More importantly, further study of the phenomenon of malaria co-infection with other pathogens could lead to changes in patient care. The current standard of care is to treat malaria infection whenever it is found in an Ebola case. But might it actually improve a patient's outcome to delay treatment for malaria? The authors of the current study note that in a mouse model of malaria-Ebola co-infection, treating the malaria led to death from Ebola infection in all animals. And yet during the 2014 Ebola outbreak, work carried out at one Ebola treatment center in Liberia showed that Ebola fatality rates decreased with effective malaria treatment. Complicating the matter, the antimalarial drug used there (artesunate-amodiaquine, or ASAQ) may itself have been responsible for the anti-Ebola activity.
While it’s unlikely that a malaria treatment for Ebola would be as popular (or legal or ethical) as the “malariotherapy” of the early 1900s, it’s certainly worth closely examining the clues this co-infection has provided scientists about the nature of both Ebola and malaria infections—and how we could harness them to fight against one of nature’s most frightening diseases.