What Is Antibiotic Resistance?


The news is full of terms like "superbug," "post-antibiotic era," and an alphabet soup of abbreviations including NDM-1, MCR-1 (both antibiotic resistance genes), MRSA (a type of antibiotic-resistant bacteria), and others. These all refer to various aspects of antibiotic resistance—the ability of bacteria to outmaneuver the drugs that are supposed to kill them and stop an infection.

Now, there is concern that we could move back into a situation like the one that existed in the early 20th century, before antibiotics were available—a so-called post-antibiotic era. Mental Floss spoke to Meghan Davis, a veterinarian and assistant professor of epidemiology at Johns Hopkins University, about some of the potential outcomes of losing antibiotics. "We have generations of recorded history that identify the risks to human society from infectious diseases that we are unable to treat or prevent," Davis warns.


If an individual becomes ill with a bacterial infection, they typically see their physician for treatment. But in the years before antibiotics were discovered, people frequently died from scenarios we now find difficult to fathom. Mere cuts or scratches could lead to untreatable infections. Ear infections or urinary tract infections could lead to sepsis (bacteria in the blood). Arms or legs were amputated before an infected wound could turn fatal.

When antibiotics were discovered, it's no surprise they were referred to as a "magic bullet" (or Zauberkugel in German, as conceived by medical pioneer Paul Ehrlich [PDF]). The drugs could wipe out an infection but not harm the host. They allowed people to recover from even the most serious of infections, and heralded a new era in medicine where people no longer feared bacteria.

Davis says the existence of antibiotics themselves has changed how we use medicine. Many medical procedures now rely on antibiotics to treat infections that may result from the intervention. "What is different about a post-antibiotic modern world is that we have established new patterns of behavior and medical norms that rely on the success of antimicrobial treatments," she says. "Imagine transplant or other major surgeries without the ability to control opportunistic infections with antibiotics. Loss of antibiotics would challenge many of our medical innovations."


One reason antibiotic resistance is difficult to control is that our antibiotics are derivatives of natural products. Our first antibiotic, penicillin, came from a common mold. Fungi, bacteria, parasites, and viruses all produce products to protect themselves as they battle each other in their microbial environments. We've taken advantage of the fruits of millions of years' worth of these invisible wars to harness antibiotics for our use. (This is also why we can find antibiotic resistance genes even in ancient bacteria that have never seen modern antibiotic drugs—because we've exploited the chemicals they use to protect themselves.)

These microbes have evolved ways to evade their enemies—antibiotic resistance genes. Sometimes the products of these genes will render the antibiotic useless by chopping it into pieces or pumping it out of the bacterial cell. Importantly, these resistance genes can be swapped among different bacterial species like playing cards. Sometimes the genes will be useless because the bacteria aren't being exposed to a particular drug, but sometimes they'll be dealt an ace and survive while others die from antibiotic exposure.

And many of these resistance genes are already out there in bacterial populations. Imagine that just one in a million bacterial cells growing in a human gut has a resistance gene already in its DNA. When a person takes a dose of antibiotics, all the susceptible bacteria will die off—but that one-in-a-million bacterium that can withstand the antibiotic suddenly has a lot of room to replicate, and the population of bacteria carrying that resistance gene will dramatically increase.
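The arithmetic of that selection event can be sketched in a few lines of code. This is a deliberately crude toy model, not a simulation of real gut ecology: the population size, the doubling-per-generation growth, and the assumption that the antibiotic kills every susceptible cell are all simplifications invented for the illustration.

```python
# Toy sketch of antibiotic selection (not a biological model).
# Assumed starting point: a gut population of 1,000,000 bacteria,
# exactly 1 of which carries a resistance gene.

susceptible = 999_999
resistant = 1

# A course of antibiotics kills the susceptible cells but spares
# the resistant one (an idealized assumption for this sketch).
susceptible = 0

# With the competition gone, the surviving cell doubles each
# generation until the population returns to its former size.
population = resistant
generations = 0
while population < 1_000_000:
    population *= 2
    generations += 1

print(generations)  # 20 doublings rebuild the population
print(population)   # 1048576 cells, every one carrying the resistance gene
```

The point of the sketch: it takes only about 20 rounds of division for a single resistant survivor to rebuild the entire population, which is why a gene carried by one cell in a million can come to dominate after a single course of treatment.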

If the person then transfers those resistant gut bacteria to others, resistance can spread as well. This is why it's important to keep control over antibiotic use in all populations—because someone else's use of the drugs can potentially make your own bacteria resistant to antibiotics. This is also why hand washing is important: You can unknowingly pick up new bacteria all the time from other people, animals, or surfaces. Washing your hands will send most of these passenger bacteria down the sink drain, instead of allowing them to live on your body.


Most importantly, never ask your doctor for antibiotics; if you have a bacterial infection that can be treated with antibiotics, your doctor will prescribe them. Many illnesses are caused by viruses (such as the common cold), but antibiotics only work against bacteria. It is useless to take antibiotics for a viral infection, and doing so will only breed resistance in the other bacteria living in your body, which can predispose you or others in your household and community to developing an antibiotic-resistant infection. Remember, those resistant bacteria can linger in your body—in your gut, on your skin, in your mouth, and elsewhere—and can pass resistance genes from the mostly harmless bacteria you live with to the nasty pathogens you may encounter, further spreading resistance in the population.

Antibiotics are also used in animals, including livestock. Purchasing meat that is labeled "raised without antibiotics" will reduce your chance of acquiring antibiotic-resistant bacteria that are generated on the farm and can be spread via meat products.

Davis notes that clients often requested antibiotics for their pets as well, even for conditions that did not require them; she explained to them why antibiotics were not necessary. She counsels, "Individuals can partner with their physician and veterinarian to promote good antimicrobial stewardship. Use of antibiotics carries risks, and these risks are related both to side effects and to promotion of resistance. Therefore, decisions to use antibiotics should be treated with caution and deliberation."

Essential Science
What Is a Scientific Theory?
Dean Mouhtaropoulos/Getty Images

In casual conversation, people often use the word theory to mean "hunch" or "guess": If you see the same man riding the northbound bus every morning, you might theorize that he has a job in the north end of the city; if you forget to put the bread in the breadbox and discover chunks have been taken out of it the next morning, you might theorize that you have mice in your kitchen.

In science, a theory is a stronger assertion. Typically, it's a claim about the relationship between various facts; a way of providing a concise explanation for what's been observed. The American Museum of Natural History puts it this way: "A theory is a well-substantiated explanation of an aspect of the natural world that can incorporate laws, hypotheses and facts."

For example, Newton's theory of gravity—also known as his law of universal gravitation—says that every object, anywhere in the universe, responds to the force of gravity in the same way. Observational data from the Moon's motion around the Earth, the motion of Jupiter's moons around Jupiter, and the downward fall of a dropped hammer are all consistent with Newton's theory. So Newton's theory provides a concise way of summarizing what we know about the motion of these objects—indeed, of any object responding to the force of gravity.

A scientific theory "organizes experience," James Robert Brown, a philosopher of science at the University of Toronto, tells Mental Floss. "It puts it into some kind of systematic form."


A theory's ability to account for already known facts lays a solid foundation for its acceptance. Let's take a closer look at Newton's theory of gravity as an example.

In the late 17th century, the planets were known to move in elliptical orbits around the Sun, but no one had a clear idea of why the orbits had to be shaped like ellipses. Similarly, the movement of falling objects had been well understood since the work of Galileo a half-century earlier; the Italian scientist had worked out a mathematical formula that describes how the speed of a falling object increases over time. Newton's great breakthrough was to tie all of this together. According to legend, his moment of insight came as he gazed upon a falling apple in his native Lincolnshire.

In Newton's theory, every object is attracted to every other object with a force that's proportional to the masses of the objects, but inversely proportional to the square of the distance between them. This is known as an "inverse square" law. For example, if the distance between the Sun and the Earth were doubled, the gravitational attraction between the Earth and the Sun would be cut to one-quarter of its current strength. Newton, using his theories and a bit of calculus, was able to show that the gravitational force between the Sun and the planets as they move through space meant that orbits had to be elliptical.
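The inverse-square relationship is easy to check numerically. The short sketch below plugs approximate Sun-Earth values into Newton's formula F = G·m₁·m₂/r² and confirms the claim in the text that doubling the distance cuts the force to one quarter. The constants are standard textbook values, rounded for illustration.

```python
# Newton's law of universal gravitation: F = G * m1 * m2 / r**2

G = 6.674e-11        # gravitational constant, N·m²/kg²
M_SUN = 1.989e30     # mass of the Sun, kg (approximate)
M_EARTH = 5.972e24   # mass of the Earth, kg (approximate)
R = 1.496e11         # mean Sun-Earth distance, m (approximate)

def gravity(m1, m2, r):
    """Gravitational attraction between two point masses, in newtons."""
    return G * m1 * m2 / r**2

f_now = gravity(M_SUN, M_EARTH, R)
f_doubled = gravity(M_SUN, M_EARTH, 2 * R)

# Doubling the distance cuts the force to exactly one quarter.
print(f_now / f_doubled)  # → 4.0
```

Because the distance enters the formula squared, moving twice as far away weakens the pull by a factor of four, and moving three times as far weakens it by a factor of nine.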

Newton's theory is powerful because it explains so much: the falling apple, the motion of the Moon around the Earth, and the motion of all of the planets—and even comets—around the Sun. All of it now made sense.


A theory gains even more support if it predicts new, observable phenomena. The English astronomer Edmond Halley used Newton's theory of gravity to calculate the orbit of the comet that now bears his name. Taking into account the gravitational pull of the Sun, Jupiter, and Saturn, in 1705, he predicted that the comet, which had last been seen in 1682, would return in 1758. Sure enough, it did, reappearing in December of that year. (Unfortunately, Halley didn't live to see it; he died in 1742.) The predicted return of Halley's Comet, Brown says, was "a spectacular triumph" of Newton's theory.

In the early 20th century, Newton's theory of gravity would itself be superseded—as physicists put it—by Einstein's theory, known as general relativity. (Where Newton envisioned gravity as a force acting between objects, Einstein described gravity as the result of a curving or warping of space itself.) General relativity was able to explain certain phenomena that Newton's theory couldn't account for, such as an anomaly in the orbit of Mercury. Mercury's orbit slowly rotates—the technical term is "precession"—so that while each loop the planet takes around the Sun is an ellipse, over the years Mercury traces out a spiral path, similar to one you may have made as a kid with a Spirograph.

Significantly, Einstein's theory also made predictions that differed from Newton's. One was the idea that gravity can bend starlight, which was spectacularly confirmed during a solar eclipse in 1919 (and made Einstein an overnight celebrity). Nearly 100 years later, in 2016, the discovery of gravitational waves confirmed yet another prediction. In the century between, at least eight predictions of Einstein's theory have been confirmed.


And yet physicists believe that Einstein's theory will one day give way to a new, more complete theory. It already seems to conflict with quantum mechanics, the theory that provides our best description of the subatomic world. The way the two theories describe the world is very different. General relativity describes the universe as containing particles with definite positions and speeds, moving about in response to gravitational fields that permeate all of space. Quantum mechanics, in contrast, yields only the probability that each particle will be found in some particular location at some particular time.

What would a "unified theory of physics"—one that combines quantum mechanics and Einstein's theory of gravity—look like? Presumably it would combine the explanatory power of both theories, allowing scientists to make sense of both the very large and the very small in the universe.


Let's shift from physics to biology for a moment. It is precisely because of its vast explanatory power that biologists hold Darwin's theory of evolution—which allows scientists to make sense of data from genetics, physiology, biochemistry, paleontology, biogeography, and many other fields—in such high esteem. As the biologist Theodosius Dobzhansky put it in an influential essay in 1973, "Nothing in biology makes sense except in the light of evolution."

Interestingly, the word evolution can be used to refer to both a theory and a fact—something Darwin himself realized. "Darwin, when he was talking about evolution, distinguished between the fact of evolution and the theory of evolution," Brown says. "The fact of evolution was that species had, in fact, evolved [i.e. changed over time]—and he had all sorts of evidence for this. The theory of evolution is an attempt to explain this evolutionary process." The explanation that Darwin eventually came up with was the idea of natural selection—roughly, the idea that an organism's offspring will vary, and that those offspring with more favorable traits will be more likely to survive, thus passing those traits on to the next generation.
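The logic of natural selection described above can be caricatured in a few lines of code. The sketch below is a toy illustration, not a real population-genetics model; the population size, the survival probabilities, and the trait labels are all invented for the example.

```python
# Toy caricature of natural selection (not a population-genetics model).
# Offspring inherit a parent's trait, and the trait with better survival
# odds comes to dominate over generations.
import random

random.seed(42)  # fixed seed so the toy run is reproducible

# Assumed starting population: 10% carry a favorable trait.
population = ["favorable"] * 10 + ["ordinary"] * 90

# Assumed per-generation survival chances for each trait.
SURVIVAL = {"favorable": 0.8, "ordinary": 0.5}

for generation in range(20):
    # Each individual survives with a probability set by its trait.
    survivors = [ind for ind in population if random.random() < SURVIVAL[ind]]
    # Survivors reproduce back up to 100; offspring inherit the trait.
    population = [random.choice(survivors) for _ in range(100)]

share = population.count("favorable") / len(population)
print(share)  # fraction carrying the favorable trait after 20 generations
```

Even a modest survival advantage compounds: after 20 generations the favorable trait, which started in only one individual in ten, makes up nearly the whole population. This compounding is the "differential survival" at the heart of Darwin's explanation.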


Many theories are rock-solid: Scientists have just as much confidence in the theories of relativity, quantum mechanics, evolution, plate tectonics, and thermodynamics as they do in the statement that the Earth revolves around the Sun.

Other theories, closer to the cutting-edge of current research, are more tentative, like string theory (the idea that everything in the universe is made up of tiny, vibrating strings or loops of pure energy) or the various multiverse theories (the idea that our entire universe is just one of many). String theory and multiverse theories remain controversial because of the lack of direct experimental evidence for them, and some critics claim that multiverse theories aren't even testable in principle. They argue that there's no conceivable experiment that one could perform that would reveal the existence of these other universes.

Sometimes more than one theory is put forward to explain observations of natural phenomena; these theories might be said to "compete," with scientists judging which one provides the best explanation for the observations.

"That's how it should ideally work," Brown says. "You put forward your theory, I put forward my theory; we accumulate a lot of evidence. Eventually, one of our theories might prove to obviously be better than the other, over some period of time. At that point, the losing theory sort of falls away. And the winning theory will probably fight battles in the future."

Essential Science
How Are Vaccines Made?
Quality checks on the Salk polio vaccine at Glaxo's virus research laboratory in Buckinghamshire, UK, in January 1956. Photo by Fox Photos/Getty Images

Vaccines have long been hailed as one of our greatest public health achievements. They can be made to protect us against either viral or bacterial pathogens. Measles and smallpox, for example, are caused by viruses; Streptococcus pneumoniae is a bacterium that causes a range of diseases, including pneumonia, ear and sinus infections, and meningitis. Hundreds of millions of illnesses and deaths have been prevented by vaccines, which eradicated smallpox and dramatically reduced polio and measles infections. However, some misunderstanding remains about how vaccines are made, and why some scary-sounding ingredients [PDF] are included in the manufacturing process.

The production of our vaccines has evolved greatly since the early days, when vaccination was potentially dangerous. Inoculating an individual with ground-up smallpox scabs (a practice called "variolation") usually led to a mild infection and protected them from acquiring the disease the "regular" way (via the air). But there was always a chance the infection could be severe. When Edward Jenner introduced the first true vaccination with cowpox, protection from smallpox became safer, but there were still issues: The cowpox material could be contaminated with other germs, and sometimes was transmitted from one vaccinated person to another, leading to the inadvertent spread of blood-borne pathogens. We've come far in the last 200 years.

There are different kinds of vaccines, and each requires different processes to move from the laboratory to your physician's office. The key to all of them is production of one or more antigens—the portion of the microbe that triggers a host immune response.


There are several methods to produce antigens. One common technique is to grow a virus in what's called a cell culture. Typically grown in large vats called bioreactors, living cells are inoculated with a virus and placed in a liquid growth medium that contains nutrients—proteins, amino acids, carbohydrates, essential minerals—that help the virus grow in the cells, producing thousands of copies of itself in each infected cell. At this stage the virus is also getting its own dose of protective medicine: antibiotics like neomycin or polymyxin B, which prevent bacterial and fungal contamination that could kill the cells serving as hosts for the virus.

Once a virus completes its life cycle in the host cell, the viruses are purified by separating them from the host cells and growth media, which are discarded. This is often done using several different types of filters; the viruses are small and can pass through holes in the filter that trap larger host cells and cell debris.

This is how "live attenuated vaccines" are created. These vaccines contain viruses that have been modified so they are no longer harmful to humans. Some are grown for many generations in non-human cells, such as chicken cells, until they accumulate mutations that keep them from causing disease in humans. Others, like the influenza nasal mist, were grown at low temperatures until they lost the ability to replicate at the warmer temperatures of the lungs. Many of these vaccines you were probably given as a child: measles, mumps, rubella ("German measles"), and chickenpox.

Live attenuated vaccines replicate briefly in the body, triggering a strong—and long-lasting—response from your immune system. Because your immune system kicks into high gear against what it perceives to be a major threat, you need fewer doses of the vaccine for protection against these diseases. And because the attenuated viruses replicate only at low levels, it is extremely unlikely that these vaccines will cause the host to develop the actual disease or spread it to other contacts. One exception is the live polio vaccine, which could spread to others and, extremely rarely, caused polio disease (approximately one case of polio per 3 million doses of the vaccine). For this reason, the live polio vaccine was discontinued in the United States in 2000.

Scientists use the same growth technique for what are known as "killed" or "inactivated" vaccines, but they add an extra step: killing the virus. The viruses are inactivated, typically via heat treatment or a chemical such as formaldehyde, which modifies the virus's proteins and nucleic acids and renders it unable to replicate. Inactivated vaccines include the Hepatitis A vaccine, the injected polio vaccine, and the flu shot.

A dead virus can't replicate in your body, obviously. This means that the immune response to inactivated vaccines isn't as robust as it is with live attenuated vaccines; replication by the live viruses alerts many different types of your immune cells of a potential invader, while killed vaccines primarily alert only one part of your immune system (your B cells, which produce antibodies). That's why you need more doses to achieve and maintain immunity.

While live attenuated vaccines were the primary way to make vaccines until the 1960s, concerns about potential safety issues, and the difficulty of making them, mean that few are attempting to develop new live attenuated vaccines today.


Other vaccines aren't made of whole organisms at all, but rather bits and pieces of a microbe. The combination vaccine that protects against diphtheria, pertussis, and tetanus—all at once—is one example. This vaccine is called the DTaP for children, and Tdap for adults. It contains toxins (the proteins that cause disease) from diphtheria, pertussis, and tetanus bacteria that have been inactivated by chemicals. (The toxins are called "toxoids" once inactivated.) This protects the host—a.k.a. you, potentially—from developing clinical diphtheria and tetanus disease, even if you are exposed to the microorganisms. (Some viruses have toxins—Ebola appears to, for example—but they're not the key antigens, so they're not used for our current vaccines.)

As they do when developing live attenuated or inactivated vaccines, scientists who create these bacterial vaccines need some target bacteria to culture. But because the bacteria don't need a host cell to grow, they can be produced in simple nutrient broths by vaccine manufacturers. The toxins are then separated from the rest of the bacteria and growth media and inactivated for use as vaccines.

Similarly, some vaccines contain just a few antigens from a bacterial species. Vaccines for Streptococcus pneumoniae, Haemophilus influenzae type B, and Neisseria meningitidis all use sugars that are found on the outer part of the bacteria as antigens. These sugars are purified from the bacteria and then bound to another protein to enhance the immune response. The protein helps to recruit T cells in addition to B cells and create a more robust reaction.

Finally, we can also use genetic engineering to produce vaccines. We do this for Hepatitis B, a virus that can cause severe liver disease and liver cancer. The vaccine for it consists of a single antigen: the hepatitis B surface antigen, which is a protein on the outside of the virus. The gene that makes this antigen is inserted into yeast cells; these cells can then be grown in a medium similar to bacteria and without the need for cell culture. The hepatitis B surface antigen is then separated from the yeast and serves as the primary vaccine component.


Once you have the live or killed viruses, or purified antigens, sometimes chemicals need to be added to protect the vaccine or to make it work better. Adjuvants, such as aluminum salts, are a common additive; they help enhance the immune response to some antigens by keeping the antigen in contact with the cells of the immune system for a longer period of time. Vaccines for DTaP/Tdap, meningitis, pneumococcus, and hepatitis B all use aluminum salts as an adjuvant.

Other chemicals may be added as stabilizers, to help keep the vaccine working effectively even in extreme conditions (such as hot temperatures). Stabilizers can include sugars or monosodium glutamate (MSG). Preservatives can be added to prevent microbial growth in the finished product.

For many years, the most common preservative was a compound called thimerosal, which is 50 percent ethylmercury by weight. Ethylmercury doesn't stick around; your body quickly eliminates it via the gut and feces. (This is different from methylmercury, which accumulates in fish and can, at high doses, cause long-lasting damage in humans.) In 2001, thimerosal was removed from the vaccines given in childhood due to consumer concerns, but many studies have demonstrated its safety.

Finally, the vaccine is divided into vials for shipping to physicians, hospitals, public health departments, and some pharmacies. These can be single-dose or multi-dose vials, which can be used for multiple patients as long as they're prepared and stored away from patient treatment areas. Preservatives are important for multi-dose vials: bacteria and fungi are very opportunistic, and multiple uses increase the potential for contamination of the vaccine. This is why thimerosal is still used in some multi-dose influenza vaccines.

Though some of the vaccine ingredients sound worrisome, most of these chemicals are removed during multiple purification steps, and those that remain (such as adjuvants) are necessary for the vaccine's effectiveness, are present in very low levels, and have an excellent track record of safety.