What Is a GMO?


If you've followed the debate about GMOs with any sort of regularity, there's a strong chance you've come across a picture of a tomato stabbed by a giant syringe. That image, though a complete fiction, seems to perfectly capture what's preventing public acceptance of these foods: We don't really know what makes something a GMO.

GMOs aren't made with syringes and, at the moment, they aren't even made with tomatoes, at least not commercially. But that false image is everywhere, and surveys indicate consumers fear GMOs without knowing much about them.

So what exactly is a GMO?


The initialism stands for "genetically modified organism," but it's a term lacking scientific precision. Moreover, it's hard to find an organism in any way connected to humans that hasn't been genetically modified, says Alison Van Eenennaam, a geneticist at UC-Davis who specializes in animal biotechnology. "I might argue that a Great Dane or a Corgi is 'genetically modified' relative to their ancestor, the wolf," she tells Mental Floss. "'GMO' is not a very useful term. Modified for what and why is really the more important question."

GMOs are often described as if they were a recent invention of our industrial food system, but genetic modification of food isn't new at all. It's been happening for many millennia: As long as farmers have been saving high-performing seeds for future harvests, we've had GMOs. Perhaps the earliest known example of a GMO is the sweet potato, which scientists believe was modified when wild sweet potatoes were infected, quite naturally, by a particular kind of soil bacteria. Realizing these sweet potatoes were edible, people began saving the seeds and cultivating them for future harvests. That was about 8000 years ago.

These days, when people say "GMO," they tend to mean one particular modification method that scientists refer to as transgenesis. As Van Eenennaam explains, transgenesis is "a plant-breeding method whereby useful genetic variation is moved from one species to another using the methods of modern molecular biology, also known as genetic engineering."

Transgenic crops and animals have been modified with the addition of one or more genes from another living organism, using either a "gene gun," Agrobacteria—a genus of naturally occurring bacteria that insert DNA into plants—or electricity, in a process called electroporation.

The first commercial transgenic crops debuted in the early 1990s: a virus-resistant tobacco in China [PDF] and the Flavr Savr tomato in the U.S., which was genetically altered to not get "squishy." (It's no longer on the market.)

As to the health risks of GMO foods, the scientific consensus is clear: Transgenic crops are no riskier than other crops. Van Eenennaam points to a 20-year history of safe use that includes "thousands of studies, eleven National Academies reports, and indeed [the consensus of] every major scientific society in the world."


Today, the most ubiquitous transgenic crops in the U.S. food system are cotton, soybeans, and corn, including those modified to resist the effects of the herbicide Roundup. Branded "Roundup Ready," these crops have been modified so that farmers can apply the herbicide directly to crops to control weeds without killing the crops themselves.

For farmers, the result was better weed control and higher yields. For critics of GMOs, these crops became their smoking gun. These opponents argue they're bad for the planet and bad for our health.

There's no question that use of glyphosate, the active ingredient in the herbicide Roundup, has increased since the introduction of GMOs, but measuring its environmental impact is a far more complex equation. For example, as glyphosate use has increased, so has the prevalence of conservation tillage, a beneficial agricultural approach that helps sequester carbon in the soil and mitigate the impacts of climate change.

Bt crops—transgenic crops modified with genes from the naturally occurring soil bacterium Bacillus thuringiensis (Bt), which produce an insecticidal toxin—have also reduced the use of insecticide, according to a 2016 National Academies of Sciences report.

And though evidence suggests herbicide use has increased since Roundup Ready GMOs were first commercialized in the U.S., herbicide use has increased amongst some non-GMO crops, too. Glyphosate also replaced more toxic herbicides on the market and, if farmers were to stop using it, many would likely replace glyphosate with another herbicide, possibly one that's more toxic. Glyphosate-resistant weeds are a problem, but banning glyphosate, or glyphosate-resistant GMOs for that matter, wouldn't solve the problem.

In recent years, opponents of GMOs have increasingly aimed their fire at glyphosate. The source of many of these claims is a 2015 assessment [PDF] by the International Agency for Research on Cancer (IARC) that categorized glyphosate as "probably carcinogenic." That categorization has been hotly contested by many scientists, as other governmental agencies have concluded glyphosate does not pose a carcinogenic hazard. And, in June, reporting revealed that the lead researcher at IARC withheld important studies from the research group's consideration.

Weighing criticisms of glyphosate against its benefits certainly brings up complex issues in our agricultural system, but ultimately these issues are not unique to GMOs nor would they magically disappear if transgenic technology were eliminated altogether.


Most consumers probably can't name all the different methods of genetic modification, but there's a good chance they've eaten foods modified by one of these methods all the same. Layla Katiraee, a human molecular geneticist at Integrated DNA Technologies and a science communicator, has written about these methods to illustrate why it makes little sense to single out transgenic crops. Examples include polyploidy, which gave us the seedless watermelon, and mutagenesis, which scientists used to engineer a brightly colored grapefruit. As Katiraee points out, sometimes two different methods can even create a very similar end result. For example, the non-browning Opal apple was developed using traditional cross-breeding, while the non-browning Arctic apple uses transgenic methods to silence the genes that control browning.

Katiraee says the most common objections to GMOs aren't exclusive to transgenic crops: "Don't like 'Big Ag'? They use all methods of crop modification. Don't like herbicide-tolerant crops? They've been made with other methods. Don't like patents? Crops modified by all methods are patented. If you go through the list, you won't find one [objection] that applies exclusively to transgenesis."

Katiraee's arguments illustrate why it doesn't make sense to label transgenic crops "GMO" while omitting the non-browning Opal apple or a seedless watermelon. And the non-GMO label can often be misleading. Van Eenennaam points to one of the more ridiculous examples: non-GMO salt. "Salt doesn't contain DNA, so salt cannot be genetically engineered," she says. "All salt is 'non-GMO' salt."


The noisy GMO debate has often overshadowed the successes of lesser known, disease-resistant GMOs. Van Eenennaam argues that no one should object to these crops since protecting "plants and animals from disease aligns with most everyone's common interest in decreasing the use of chemicals in agricultural production systems, and minimizing the environmental footprint of food production." Examples include the ringspot virus–resistant papaya in Hawaii [PDF] and the American chestnut, both rescued from the devastating effects of lethal plant pathogens.

Disease-resistant crops often face an uphill battle for approval. In Uganda, scientists developed a disease-resistant banana that then faced difficult regulatory obstacles until a new law was finally approved in October by the country's Parliament. In Florida, where the disease called citrus greening has caused widespread crop damage and loss to the citrus industry, orange trees have been modified with a spinach gene to help the trees resist the disease. But orange juice manufacturers will have to persuade consumers to buy it.

Scientists have used transgenic modification to address health concerns too. For example, some variations of the wilt-resistant banana also include a boost of vitamin A. Scientists are working on a form of wheat that would be safe for people with celiac disease.

Van Eenennaam fears the controversy over GMOs has meant that, over the years, the public has missed out on important technologies. In the field of animal biotechnology, for example, animals have been produced that are resistant to disease, "that produce less pollution in their manure, [and] that have … elevated levels of omega-3 fatty acids," but none of these have been commercialized in the U.S.

Given that these crops and animals have a 20-year history of safe use, Van Eenennaam argues there's no reason that "fungus-resistant strawberries, disease-resistant bananas, and virus-resistant animals [should] sit on the shelf" unused.

Editor's note: This post has been updated. 

Essential Science
What Is Death?

The only thing you can be certain about in life is death. Or is it? Merriam-Webster defines death as "a permanent cessation of all vital functions." The Oxford English Dictionary refines that to "the permanent ending of vital processes in a cell or tissue." But determining when someone is dead is surprisingly complicated—the medical definition has changed over the centuries and, in many ways, is still evolving.


For most of human history, doctors relied on basic observations to determine whether or not a person had died. (This may be why so many feared being buried alive and went to great lengths to ensure they wouldn't be.) According to Marion Leary, the director of innovation research for the Center for Resuscitation Science at the University of Pennsylvania, "If a person wasn't visibly breathing, if they were cold and bluish in color, for example, they would be considered dead."

As time went on, the markers for death changed. Before the mid-1700s, for example, people were declared dead when their hearts stopped beating—a conclusion drawn from watching traumatic deaths such as decapitations, where the heart seemed to be the last organ to give up. But as our understanding of the human body grew, other organs, like the lungs and brain, were considered metrics of life—or death.

Today, that remains true to some degree; you can still be declared dead when your heart and lungs cease activity. And yet you can also be declared dead if both organs are still working, but your brain is not.

In most countries, being brain dead—meaning the whole brain has stopped working and cannot return to functionality—is the standard for calling death, says neuroscientist James Bernat, of the Geisel School of Medicine at Dartmouth College in New Hampshire. "A doctor has to show that the loss of brain function is irreversible," he tells Mental Floss. In some cases, a person can appear to be brain dead if they have overdosed on certain drugs or have suffered from hypothermia, for example, but the lack of activity is only temporary—these people aren't truly brain dead.

In the U.S., all states follow some form of the Uniform Determination of Death Act, which in 1981 defined a dead person as "an individual who has sustained either (1) irreversible cessation of circulatory and respiratory functions, or (2) irreversible cessation of all functions of the entire brain, including the brain stem."

But that's not the end of the story. In two states, New York and New Jersey, families can reject the concept of brain death if it goes against their religious beliefs. This makes it possible for someone to be considered alive in some states and dead in others.


In the past, if one of a person's three vital systems—circulation, respiration, and brain function—failed, the rest would usually stop within minutes of each other, and there was no coming back from that. But today, thanks to technological advances and medical breakthroughs, that's no longer necessarily the case. CPR can be performed to restart a heartbeat; a person who has suffered cardiac arrest can often be resuscitated within a 20- to 30-minute window (in rare cases, people have been revived after several hours). And since the 1950s, machines have been used to take on the role of many of the body's vital functions. People who stop breathing naturally can be hooked up to ventilators to move air in and out of their lungs, for example.

While remarkable, this life-extending technology has blurred the line between life and death. "A person can now have certain characteristics of being alive and others of being dead," Bernat says.

People with severe, irreversible brain damage fall into this mixed category. Many lie in intensive care units where ventilators breathe for them, but because they have minimal reflexes or movements, they're considered alive, especially by their families. Medical professionals, however, may disagree, leading to painful and complex debates about whether someone is alive.

Take the case of Jahi McMath, whose tonsil surgery in 2013, at age 13, went terribly wrong, leaving her brain dead—or so doctors thought. Her family refused to believe she was dead and moved her from Oakland, California, to New Jersey, where she was provided with feeding tubes in addition to her ventilator. After several months, her mother began recording videos that she said were proof that Jahi could move different parts of her body when asked to. Additional brain scans revealed that although some parts of her brain, like her brain stem, were largely destroyed, the structure of large parts of her cerebrum, which is responsible for consciousness, language, and voluntary movements, was intact. Her heart rate also changed when her mother spoke, leading a neurologist to declare last year, after viewing many of her mother's videos, that she is technically alive—nearly four years after she was pronounced brain dead. By her mother's reckoning, Jahi turned 17 on October 24, 2017.

Organ donation adds another layer of complications. Since an organ needs to be transplanted as quickly as possible to avoid damage, doctors want to declare death as soon as they can after a person has been disconnected from a machine. The protocol is usually to wait for five minutes after a donor's heart and breathing have stopped. However, some believe that's not long enough, since the person could still be resuscitated at that point.

Bernat—whose research interests include brain death and the definition of death, consciousness disorders including coma and vegetative states, and ethical and philosophical issues in neurology—disagrees. "I would argue that breathing and circulation has permanently ceased even if it hasn't irreversibly ceased," he says. "It won't restart by itself."


As resuscitation technology improves, scientists may find new ways to reverse death. One promising approach is therapeutic hypothermia. Sometimes used on heart attack patients who have been revived, the therapy uses cooling devices to lower body temperature, usually for about 24 hours. "It improves a patient's chance of recovering from cardiac arrest and the brain injury [from a lack of oxygen] that can result from it," says Leary, who specializes in research and education relating to cardiac arrest, CPR quality, and therapeutic hypothermia.

One more out-there possibility—which had its heyday in the early 2000s but still has its proponents today—is cryonic freezing, in which dead bodies (and in some cases, just people's heads) are preserved in the hope that they can be brought back once technology advances. Just minutes after death, a cryonaut's body is chilled; a chest compression device called a thumper keeps blood flowing through the body, which is then shot up with anticoagulants to prevent blood clots from forming; and finally, the blood is flushed out and replaced with a kind of antifreeze to halt the cell damage that usually occurs from freezing.

The idea is highly controversial. "It makes a good story for a movie, but it seems crazy to me," Bernat says. "I don't think it's the answer." But even if cryonics is out, Bernat does believe that certain types of brain damage now thought to be permanent could one day be subject to medical intervention. "There is currently a huge effort in many medical centers to study brain resuscitation," he says.

Genetics provides another potential frontier. Scientists recently found that some genes in mice and fish live on after they die. And even more surprisingly, other genes regulating embryonic development, which switch off when an animal is born, turn on again after death. We don't yet know if the same thing happens in humans.

Essential Science
What Is a Calorie?

The word calorie carries a lot of weight. We know we're supposed to avoid too many of them, but things get more complicated after that. What, exactly, are calories, and how do we burn them?


A calorie is a unit of heat energy that fuels your body, making it possible to move, breathe, think, sleep—and even digest food to make more energy.

While there is some disagreement about who first coined the term calorie, we know the French chemist Antoine Lavoisier used it in experiments he conducted during the winter of 1782–1783. He used a device called a calorimeter to measure how much ice melted in a metal container due to the heat emitted by guinea pigs housed inside it. Over time, that measurement was refined by other scientists to mean the amount of energy needed to raise the temperature of a kilogram of water by 1°C—what's known as a kilocalorie.
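That kilocalorie definition is just the standard water-heating formula, Q = m × c × ΔT, with water's specific heat plugged in. A minimal sketch (the ~4184 J/kg·°C value for water's specific heat is a standard physical constant, not a figure from this article):

```python
# The energy needed to heat water: Q = mass * specific_heat * temperature_change.
# One kilocalorie is, by the definition above, the energy to raise 1 kg of
# water by 1 degree Celsius.

SPECIFIC_HEAT_WATER = 4184.0  # joules per kg per °C (standard reference value)

def heating_energy_joules(mass_kg: float, delta_t_c: float) -> float:
    """Energy in joules to raise `mass_kg` of water by `delta_t_c` degrees C."""
    return mass_kg * SPECIFIC_HEAT_WATER * delta_t_c

# One kilocalorie: 1 kg of water raised by 1 °C.
one_kcal_in_joules = heating_energy_joules(1.0, 1.0)  # 4184.0 J
```

This is why nutrition science often reports energy in kilojoules: one food calorie (kilocalorie) is about 4184 joules.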

The food calorie and a kilocalorie (kcal) are technically the same thing, but we use the term calorie rather than kilocalorie because of an American chemist named Wilbur Olin Atwater. In the late 1880s, Atwater traveled to Germany to study at physiologist Carl Voit's laboratory, where Voit was researching the nutritional value of food and animal feed. Inspired by that research, Atwater took measurements of different foods with a bomb calorimeter—a device that essentially measures the heat in food when burned—by having study participants eat, and then measuring and subtracting [PDF] the amount of heat leaving their bodies through respiration and waste. He used a respiration calorimeter to measure their breath and a bomb calorimeter to burn their poop, and from that calculated just how many calories were left in their bodies to be used. When writing about his research, Atwater used the word calorie (kcal wouldn't be used in America until 1894, when it was published in a physiology textbook).

Based on his experiments, Atwater created a system for calculating the calories that human bodies can get from our food. There are three types of food nutrients that deliver caloric energy—fats, proteins, and carbohydrates—and Atwater arrived at a caloric measurement of each: A fat gram has nine calories, while a gram of protein and a gram of carbohydrates each have four. That system was modified [PDF] by USDA scientists in 1973, but it's otherwise still the basis for how calories are calculated today.
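The 4-9-4 arithmetic above is simple enough to sketch directly. Here's a minimal illustration of the Atwater system; the function name and example values are our own, not from Atwater or the USDA:

```python
# Atwater general factors: calories (kcal) contributed per gram of each
# macronutrient, as described in the text above.
ATWATER_FACTORS = {"fat": 9, "protein": 4, "carbohydrate": 4}

def atwater_calories(fat_g: float, protein_g: float, carb_g: float) -> float:
    """Estimate the calories in a food from its macronutrient grams."""
    return (fat_g * ATWATER_FACTORS["fat"]
            + protein_g * ATWATER_FACTORS["protein"]
            + carb_g * ATWATER_FACTORS["carbohydrate"])

# A hypothetical food with 10 g fat, 5 g protein, and 20 g carbohydrate:
# 10*9 + 5*4 + 20*4 = 190 calories.
example = atwater_calories(fat_g=10, protein_g=5, carb_g=20)  # 190
```

This is essentially the calculation behind the calorie counts on most food labels, with the caveats about digestibility discussed below.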


When you eat, enzymes in the mouth, stomach, and intestine break down nutrients, turning fats into fatty acids, sugars into simple sugars, and proteins into amino acids. Then cells throughout your body use oxygen to break these components down into energy—a process known as metabolism.

Most of the calories we burn each and every day are used just to keep our body functioning, with about half going toward powering our major organs—the brain, liver, kidneys, and heart. We use the rest for physical activity and the process of converting food to energy. Anything not used by the body is then stored, first in the liver and eventually as fat cells.

Some foods, like honey (carbohydrates), are easily digestible, whereas nuts (a mix of carbohydrates, fat, and protein) can't be fully digested. There are also digestibility differences within the same type of food. For example, in plants, older leaves tend to be sturdier (and therefore harder to digest) and less caloric than younger ones. Most significantly, especially in terms of human evolution, whenever we cook or process food, the body can get more calories as compared to that same food eaten raw. All of this has an impact on the number of calories we can actually use.

There's no food you can eat to speed up the rate at which you burn calories (changes from foods like spicy peppers are fleeting), but factors like age and rapid, drastic weight loss can slow it down.

Building more muscle can increase your metabolic rate (although how much is debatable), since muscle requires more energy to function than fat does. And while cardiovascular exercise might not permanently boost your metabolism, it does burn calories; just how much depends on your weight and how vigorously you exercise.

Examples of higher calorie burning exercises include cycling and running, but almost every activity burns something, so you could potentially burn more calories throughout the day by consistently doing low-energy activities like gardening or pacing during a conference call than you would during 30 minutes of fast cycling.


We still use the Atwater system for calculating food calories, but it's far from perfect. For one thing, a USDA study found that people absorbed fewer calories from nuts than had been estimated under Atwater's system—a serving of almonds, for example, provided not 170 calories, but 129. There's some evidence that people tend to digest food at all sorts of different rates too, depending on the individual makeup of our gut bacteria, meaning that the absorption of calories may differ from person to person.
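A quick back-of-the-envelope check on the almond figures above shows the size of that gap; the absorption-fraction calculation is just illustrative arithmetic, not a method from the USDA study:

```python
# Comparing the labeled (Atwater-estimated) calories in a serving of almonds
# with the calories the USDA study found people actually absorbed.
label_calories = 170
measured_calories = 129

absorbed_fraction = measured_calories / label_calories  # about 0.76
shortfall = label_calories - measured_calories          # 41 calories
```

In other words, people absorbed only about three quarters of the energy the label implied, a roughly 24 percent overestimate for that one food.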

Scientists now believe the numbers on food labels are more of an estimate than a precise measurement. While companies are required to provide caloric information on food labels, the FDA doesn't specify exactly how those calories should be calculated. Some companies, like McDonald's, send their food to a lab for measurement, while others estimate the total by adding up the calorie count for each food component from the USDA's massive food composition database. As scientists continue to refine how we calculate calories, we'll come to have a better idea of the energy we can actually get from these different foods.