That Sugar Rush Is All In Your Head


We've all heard of the "sugar rush." It's a vision that prompts parents and even teachers to snatch candy away from kids, fearing they'll soon be bouncing off the walls, wired and hyperactive. It's a myth American culture has clung to for decades—and these days, it's not just a kid thing. Adults are wary of sugar, too. Some of this fear is warranted—diabetes, the obesity epidemic—but the truth is, sugar doesn't cause hyperactivity, and eating it doesn't send the body soaring and then crashing. The science is clear: There is no "sugar rush."

To find out how and why the myth started, we need to go back to well before World War I—then pay a visit to the 1970s.

Our Complicated Relationship With Sugar

According to cultural historian Samira Kawash, America has had a long, complex, love-hate relationship with sugar. In Candy: A Century of Panic and Pleasure, Kawash traces the turn from candy-as-treat to candy-as-food in the early 20th century. At that time, scientists' dietary recommendations included a mix of carbohydrates, proteins, and fats, with sugar deemed essential for energy.

Not everyone was on board: The temperance movement, for example, pushed the idea that sugar caused an intoxication similar to alcohol, making candy-eaters sluggish, loopy, and overstimulated. In 1907, the chief of the Philadelphia Bureau of Health declared that the "appetite" for candy and alcohol was "one and the same," Kawash writes. On the flip side, other scientists suggested that sugar from candy could stave off cravings for alcohol—a suggestion that candymakers then used in their advertisements.

While the debate about sugar as an energy source raged in America, militaries around the world were also exploring sugar as energy for soldiers. In 1898, the Prussian war office became the first to commission a study on the sweet stuff—with promising results: "Sugar in small doses is well-adapted to help men to perform extraordinary muscular labor," early researchers wrote. German military experiments introduced candy and chocolate cakes as fortification for the troops, and the U.S. military added sugary foods to soldiers' diets soon after. When American soldiers returned from World War I, they craved sweets, which "propelled an enormous boom" of candy sales that has lasted to this day, Kawash wrote on her blog, The Candy Professor. American advertisers framed candy as a quick, easy source of energy for busy adults during their workday.

As artificial sweeteners moved into kitchens in the 1950s, candymakers struggled to make their products appeal to women who were watching their waistlines. One industry group, Sugar Information Inc., produced a tiny "Memo to Dieters" pamphlet in 1954 designed to fit inside chocolate boxes. "Sugar before meals raises your blood sugar level and reduces your appetite," it claimed. But by the 1970s, the sugar-positivity heyday had started to wane.

The Origins of the Sugar Rush Myth

The idea that sugar causes hyperactivity gained traction in the early 1970s, when more attention was being paid to how diet might affect behavior. One of the major figures studying the possible connection was an allergist named Benjamin Feingold, who hypothesized that certain food additives, including dyes and artificial flavorings, might lead to hyperactivity. He formalized this into a popular—yet controversial—elimination diet program. Though certain sugary foods were banned from the program for containing dyes and flavorings, sugar itself was never formally prohibited. Still, thanks in part to the Feingold diet, sugar became the poster child for diet-driven hyperactivity.

It wasn't until the late 1980s that serious doubts about sugar's connection to hyperactivity began to be raised by scientists. As FDA historian Suzanne White Junod wrote in 2003 [PDF], the 1988 Surgeon General's Report on Nutrition and Health concluded that "alleged links between sugar consumption and hyperactivity/attention deficit disorders in children had not been scientifically supported." Despite "mothers' mantra of no sweets before dinner," she noted, "more serious allegations of adverse pediatric consequences … have not withstood scientific scrutiny."

A 1994 paper found that aspartame—an artificial sweetener that had also been accused of inducing hyperactivity in children—had no effect on 15 children with ADHD, even though they had consumed 10 times more than the typical amount.

A year later, the Journal of the American Medical Association published a meta-analysis of the effect of sugar on children's behavior and cognition. It examined data from 23 studies that were conducted under controlled conditions: In every study, some children were given sugar, and others were given an artificial sweetener placebo like aspartame. Neither researchers nor children knew who received the real thing. The studies recruited neurotypical children, kids with ADHD, and a group who were "sensitive" to sugar, according to their parents.

The analysis found that "sugar does not affect the behavior or cognitive performance of children." (The authors did note that “a small effect of sugar or effects on subsets of children cannot be ruled out.”)

"So far, all the well-controlled scientific studies examining the relationship between sugar and behavior in children have not been able to demonstrate it," Mark Wolraich, an emeritus professor of pediatrics at the University of Oklahoma Health Sciences Center who has worked with children with ADHD for more than 30 years and the co-author of that 1995 paper, tells Mental Floss.

Yet the myth that consuming sugar causes hyperactivity hasn’t really gone away. One major reason is the placebo effect, which can have powerful results. The idea that you or your children might feel a "sugar rush" from too much candy isn't unlike the boost you hope to feel from an energy drink or a meal replacement shake or bar (which can contain several teaspoons of sugar). The same is true for parents who claim that their kids seem hyperactive at a party. Peer pressure and excitement seem to be to blame—not sugar.

"The strong belief of parents [in sugar's effects on children's behavior] may be due to expectancy and common association," Wolraich wrote in the JAMA paper.

It works the other way, too: Some parents say they've noticed a difference in their kids' behavior once they cut most sugar from their diets. This strategy, like the Feingold diet, continues to attract interest and followers because believing it works shapes whether it appears to work.

Correlation, Causation, and Caffeine

Which isn't to say there are absolutely no links between sugar consumption and poor health outcomes. A 2006 paper found that drinking a lot of sugary soft drinks was associated with mental health issues, including hyperactivity, but the study's design relied on self-reported questionnaires that were filled out by more than 5000 10th-graders in Oslo, Norway. The authors also noted that caffeine is common in colas, which might have a confounding effect.

In another study, conducted by University of Vermont professor of economics Sara Solnick and Harvard health policy professor David Hemenway, the researchers investigated the so-called "Twinkie defense," in which sugar is said to contribute to an "altered state of mind." (The phrase Twinkie defense comes from the 1979 trial of Dan White for killing San Francisco city supervisor Harvey Milk and Mayor George Moscone. His lawyers argued that White had "diminished capacity and was unable to premeditate his crime," as evidenced in part by his sudden adoption of a junk-food diet in the months before the murders. White was convicted of voluntary manslaughter.)

In their survey of nearly 1900 Boston public high schoolers, Solnick and Hemenway found "a significant and strong association between soft drinks and violence." Adolescents who drank more than five cans of soft drinks per week—nearly 30 percent of the group—were significantly more likely to have carried a weapon.

But Solnick tells Mental Floss the study isn't evidence of a "sugar rush."

"Even if sugar did cause aggression—which we did not prove—we have no way of knowing whether the effect is immediate (and perhaps short-lived) as the phrase 'sugar rush' implies, or whether it’s a longer-term process," she says. Sugar could, for example, increase irritability, which might sometimes flare up into aggression—but not as an immediate reaction to consuming sugar.

Harvard researchers are looking into the long-term effects of sugar using data from Project Viva, a large observational study of pregnant women, mothers, and their children. A 2018 paper in the American Journal of Preventive Medicine studied more than 1200 mother-child pairs from Project Viva, assessing mothers' self-reported diets during pregnancy as well as their children's health during early childhood.

"Sugar consumption, especially from [sugar-sweetened beverages], during pregnancy and childhood, and maternal diet soda consumption may adversely impact child cognition,” the authors concluded, though they noted that other factors could explain the association.

“This study design can look at relationships, but it cannot determine cause and effect,” says Wolraich, who was not involved in the study. "It is equally possible that parents of children with lower cognition are likely to cause a greater consumption of sugar or diet drinks, or that there is a third factor that influences cognition and consumption.”

The Science of the Sugar Crash

Though the evidence against the sugar rush is strong, a "sugar crash" is real—but typically it only affects people with diabetes.

According to the National Institute of Diabetes and Digestive and Kidney Diseases, low blood sugar—or hypoglycemia—is a serious medical condition. When a lot of sugar enters the bloodstream, it can spike the blood sugar level, causing fluctuation, instability, and eventually a crash—which is called reactive hypoglycemia. If a diabetic's blood sugar levels are too low, a number of symptoms—including shakiness, fatigue, weakness, and more—can follow. Severe hypoglycemia can lead to seizures and even coma.

For most of us, though, it's rare. Endocrinologist Dr. Natasa Janicic-Kahric told The Washington Post that "about 5 percent of Americans experience sugar crash."

You're more likely to experience it if you do a tough workout on an empty stomach. "If one exercises vigorously and doesn't have sufficient intake to supplement their use of calories, they can get lightheaded," Wolraich says. "But in most cases, the body is good at regulating a person's needs."

So what you're attributing to sugar—the highs and the lows—is probably all in your head.

9 Myths About Theodore Roosevelt


Our 26th president was a man larger than life—and is forever much larger than life, thanks to the fact that he's on the side of a mountain. But as with any such figure, myths and legends arise. So we’re here to explain the truth behind some popular stories about Theodore Roosevelt.

1. Myth: Theodore Roosevelt pronounced his name differently than Franklin Delano Roosevelt.

There has long been disagreement about how to pronounce "Roosevelt." A 1902 New York Times article listed 14 different possibilities, from “ROSA-FELT” to “ROOZE-VELT,” “RUZY-VELL” to “RUZA-FELT.” The next year, Richard Mayne of the Department on Reading and Speech Culture, New York State Teachers’ Association, wrote to the Sun that the name was subject to 200 different pronunciations, but that most people pronounced the first syllable like "room." And legend has it that the two Roosevelt presidents pronounced their names differently. According to a 1984 article in the Washington Post, “Theodore Roosevelt's name rhymed with ‘goose.’ It was, to switch spellings a bit, ‘Ruse-a-velt.’ Franklin Roosevelt, a distant cousin, pronounced his name to rhyme with ‘rose’—‘Rose-a-velt.’ Since FDR served later and longer, his version has been generally adopted.”

Not so fast: We know that's not true, from TR's own pen. “As for my name, it is pronounced as if it was spelled ‘Rosavelt.’ That is in three syllables. The first syllable as if it was ‘Rose,’” he wrote in 1898. (He was used to the confusion, though; he wrote to his parents during his freshman year at Harvard that one of his teachers called him Rusee-felt, and that "hardly any one can get my name correctly, except as Rosy.") Later, FDR would confirm the same: In 1932, the Chicago Tribune verified with FDR's office—he was governor of New York at the time—that it was pronounced “Rose-a-velt.”

They weren't the only Roosevelts to weigh in: When Mayne wrote that most people pronounced the first syllable like "room," Theodore's uncle, Robert Barnwell Roosevelt, submitted a rebuttal. “It is rather a dangerous proceeding to assume that a man does not know how to pronounce his own name,” he wrote to the Sun, explaining that the family pronounced it “Rose-(uh)-velt.”

The two presidents may have agreed on the first part of their name, but maybe not the -velt part. Traditionally, Roosevelt is pronounced -velt, but in recordings of his many inaugurations, FDR pronounces his last name more like "rose-a-vult." So if a pronunciation difference does exist, it might be nearer the end of the name.

2. Myth: Theodore Roosevelt rode a moose.

It’s a dramatic picture to be sure—Theodore Roosevelt riding a moose through a lake. It’s so ridiculously manly that it’s sometimes featured on lists of photographs you won’t believe aren’t photoshopped. But while this image wasn’t created using the popular image editing software, it’s still just as fake. It was part of a collage created for the 1912 presidential election, featuring Taft riding an elephant, Roosevelt riding a moose, and Wilson riding a donkey. In 2013, Houghton Library published a blog post detailing the story, with author Heather Cole explaining that it appears to have been an image of Roosevelt riding a horse where Roosevelt was cut out and pasted onto a separate picture of a moose. This also explains why focus, shadows, and most other features don’t match up between man and steed.

3. Myth: Theodore Roosevelt created the modern image of piranhas.

It's a story that has featured in countless adventure novels—a member of an expedition goes to the shore of the Amazon with just his mule. The mule returns to camp alone, causing a frantic search for the missing person. They come to the water's edge and see a devoured skeleton. The culprit? Piranhas. Except that's not from any dime novel; it's a story related to Roosevelt by his companions that appears in Through the Brazilian Wilderness, Roosevelt's 1914 book detailing his adventures in South America.

The book features several stories about piranhas: that they’ll “snap a finger off a hand incautiously trailed in the water” and can devour a cow alive. "The head with its short muzzle, staring malignant eyes, and gaping, cruelly armed jaws, is the embodiment of evil ferocity," he wrote.

Roosevelt's book is also commonly cited as being the origin of the reputation of piranhas as ferocious carnivores. But he wasn't the first to make that claim.

In 1880, Scientific American declared, “They make nothing of biting an ounce or so of flesh from a man’s leg. People are sometimes killed by them. Hence Brazilians are shy of going into these lakes and streams if they suspect the presence of these fish. The fishermen claim that piranhas will gather in schools against the larger fish and attack them.” And an account from around 30 years before Roosevelt was born notes that “The horses and cattle sip only from the [water’s] surface, and hardly dip their nose below it; notwithstanding which it is often bitten off. Even the cayman flies before this fierce enemy, and turns its belly, which is not provided with scales, to the surface of the water: only the otter, whose thick fur resists the effect of the bite, is secure against its attacks.”

But even if Roosevelt wasn't the origin of the myth, he likely did much to cement in the public's mind the idea that piranhas are bloodthirsty creatures. In reality, the fish are typically pretty relaxed ... until they're spooked. And they more typically scavenge for their dinner. Some species are even vegetarians.

4. Myth: Theodore Roosevelt cured his asthma with exercise.

In 2015, two researchers examined Theodore Roosevelt's asthma, including the story that he cured it through exercise when he was around 12 years old. They found multiple references to asthma attacks when Roosevelt was an adult, such as after his first wife died and during a pillow fight with his children in the White House. Once, when his second wife was in labor, he raced home by train, prompting his daughter to remark, "Both the engine and my father arrived in Oyster Bay wheezing."

The researchers ultimately concluded that “in hindsight, it seems more likely that the improvement was coincident with the quiescence of asthma often seen in adolescence,” so Roosevelt himself may not have been completely responsible for his improved condition.

As for how the myth was perpetuated? Well, Roosevelt biographer Kathleen Dalton has an answer for that. "He ... encouraged his friends and authorized biographers to tell an upbeat, socially acceptable, stiff-upper-lipped version of his life," she writes. "He began, and they perpetuated, the myth that by force of will he cured himself of asthma." As his sister Corinne would write to a biographer, "he never did recover in a definite way—and indeed suffered from it all his life, though in later years only separated at long intervals."

5. Myth: Theodore Roosevelt was inspired to be a conservationist thanks to a camping trip with John Muir.

In 1903, Roosevelt and John Muir—co-founder of the Sierra Club, and also its first president—went on a three-night camping trip that has been described as “the most significant camping trip in conservation history.” In the years that followed, Roosevelt would become known as an ardent conservationist—which is often implied as the legacy of this trip.

The only problem with that story is that, by 1903, Roosevelt had been fighting for conservation for years.

In the late 1880s, alongside George Bird Grinnell (editor-in-chief of Forest and Stream) and a few other sportsmen, Roosevelt co-founded and was the first president of the Boone and Crockett Club. According to historian John F. Reiger, “it, and not the Sierra Club, was the first private organization to deal effectively with conservation issues of national scope.”

As Roosevelt himself explained in March 1893, the club was a group of men “interested in big-game hunting, in big-game and forestry preservation, and generally in manly out-door sports, and in travel and exploration in little known regions.” One clause of its constitution was “To work for the preservation of the large game of this country, and, so far as possible, to further legislation for that purpose, and to assist in enforcing the existing laws.”

As president of the Boone and Crockett Club (a position he’d hold until 1894), Roosevelt worked to pass the Forest Reserve Act, which as President of the United States he’d use to preserve millions of acres of land. Historian Edmund Morris writes, “Thanks to the [Boone and Crockett Club’s] determined lobbying on Capitol Hill, in concert with other environmental groups, the Forest Reserve Act became law in March 1891 ... One wonders if [Roosevelt] ever paused, while signing millions of green acres into perpetuity, to acknowledge his debt to the youthful president of the Boone and Crockett Club.” The Boone and Crockett Club would also be instrumental in the protection of Yellowstone in 1894.

Then where does the story that "the conservation president" began thanks to a camping trip with Muir come from? Something definitely happened. In 1902, there were 26 establishments or modifications of national forest boundaries, according to the USDA [PDF]. In 1903, it was 17 (though this was still more than previous presidents—in 1900, there were three modifications). In 1905, it was 60.

Historian Anthony Godfrey has a theory: It was because of Roosevelt's role as an "accidental president" filling out McKinley's term. Over his partial first term, he attracted like-minded progressives to the Republican Party, so when he won in his own right in 1904, Roosevelt was in a position to change the nation's forestry policy. Whatever the reason for the change in conservation tactics, though, Roosevelt had been drawn to the cause for years before the camping trip with Muir.

6. Myth: Theodore Roosevelt invented the term lunatic fringe.

Roosevelt may lay claim to originating the phrase's modern meaning—he wrote in 1913, "we have to face the fact that there is apt to be a lunatic fringe among the votaries of any forward movement"—but he was seemingly adapting an existing phrase: a literal lunatic fringe, which an 1875 newspaper described as "the fashion which our girls have got up of cropping the hair and letting the ends hang over the forehead. They used to call it 'banging,' but 'lunatic fringe' is the most appropriate."

Indeed, Roosevelt's 1913 quote itself isn’t from a great political treatise; it’s in an article entitled “A Layman’s Views of an Art Exhibition.” In the same article he also said, “In this recent art exhibition the lunatic fringe was fully in evidence, especially in the rooms devoted to the Cubists and the Futurists.” (He went on, “There is no reason why people should not call themselves Cubists, or Octagonists, or Parallelopipedonists, or Knights of the Isosceles Triangle, or Brothers of the Cosine, if they so desire; as expressing anything serious and permanent, one term is as fatuous as another.”)

Roosevelt would eventually use the phrase more explicitly in a political context—after receiving a painting of one of his heroes, he proclaimed in a letter to a friend that “I am always having to fight the silly reactionaries and the inert, fatuous creatures who will not think seriously; and on the other hand to try to exercise some control over the lunatic fringe among the reformers.” But according to Safire’s Political Dictionary, the term was revived and given new life by FDR in the 1940s, who used it explicitly to refer to the “fear propaganda” that has “been used before in this country and others on the lunatic fringe.”

7. Myth: Theodore Roosevelt was the first president not sworn in on a Bible.

The world of Bible usage during presidential inaugurations is a dicey one, as often (especially for early presidents) the evidence is inconclusive [PDF]. John Quincy Adams wrote, “I pronounced from a volume of the laws held up to me by John Marshall, Chief Justice of the United States, the oath faithfully to execute the office of President of the United States,” and LBJ used a Catholic missal after Kennedy was assassinated. Others are more obscure. For instance, Calvin Coolidge is often listed as being sworn in on the family Bible after Harding’s death, but in his autobiography, Coolidge explicitly noted that “The Bible which had belonged to my mother lay on the table at my hand. It was not officially used, as it is not the practice in Vermont or Massachusetts to use a Bible in connection with the administration of an oath.”

The assertion that Roosevelt didn’t use a Bible when he was inaugurated in 1901 after McKinley's assassination comes from Ansley Wilcox, the Buffalo resident who owned the home in which Roosevelt took the presidential oath. According to 1905’s Historic Bibles in America, Wilcox recalled, “no Bible was used, but President Roosevelt was sworn in with uplifted hand. As I recollect it, there was design in this. There were Bibles, and some quite interesting ones, in the room and readily accessible, but no one had thought of it in advance, there being little opportunity to prepare for this ceremony, and when Judge Hazel advanced to administer the oath to the new President he simply asked him to hold up his right hand, as is customary in this State. We seldom use Bibles in this State in administering oaths except in court rooms, and they are not required even in court rooms.”

8. Myth: Theodore Roosevelt was the presidential savior of football.

Theodore Roosevelt was of critical importance to saving football, but so was Woodrow Wilson—though in his capacity as president of Princeton, not of the United States.

In 1905, college football was becoming increasingly controversial due to multiple deaths and injuries, so Roosevelt summoned representatives from Harvard, Yale, and Princeton to "clean up" the sport. A committee met and drew up new rules, and then Roosevelt largely stepped away from football reform.

Just a few years later, in 1909, Harper’s Weekly asked “Dr. Hadley, Dr. Lowell, Dr. Wilson”—a reference to the presidents of Yale, Harvard, and Princeton, respectively—“don’t you think football, as it was played this year, is a little rough? There had been twenty-seven deaths up to November 21st ... You could stop this kind of football if you chose, you three men. The mothers can’t, poor souls.” Wilson responded by writing Lowell and Hadley to have “an informal conference ... to save a very noble game.” The three schools met, and by May 1910 came up with a suite of new rules. According to a 1988 article by John S. Watterson, the rules that emerged were “seven men on the line of scrimmage, no pushing or pulling, no interlocking interference (arms linked or hands on belts and uniforms), and four fifteen-minute quarters,” as well as readopting the forward pass in a limited role.

The rules were soon widely adopted. "In the years that followed the reforms on the gridiron," Watterson explained, "football evolved rapidly into the 'attractive' game that Wilson had advocated and a far less brutal game than the unruly spectacle that Roosevelt had tried to control."

9. Myth: The 1912 election was Theodore Roosevelt's last attempt at the presidency.

After Roosevelt lost the 1912 election, it might seem that the Progressive Party faded into nothingness—but that's not quite true. Roosevelt's running mate in 1912 was California governor Hiram Johnson, who ran for reelection as governor in 1914 as a Progressive and got more votes than the Democratic and Republican candidates combined. In April 1916, John Parker ran as a Progressive candidate for governor of Louisiana—a bid, according to a contemporary article in the Shreveport Times, to boost Roosevelt's power at the upcoming Republican convention. Parker failed, but still got 37 percent of the vote (in 1912, the Republican gubernatorial candidate got only 8.78 percent). Such was his success that at the 1916 Progressive convention, Parker was a natural pick for vice presidential candidate.

But what to do for president?

A mile away, at the same time the Progressives were having their convention, the Republicans were also having theirs—and the tone couldn't have been more different. According to a contemporary account, the two gatherings were "as different ... as champagne from ditch water": The Progressive convention "boiled and sparkled and effervesced," while the Republicans were torn between Charles Hughes, whom "they would give their eye teeth not to take," and Roosevelt, whom "they would not have." The Progressives, however, were firm in their desire for Roosevelt.

To avoid a repeat of 1912, the Republicans and Progressives held a series of meetings to try to come up with a compromise candidate. According to historian Edmund Morris, the Progressives were willing to give away virtually their entire platform in exchange for Roosevelt's nomination, while the Republicans made it clear Roosevelt was not an option. At the end of the first ballot, Hughes was far ahead of Roosevelt but without a majority. Roosevelt quickly realized he wouldn't win, so he suggested Henry Cabot Lodge as a compromise candidate. It came to naught, and the Republicans chose Hughes. At almost the same moment, the Progressives chose Roosevelt to run for president again.

The only problem was that Roosevelt didn't seem to want the nomination. "I am very grateful to the honor you confer upon me by nominating me as president," he wrote to the Progressive convention. "I cannot accept it at this time. I do not know the attitude of the candidate of the Republican party toward the vital questions of the day." Roosevelt did suggest an out: The Progressive National Committee could wait to see where the Republican candidate stood on the issues, and if they were satisfied with what they heard, they could accept Roosevelt's refusal. If they weren't satisfied, they could talk it over with Roosevelt and decide the next step.

A little over two weeks later, the Progressive National Committee voted 32-6, with nine declining to vote, to endorse the Republican candidate. The New York Times declared, "The Progressive Party as a separate political organization died tonight."

Except not really. There was still the issue of VP candidate John Parker. And Parker did campaign—largely against Hughes, and by inference for Wilson, although he explained that he’d “speak against Mr. Hughes’ candidacy. Of course, that would be in favor of Mr. Wilson, but I will speak as a Progressive and not as an affirmative supporter of the Democratic nominee.”

Come the election, the Progressive Party received 33,399 votes, down over 4 million from 1912.

In the days before the election, when it became clear Wilson was going to win, one of Roosevelt’s friends commented, “We can ... look forward to 1920. There will be nothing to it then but Roosevelt. No one can stop it.” To which Roosevelt replied “You are wrong there ... This was my year—1916 was my high twelve. In four years I will be out of it.”

Roosevelt died suddenly in 1919, but the Roosevelts weren't out of the game yet. In 1920, Republican Warren G. Harding crushed Democrat James M. Cox—along with Cox's vice presidential running mate, Franklin Delano Roosevelt.

Fact-Checking 13 Plot Points in All Is True, Kenneth Branagh’s Shakespeare Biopic

Kenneth Branagh as William Shakespeare in All Is True (2019).
Robert Youngson, Courtesy of Sony Pictures Classics

After being the face of Shakespeare film adaptations to a whole generation in films like Henry V (1989), Much Ado About Nothing (1993), Othello (1995), Hamlet (1996), and Love's Labour's Lost (2000), Kenneth Branagh has stepped into the shoes of the Bard himself. The British actor plays William Shakespeare in the new movie All Is True, which the five-time Oscar nominee also directed.

The film, which began rolling out in U.S. theaters on May 10, functions as a sequel of sorts to Shakespeare in Love. Call this one Shakespeare in Retirement. It depicts the Bard in the final few years of his life, which historians believe he mostly spent in Stratford-upon-Avon. Before his death in 1616, Shakespeare reunited with the wife and children he’d spent so much time away from while working in London.

All Is True takes its name from an alternate title used during Shakespeare’s lifetime for his play Henry VIII. The film frequently winks at its title, exploring the role of truth—or lack thereof—in the life of Branagh’s Will.

Spotty historical records leave many details about Shakespeare’s life in the realm of uncertainty, so filmmakers depicting the playwright must make use of broad artistic license to fill in the blanks. Mental Floss spoke with Harvard University professor and Will in the World: How Shakespeare Became Shakespeare author Stephen Greenblatt to fact-check All Is True. It turns out that the film’s depiction of Shakespeare is a mix of truth, presumed truth, and pure imagination.

1. Partially true: Shakespeare retired to Stratford-upon-Avon after the Globe burned down.

All Is True opens with the striking image of Will’s silhouette in front of a massive, crackling fire that destroys his prized playhouse. A title card tells viewers that at a performance of Shakespeare’s Life of Henry VIII (a.k.a. All Is True) at the Globe on June 29, 1613, during Act 1 Scene 4, a prop cannon misfired, starting the blaze. The next title card states, “The Globe Theatre burnt entirely to the ground. William Shakespeare never wrote another play.”

A prop cannon likely did misfire, and the resulting fire did destroy the Globe; while there were fortunately no deaths or serious injuries as a result, the fire delivered a serious financial blow to Shakespeare and other shareholders in the King's Men, the company of actors who performed at the Globe. But "never wrote another play" is a stretch. “The movie suggests he rode out of London, as it were, in the wake of the fire,” Greenblatt says. “But actually, it’s widely thought that he retired to Stratford before but he continued to write for the theater.”

The Tempest, for example, was likely the last play Shakespeare wrote solo, without a collaborator, and some scholars theorize he wrote it at home in Stratford-upon-Avon, not in London. Academics are divided as to which play was the final play Shakespeare ever wrote, but the general consensus is that it was either Henry VIII or The Two Noble Kinsmen, both collaborations with John Fletcher, which were possibly written during return trips to London.

2. True: Shakespeare’s daughter was accused of adultery.

Left to right: Jack Colgrave Hirst as Tom Quiney, Kathryn Wilder as Judith Shakespeare, Kenneth Branagh as William Shakespeare, Judi Dench as Anne Hathaway, Clara Duczmal as Elizabeth Hall, and Lydia Wilson as Susanna Hall in All Is True (2019).
Robert Youngson, Courtesy of Sony Pictures Classics

The film depicts a man named John Lane accusing Shakespeare’s eldest child, Susanna Hall, of adultery. That really happened, and the real-life Susanna Hall sued Lane in 1613 for slanderously saying that she had cheated on her husband with local man Ralph Smith.

As for whether Susanna Hall really did have an extramarital relationship with these men, that’s not known for sure, and the film leaves this somewhat up to viewer interpretation. But her real-life slander case did succeed in getting Lane excommunicated.

3. Likely true: Shakespeare had no schooling beyond age 14.

When a fanboy approaches Will with some eager questions, he says, "They say you left school at 14." The line may be a bit misleading: "leaving school" at 14 did not mean dropping out, as it would today. Boys in Shakespeare's time simply completed grammar school at around age 14, after which they could begin apprenticeships. Shakespeare's schooling would have been intense, though: He would have been in lessons from 6 a.m. to as late as 6 p.m., six days a week, 12 months a year (getting an extra hour to sleep in only during the winter, when school started at 7 a.m. in the dark and cold months).

As Greenblatt wrote in Will in the World, “the instruction was not gentle: rote memorization, relentless drills, endless repetition, daily analysis of texts, elaborate exercises in imitation and rhetorical variation, all backed up by the threat of violence.”

No surviving records confirm that Shakespeare attended the school in Stratford-upon-Avon, but most scholars safely assume that he did. The grammar school there was free and accessible to all boys in the area, the exception being the children of the very poor, since they had to begin working at a young age.

Regarding the fanboy moment in the film, Greenblatt says, “The implication of that moment was precisely to remind us that [Shakespeare] didn’t go to university, as far as we know. I’m sure he didn’t. He would have bragged about it at some point" (as many of his contemporaries did).

4. Likely true: Susanna Hall was literate, while Shakespeare’s wife and younger daughter were not.

While boys received a formal education in Elizabethan and Jacobean England, girls did not. The film depicts Susanna as skillful at reading, unlike Will’s younger daughter, Judith, or his wife, Anne.

This is likely true: Greenblatt says that “the general sense is that Susanna was literate and that Judith and Anne were not,” though this is another area of Shakespeare’s family history that scholars cannot know for certain.

“This is a trickier matter than it looks,” Greenblatt says, “because lots of people in this period, including Shakespeare’s father, clearly knew how to read, but didn’t know how to write. This would be particularly the case for many women but not exclusively women in the period—that writing is a different skill from reading and that quite a few people were able to read.”

5. True: Shortly after his son’s death, Shakespeare wrote The Merry Wives of Windsor.

Judi Dench as Anne Hathaway and Kenneth Branagh as William Shakespeare in All Is True (2019).
Robert Youngson, Courtesy of Sony Pictures Classics

When Will insists that he did mourn Hamnet, his only son, who died in 1596 at age 11, Anne bites back, “You mourn him now. At the time you wrote Merry Wives of Windsor.”

It's a gut-punch from Anne not just because Merry Wives (featuring the ever-entertaining character Falstaff) is a raucous comedy but also because it was, in the most cynical view, a cash grab. Shakespeare likely wrote Merry Wives after the Falstaff-featuring Henry IV Part 1 but before moving on to the grimmer Henry IV Part 2, "to tap an unexpected new market phenomenon"—the "humours comedy," which debuted to immediate popularity in May 1597—as scholars Martin Wiggins and Catherine Richardson wrote in British Drama, 1533-1642: A Catalogue.

There is another way to interpret this: Both parts of Henry IV deal with a troubled father-son relationship, and the conclusion of Part 2 depicts a son taking up the mantle of his deceased father. Perhaps Prince Hal and King Henry hit too close to home for Will (who in this film hopes his son will follow in his poetic footsteps), and a lighthearted comedy is what he needed.

6. Very unlikely: The Earl of Southampton visited Shakespeare in Stratford-upon-Avon.

Henry Wriothesley, 3rd Earl of Southampton, was one of Shakespeare’s patrons, and Shakespeare included a lengthy dedication to Southampton in his poem The Rape of Lucrece. Despite that affiliation, the idea that Southampton (played by Ian McKellen, yet another acclaimed Shakespearean actor) would have visited Shakespeare’s home in Stratford is just “a piece of imagination,” according to Greenblatt. He points out that “it’s difficult to imagine any longer the social abyss” between an earl and someone like Shakespeare but explains, “The difference in social class is so extreme that the idea that the Earl would trot by on his horse to visit Shakespeare at his house is wildly unlikely.”

It is more likely that fellow playwright Ben Jonson would have visited Shakespeare, as he does later in the film.

7. Uncertain: Shakespeare's sonnets were published "illegally and without [his] consent."

This is what Will reminds the Earl of Southampton of in the film. Regarding that term illegally, it's worth first noting that though copyright law as we know it did not exist in 16th-century England, "there definitely were legal controls over publication," Greenblatt says.

“This is a notoriously complicated matter—the publication of the sonnets,” he explains. “It is still very much open to question. It’s not a settled matter as to whether Shakespeare did or did not have anything to do with the publication of those sonnets.”

8. Uncertain: Shakespeare wrote some of his sonnets for and about the Earl of Southampton.

Ian McKellen as Henry Wriothesley in All Is True (2019).
Robert Youngson, Courtesy of Sony Pictures Classics

One juicy debate about Shakespeare that endures is the question of who (if anyone) is the subject of his sonnets. Some speculate that his poems that describe a fair youth refer to the Earl of Southampton.

The film imagines a slightly more complicated—and perhaps more believable—situation than the idea that Southampton and Shakespeare had a fling: Will harbors feelings for Southampton, unrequited by the Earl, who reminds Will, “As a man, it is not your place to love me.”

“There is no way of achieving any certainty,” Greenblatt wrote in Will in the World regarding whether the sonnets were written as love tokens for anyone in particular. “After generations of feverish research, no one has been able to offer more than guesses, careful or wild.”

9. True: 3000 attendees could fit into the Globe for one performance.

In an elaborate, impressive clapback directed at Thomas Lucy, a local politician who repeatedly insults Will, the celebrated playwright cites his many responsibilities in London, then says he somehow “found time to write down the pretty thoughts you mentioned.”

It’s true that Shakespeare was both a businessman and poet. His status as a shareholder in the Lord Chamberlain’s Men (later the King’s Men) was actually unprecedented: “No other English literary playwright had ever held such a position,” Oxford professor Bart van Es wrote in Shakespeare in Company, adding that becoming part owner of the Globe, “the most impressive venue in London … placed him in a category entirely of his own.”

Among the accomplishments Will lists for Lucy is filling the Globe with “3000 paying customers per afternoon.”

“That is the upper end of the size of those public theaters, as far as we now know from archaeological evidence,” Greenblatt says. “Three thousand is at the high end, but yes. Whether they actually got 3000 people every afternoon is another question.”

Meanwhile, the reconstruction of the Globe that opened in London in 1997 has a capacity of about half that. Its dimensions are the same as the Globe of Shakespeare's day, but modern fire codes don't allow playgoers to be packed in quite so tightly.

10. True: Shakespeare wrote Thomas Quiney out of his will.

The film depicts the retired playwright adding his son-in-law-to-be, Thomas Quiney, to his will in anticipation of Quiney's marriage to Will's youngest daughter, Judith. A couple of months later, Shakespeare amends his will again after it’s revealed that Quiney fathered a child by another woman before marrying Judith.

This may have really happened. Shakespeare summoned his lawyer in January 1616 to write Quiney into the will. Then in March, a month after his wedding, Quiney confessed in the vicar’s court to being responsible for the pregnancy of unmarried Stratford woman Margaret Wheeler, who had just died in childbirth (along with the child). Shakespeare then met again with his lawyer to strike out Quiney’s name and insert Judith’s name instead. However, some historians dispute that Shakespeare made this change as a result of the scandal; they instead suggest that it was due to practical concerns about Judith’s financial future.

All Is True reverses scholars' common assumption that Shakespeare had a better relationship with Susanna's husband, physician John Hall, than with Judith's. It depicts Will's removal of Quiney from his will as a reluctant necessity. "What the movie does is suggest [that John] Hall is an obnoxious, Puritan prig and that Thomas Quiney is actually a very nice fellow," Greenblatt says.

One aspect of Shakespeare’s relationship with Hall that the film leaves out entirely is scholars’ assumption that Hall would have tended to the playwright during any sickness that led to his death. The cause of Shakespeare’s death is unknown, however, and Hall’s surviving casebooks date back only to 1617, the year after Shakespeare’s death.

11. Unlikely: Shakespeare’s family recited his verse at his funeral.

At what appears to be Will’s funeral, Anne, Judith, and Susanna (all with varying levels of literacy) read aloud the words of a dirge sung for the supposedly dead Imogen in Cymbeline. “Fear no more the heat o’ th’ sun,” they quote, “Thou thy worldly task hast done … All lovers young, all lovers must / Consign to thee and come to dust.”

The words are evocative of Scripture. (“Be not afraid” / “Have no fear” is said to be the most repeated phrase in both the Old Testament and the New Testament—and of course there’s the Genesis passage often read at funerals: “For dust thou art, and unto dust shalt thou return.”) Greenblatt says it is “very unlikely” that verse not from the Bible would have been recited at a funeral at the time of Shakespeare’s death, adding, “but I found that moment quite touching.”

SPOILER WARNING: The remainder of this article includes spoilers about some major twists in All Is True.

12. Uncertain: Shakespeare’s offspring wrote poetry.

Kathryn Wilder as Judith Shakespeare and Kenneth Branagh as William Shakespeare in All Is True (2019).
Robert Youngson, Courtesy of Sony Pictures Classics

In All Is True, when Will voices grief for his son who had died 17 years prior, he often references Hamnet’s apparent talent as a poet. “He showed such promise, Anne,” Will cries.

Branagh’s film imagines that Hamnet wrote poems full of wit and mischief. Then Judith drops the revelation that she actually crafted the poems, dictating them to her twin brother, who knew how to write. All Is True thus displaces the controversial authorship question from Shakespeare to his children.

“There’s no historical trace of any of this,” Greenblatt says. “That is just an invention.”

13. Uncertain: Hamnet Shakespeare died of the plague.

The other revelation that stuns Will in All Is True is about Hamnet’s death. Will looks at the record noting young Hamnet’s death and becomes suspicious about whether his only son really died of the plague. He confronts Anne and Judith, pointing out the small number of deaths in Stratford in the summer of 1596, saying that the plague strikes with “a scythe, not a dagger.” At this point, Judith confesses that her twin took his own life after she threatened to tell their father about the true author of the poems. She then tearfully recalls Hamnet, who did not know how to swim, stepping into a pond and drowning.

Though the historical record doesn't supply a cause of death for Hamnet, many historians assume he died of the bubonic plague. For the film's revelation about Hamnet's suicide, which Greenblatt deems another imaginative invention, Branagh and screenwriter Ben Elton seem to have taken inspiration from the real parish register recording burials at Holy Trinity Church in Stratford, which lists no more than two dozen burials between June and September 1596. By contrast, a plague epidemic that hit Shakespeare's hometown shortly after the poet's birth in 1564 lasted about six months and killed more than 200 people—about a sixth of Stratford's population.

As Greenblatt points out, the storyline about Judith’s poems and Hamnet’s death serves as a commentary on Virginia Woolf’s compelling essay, “Shakespeare’s Sister,” which appears in A Room of One’s Own, published in 1929. The essay imagines a tragic story for Shakespeare’s fictional sister who is as gifted as her successful brother but is not permitted to go to school and whose parents scold her each time she picks up a book. “She was as adventurous, as imaginative, as agog to see the world as he was,” Woolf wrote.

Greenblatt observes that the central theme of All Is True seems to be “the tragic cost of not having full access to literacy if you were a woman.” He notes, though, that in Elizabethan and Jacobean England, “there were actually quite a few [literate] women, and the work of the last generation, particularly feminist scholars, have recovered a much larger field than Virginia Woolf could have understood or than the movie suggests, of women who were reading and writing in the period.”

Kenneth Branagh’s All Is True is in theaters now.
