Why is the Drinking Age 21?

iStock

In short, we ended up with a national minimum age of 21 because of the National Minimum Drinking Age Act of 1984. This law essentially told states that they had to enact a minimum drinking age of 21 or lose up to 10 percent of their federal highway funding. Since that's some serious coin, the states fell into line fairly quickly. Interestingly, this law doesn't prohibit drinking per se; it merely cajoles states to outlaw purchase and public possession by people under 21. Exceptions include possession (and presumably drinking) for religious practices, for medical uses, in the course of legal employment, and while in the company of a parent, spouse, or guardian who is over 21.

That answers the legal question of why the drinking age is 21, but what was the underlying logic of the original policy? Did lawmakers just pick 21 out of a hat because they wanted college seniors to learn the nuances of bar culture before graduation? Not quite. The concept that a person becomes a full adult at age 21 dates back centuries in English common law; 21 was the age at which a person could, among other things, vote and become a knight. Since a person was an official adult at age 21, it seemed to make sense that they could drink then, too.

WHO WAS RESPONSIBLE FOR LOWERING THE DRINKING AGE TO 18 FOR PART OF THE 20TH CENTURY, THOUGH?

Believe it or not, Franklin Roosevelt helped prompt the change in a rather circuitous fashion. FDR approved lowering the minimum age for the military draft from 21 to 18 during World War II. When the Vietnam-era draft rolled around, though, people were understandably a bit peeved that 18-year-old men were mature enough to fight, but not old enough to vote. Thus, in 1971 the states ratified the 26th Amendment, which lowered the voting age to 18. Legislators started applying the same logic to drinking. The drinking age, which the 21st Amendment made the responsibility of individual states, started dropping around the country.

Critics of the change decried rises in alcohol-related traffic fatalities among 18- to 20-year-old drivers in areas where the drinking age had been lowered. Indeed, one result of leaving states in charge of their own age was the creation of "blood borders" between states that allowed 18-year-olds to drink and those that didn't. Teenagers from the more restrictive state would drive into the one where they could buy booze, drink, and then drive home, which created a perfect storm for traffic fatalities. Even if teens weren't any more predisposed than older adults to drive after they'd been drinking, all of this state-hopping meant that those who did drive drunk had to drive greater distances to get home than their older brethren, who could just slip down the block for a beer or six. More miles logged in a car meant more opportunities for a drunken accident.

WHO LED THE BACK-TO-21 MOVEMENT?

Organizations like Mothers Against Drunk Driving began agitating for a uniform national drinking age of 21 to help eliminate these blood borders and keep alcohol out of the hands of supposedly less-mature 18-year-olds. As a result, President Reagan signed the aforementioned National Minimum Drinking Age Act of 1984. MADD's "Why 21?" website touts that "More than 25,000 lives have been saved in the U.S. thanks to the 21 Minimum Legal Drinking Age." Traffic reports show a 61 percent decrease in alcohol-related fatalities among drivers under 21 between 1982 and 1998. The raw numbers back this up: since 1982, drunk driving fatalities overall have decreased 51 percent, and among drivers under 21, drunk driving-related deaths have decreased by 80 percent.

Teasing out the underlying cause of this reduction in total fatalities is no mean feat, though. Non-alcohol traffic fatalities have also declined relative to the number of miles driven over the same time period, which could be attributed to any number of causes, including increased seat belt usage, the widespread use of airbags, and other safety improvements to cars and roads. Moreover, drinking and driving for the whole population might be down as the result of increased education on its consequences, harsher penalties, improved enforcement, or increased stigmatization of drunk driving.

College presidents who supported the Amethyst Initiative—a movement launched in 2008 to reconsider the national drinking age of 21—admit that drunk driving is a serious problem, but they point out that it's not the only potential pitfall for young drinkers. They contend that by lowering the drinking age, colleges would be able to bring booze out into the open and educate students on responsible consumption. Such education might help curb alcohol poisoning, drunken injuries, drinking-fueled violence, and alcoholism on campuses.

Interesting bit of trivia: the group takes its name from the character Amethyst in Greek mythology. She ran afoul of a drunken Dionysus, who had her turned into white stone. When the god discovered what he'd done, he poured wine on the stone, turning it into the purple rock we know as amethyst. Ancient Greeks wore the mineral as a form of protection from drunkenness.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.

What Does CPR Stand For?

undefined undefined/iStock via Getty Images

The life-saving technique known as CPR stands for cardiopulmonary resuscitation. It's a method that allows oxygenated blood to temporarily circulate throughout the body of a person whose heart has stopped. When the heart ceases beating during cardiac arrest, blood stops circulating and the body's organs, including the brain, stop receiving oxygen. Without oxygen, nerve cells start to die within minutes; it can take just four to six minutes for an oxygen-deprived person to sustain permanent brain damage or die.

The cardio part of the phrase refers to the heart, the muscular organ that pumps blood through the body's circulatory system. Pulmonary refers to the lungs. People take approximately 15 to 20 breaths per minute, and with each breath, the lungs fill with oxygen. Resuscitation means bringing someone back to consciousness, or from the brink of death.

We have two physicians, Peter Safar and James Elam, to thank for developing mouth-to-mouth resuscitation in the mid-1950s. In 1957, the American military adopted their CPR method for reviving soldiers. In 1960, the American Heart Association integrated chest compressions, which keep the blood circulating.

Doctors, nurses, dentists, first responders, lifeguards, and some teachers are required to be certified in CPR. But because approximately 85 percent of cardiac arrests occur at home, it’s smart for the average person to know how to perform it, too. In school, you were probably taught CPR by the traditional method of giving 100 to 120 chest compressions per minute (play the Bee Gees’ "Stayin’ Alive" in your head to keep the beat) and mouth-to-mouth resuscitation. Today, the American Heart Association recommends that average people learn hands-only CPR, which simply involves chest compressions. The organization has found that people can be reluctant to administer mouth-to-mouth CPR in an emergency because they're afraid of doing it wrong or injuring the patient. With hands-only CPR, bystanders feel less anxiety and more willingness to jump in. The AHA also notes that hands-only CPR can be just as effective in saving a life. (And any CPR is better than none at all.)

But how many people actually know CPR?

In 2018, a Cleveland Clinic survey found that 54 percent of Americans said they knew CPR, but only one in six people knew that bystander CPR requires only chest compressions. Only 11 percent of people knew the correct pace for compressions. Again, singing "Stayin' Alive" to yourself is one way to remember the pace—though being a fan of The Office can apparently help, too (as one lucky life-saver recently discovered).


Are Left-Handed People Really More Creative?

Kuzma/iStock via Getty Images

The left-handed brand has come a long way in the last few decades. The majority of people no longer assume that southpaws are tools of Satan, alight with hellfire. Today’s lefties are surrounded by a far more benevolent glow. We associate left-handedness with intelligence, out-of-the-box thinking, and artistic talent. But are these flattering generalizations backed up by science? Does being left-handed really make you more creative? 

The answer to that is a definitive … maybe.

Scientists have been chipping away at the peculiarities of left-handedness, which occurs in about 10 percent of the population, for a long time. They’ve looked into the purported links between left-handedness and things like mental illness, faulty immune systems, and criminal behavior. They’ve studied whether lefties are better at problem-solving, and if they’re more likely to die young. From all these studies on left-handedness, we can conclude one thing, and one thing alone: science is complicated. 

A handful of studies have found a link between left-handedness and creativity, conferred (some think) by the fact that left-handed folks constantly have to adjust to a right-handed world. Other studies found no link at all. 

Some researchers conclude that lefties are no smarter than righties, while others say that left-handedness comes with a clear intellectual advantage. Is there really a left-handed personality? Are lefties more prone to schizophrenia and learning disabilities? That depends on who you ask. 

But "Are lefties different?" might not even be the right question. Over the last few years, a number of studies have concluded that it’s not which hand is dominant that matters—it’s the degree of dominance. According to researchers, very few people are truly entirely left- or right-handed; it’s more of a spectrum. We use our left hands for some things and our right hands for other tasks. 

These experiments have found that people toward the middle of the spectrum are more flexible thinkers. They seem to be more empathetic and better able to view things from other people’s perspectives. When considering the risks and benefits of any given decision, inconsistent-handed people (as researchers call them) are more likely to focus on the risks, whereas people at the outer edges of the handedness spectrum pay more attention to potential benefits. They may even sleep differently. It seems we’ve been aiming our stereotypes a little too far to the left.

But who knows? This is ever-changing, constantly evolving science. If you’re a lefty who enjoys feeling superior, we’re not going to tell you to tone it down. For all we know, you could be right.

