Where Did the Myth That Radiation Glows Green Come From?


by C Stuart Hardwick

Probably from radium, which was widely used in self-luminous paint starting in 1908. When mixed with phosphorescent copper-doped zinc sulfide, radium emits a characteristic green glow.


The use of radioluminescent paint was mostly phased out by the mid-1960s. Today, in applications where it is warranted (like spacecraft instrument dials and certain types of sensors, for example), the radiation source is tritium (radioactive hydrogen) or an isotope of promethium, either of which has a vastly shorter half life than radium.

In most consumer products, though, radioluminescence has been replaced by photoluminescence, phosphors that emit light of one frequency after absorbing photons of a different frequency. Glow-in-the-dark items that recharge to full brightness after brief exposure to sunlight or a fluorescent light only to dim again over a couple of hours are photoluminescent, and contain no radioactive material.

An aside on aging radium: By now, most radium paint manufactured early in the 20th century has lost most of its glow, but it’s still radioactive. The isotope of radium used has a half life of about 1600 years, but the chemical phosphor that makes it glow has broken down from the constant radiation—so if you have luminescent antiques that barely glow, you might want to have them tested with a Geiger counter and take appropriate precautions. The radiation emitted is largely harmless as long as you don’t ingest or inhale the radium—in which case it becomes a serious cancer risk. So as the tell-tale glow continues to fade, how will you prevent your ancient watch dial or whatever from deteriorating and contaminating your great-great-grandchildren’s home, or ending up in a landfill and in the local water supply?
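The arithmetic behind that aside is easy to check. A quick sketch (assuming Ra-226’s commonly cited half-life of roughly 1600 years and a dial painted around 1910) shows why the faded glow is the phosphor’s fault, not the radium’s:

```python
# How much of a dial's original radium is left after a century?
HALF_LIFE_YEARS = 1600      # approximate half-life of Ra-226 (assumption)
age_years = 115             # a dial painted around 1910

# Radioactive decay: remaining fraction = (1/2)^(elapsed / half-life)
remaining = 0.5 ** (age_years / HALF_LIFE_YEARS)
print(f"{remaining:.1%} of the radium remains")   # roughly 95%
```

In other words, after more than a century about 95 percent of the radium is still there and still emitting, even though the paint barely glows.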

Even without the phosphor, pure radium emits enough alpha particles to excite nitrogen in the air, causing it to glow. The color isn’t green, though, but a pale blue similar to that of an electric arc.


This glow (though not the color) entered the public consciousness through early illustrations of its appearance in Marie Curie’s lab, and became confused with the green glow of radium paints.

The myth is likely kept alive by the phenomenon of Cherenkov glow, the eerie blue light that arises when a charged particle (such as an electron or proton) emitted by a submerged source travels through the surrounding water faster than light itself can travel through that water.
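That threshold can be worked out from water’s refractive index: light in water moves at c/n, with n about 1.33, so a particle must exceed roughly 0.75c. A short sketch (using textbook constants) of the minimum kinetic energy for an electron:

```python
import math

N_WATER = 1.33              # refractive index of water (approximate)
ELECTRON_REST_MEV = 0.511   # electron rest energy in MeV

# The particle's speed (as a fraction of c) must exceed light's
# phase velocity in water, which is 1/n of c.
beta_min = 1 / N_WATER

# Relativistic kinetic energy at that speed: KE = (gamma - 1) * m*c^2
gamma_min = 1 / math.sqrt(1 - beta_min ** 2)
ke_min_mev = (gamma_min - 1) * ELECTRON_REST_MEV

print(f"Electron must exceed {beta_min:.2f}c, "
      f"i.e. kinetic energy above ~{ke_min_mev:.2f} MeV")
```

That works out to a few tenths of an MeV, which is why electrons from beta decay in a reactor pool readily produce the glow.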

So in reality, some radionuclides do glow (notably radium and actinium), but not as brightly or in the color people think. Plutonium doesn’t, no matter what Homer Simpson thinks, unless it’s Pu-238—which has such a short half life, it heats itself red hot.


This post originally appeared on Quora.

Why Do Hangovers Get Worse As You Get Older?


“I just can’t drink like I used to” is a common refrain among people pushing 30 and beyond. This is roughly the age when it starts getting harder to bounce back from a night of partying, and unfortunately, it keeps getting harder from there on out.

Even if you were the keg flip king or queen in college, consuming the same amount of beer at 29 that you consumed at 21 will likely have you guzzling Gatorade in bed the next day. It’s true that hangovers tend to worsen with age, and it’s not just because you have a lower alcohol tolerance from going out less. Age affects your body in various ways, and the way you process alcohol is one of them.

Because your body interprets alcohol as poison, your liver steps in to convert it into different chemicals that are easier to break down and eliminate. As you get older, though, your liver produces less of the enzymes and antioxidants that help metabolize alcohol, according to a study from South Korea. One of these enzymes—called alcohol dehydrogenase (ADH)—has been called the “primary defense” against alcohol. It kicks off the multi-step process of alcohol metabolism by turning the beer or booze—or whatever you imbibed—into a chemical compound called acetaldehyde. Ironically, this substance is even more toxic than your tipple of choice, and a build-up of acetaldehyde can cause nausea, palpitations, and facial flushing. It usually isn’t left in this state for long, though.

Another enzyme called aldehyde dehydrogenase (ALDH) helps convert the acetaldehyde into a new substance called acetate, which is a little like vinegar. Lastly, that’s broken down into carbon dioxide and water and expelled from your body. You’ve probably heard the one-drink-per-hour recommendation, which is roughly how long it takes for your liver to complete this whole process.
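The one-drink-per-hour rule reflects the fact that the liver eliminates alcohol at a roughly constant (zero-order) rate, often cited as about 0.015 g/dL of blood alcohol per hour. A rough back-of-the-envelope sketch under that assumption:

```python
# Rough sobering-up estimate, assuming the commonly cited
# zero-order elimination rate of ~0.015 g/dL of BAC per hour.
ELIMINATION_RATE = 0.015   # g/dL per hour (approximate average)

def hours_to_sober(bac: float) -> float:
    """Estimate hours until blood alcohol concentration reaches zero."""
    return bac / ELIMINATION_RATE

print(f"{hours_to_sober(0.08):.1f} hours")  # 0.08 g/dL -> about 5.3 hours
```

The rate varies from person to person, and as the article notes, it tends to slow with age as those enzymes diminish, which is exactly why the alcohol (and its byproducts) linger longer.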

So what does this mean for occasional drinkers whose mid-20s have come and gone? To summarize: As your liver enzymes diminish with age, your body becomes less efficient at metabolizing alcohol. The alcohol lingers longer in your body, leading to prolonged hangover symptoms like headaches and nausea.

This phenomenon can also partly be explained by the fact that our bodies tend to lose muscle and water over time. People with more body fat don’t break down alcohol as well, and less water in your body means that the booze stays concentrated in your system longer, The Cut reports. This is one of the reasons why women, who tend to have a higher body fat percentage than men, often suffer worse hangovers than their male counterparts. (Additionally, women have fewer ADH enzymes.)

More depressingly, as you get older, your immune system deteriorates through a process called immunosenescence. This means that recovering from anything—hangovers included—is more challenging with age. "When we get older, our whole recovery process for everything we do is harder, longer, and slower," gastroenterologist Mark Welton told Men’s Health.

This may seem like a buzzkill, but we're not telling you to put down the pint. However, if you're going to drink, just be aware of your body’s limitations. Shots of cotton candy-flavored vodka were a bad idea in college, and they’re an especially bad idea now. Trust us.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.

What's the Difference Between a Break and a Fracture?


A lot of people tend to think that breaking a bone is worse than fracturing it—or perhaps they believe it's the other way around. Others may think of a fracture as a specific kind of break called a hairline crack. However, as Arkansas-based orthopedic surgeon Dr. C. Noel Henley points out in the YouTube video below, these are all common misconceptions. A fracture and a break are actually one and the same.

“There’s no difference between these two things,” he says. “A fracture means the cracking or breaking of a hard object. One is not worse than the other when it comes to breaking bones.”

Some of the confusion might stem from the fact that the word fracture is often used to describe specific kinds of breaks, as in compound fractures, oblique fractures, and comminuted fractures. In all cases, though, both break and fracture refer to any instance where “the normal structure of the bone has been disrupted and damaged,” Henley notes.

This isn’t the only common misconception when it comes to cracked bones. The idea that a “clean break” is a good thing when compared to the alternative is a myth. Using the scaphoid bone in the wrist as an example, Dr. Henley says a clean break in the “wrong” bone can still be very, very bad. In some cases, surgery might be necessary.

According to the BBC, other bone myths include the belief that you’ll be unable to move a certain body part if your bone is broken, or that you’ll instantly know if you have a fracture because it will hurt. This isn’t always the case, and some people remain mobile—and oblivious to their injury—for some time after it occurs. Even if you think you have a minor sprain or something seemingly small like a broken toe, it’s still a good idea to see a doctor. It could be more serious than you realize.
