Listen for Whales and Dolphins in This Live Stream from the Deep Ocean

If you want an alternative to the video game soundtrack or ambient noise generator you usually listen to at work, the Monterey Bay Aquarium Research Institute (MBARI) has got you covered. As Mashable reports, scientists there have set up a live stream of the deep ocean sounds recorded 18 miles off the coast of Monterey Bay, California.

The eerie noises of the ocean's depths are captured using an underwater microphone called a hydrophone, which scientists planted 3000 feet beneath the surface in 2015. For more than two years, researchers have used the instrument to eavesdrop on the activity taking place in the surrounding waters, and now they're giving the public the opportunity to do the same by streaming the recordings live on YouTube.

The audio (which is delayed 20 minutes for processing) mostly consists of white noise, but listen long enough and you'll eventually hear signs of life, both human and animal. The microphone picks up the moans of humpback whales, the squeaks of dolphins, and even sounds from the surface like wind, waves, and boats passing overhead. The high-pitched noises likely traveled no more than a few miles to reach the hydrophone, while lower sounds may have originated a much greater distance from where they were recorded.

The deep-sea recordings can help researchers better understand the secret lives of marine animals, but they can also provide a soothing soundtrack for anyone needing background music. If you aren't patient enough to wait for the live animal calls, you can visit MBARI's listening room to pick out individual audio tracks.

[h/t Mashable]

Why Our Brains Love Plot Twists

From the father-son reveal in The Empire Strikes Back to the shocking realization at the end of The Sixth Sense, everyone loves a good plot twist. It's not the element of surprise that makes them so enjoyable, though. It's largely the set-up, according to cognitive scientist Vera Tobin.

Tobin, a researcher at Case Western Reserve University, writes for The Conversation that one of the most enjoyable moments of a film or novel comes after the big reveal, when we get to go back and look at the clues we may have missed. "The most satisfying surprises get their power from giving us a fresh, better way of making sense of the material that came before," Tobin writes. "This is another opportunity for stories to turn the curse of knowledge to their advantage."

The curse of knowledge, Tobin explains, refers to a psychological effect in which knowledge affects our perception and "trips us up in a lot of ways." For instance, a puzzle always seems easier than it really is after we've learned how to solve it, and once we know which team won a baseball game, we tend to overestimate how likely that particular outcome was.

Good writers know this intuitively and use it to their advantage to craft narratives that will make audiences want to review key points of the story. The end of The Sixth Sense, for example, replays earlier scenes of the movie to clue viewers in to the fact that Bruce Willis's character has been dead the whole time—a fact which seems all too obvious in hindsight, thanks to the curse of knowledge.

This is also why writers often incorporate red herrings—or false clues—into their works. Seen in this light, movie spoilers don't seem so terrible after all. According to one study, even when the plot twist is known in advance, viewers still experience suspense. Indeed, several studies have shown that spoilers can even enhance enjoyment because they improve "fluency," or a viewer's ability to process and understand the story.

Still, spoilers are pretty universally hated—the Russo brothers even distributed fake drafts of Avengers: Infinity War to prevent key plot points from being leaked—so it's probably best not to go shouting the end of this summer's big blockbuster before your friends have seen it.

[h/t The Conversation]

The 98.6℉ Myth: Why Everything You Think You Know About Body Temperature Is a Lie

When you were a kid, you probably knew that to score a magical sick day home from school, you needed to have a fever. When the thermometer came out of your mouth, it had to read higher than 98.6℉—the long-accepted "normal" human body temperature. (If you wanted to really seal the deal, you may have hoped to hit 100℉.) Since then, you may have used a temperature above 98.6℉ as a metric to work from home (or call out sick entirely).

But here's the thing: The average body temperature isn't actually 98.6℉—a fact that we've known for more than 25 years. The myth originated in the 19th century with a single doctor, and despite evidence to the contrary, it's persisted ever since.

THE GIANT—AND FAULTY—ARMPIT THERMOMETER

In 1851, Carl Wunderlich, the director of the hospital at Leipzig University, began going from room to room with a comically large thermometer in tow. He wanted to understand how body temperature is affected by different diseases, so in each room, he would hold the foot-long device in patients' armpits for a full 20 minutes, waiting for a temperature to register. Once it did, he'd note the temperature on the patient's chart (Wunderlich is thought to be the first physician to do so). He and his staff did this for years, repeatedly taking the temperatures of some 25,000 patients and logging them on their charts, until he had millions of readings. In 1868, he finally published this data in Das Verhalten der Eigenwärme in Krankheiten (On the Temperature in Diseases: A Manual of Medical Thermometry). He concluded that the average human body temperature was 98.6℉, underscoring the idea that fever is a symptom of illness, not a cause.

No one questioned Wunderlich's methods, or his average, for about 140 years. Then, in the early 1990s, internist Philip Mackowiak—a professor of medicine at the University of Maryland, a medical historian, and, apparently, a clinical thermometer junkie—saw one of the physician's instruments at the Mütter Museum in Philadelphia. He told the Freakonomics podcast that he'd always had doubts about the 98.6℉ standard. "I am by nature a skeptic," he said. "And it occurred to me very early in my career that this idea that 98.6 was normal, and then if you didn't have a temperature of 98.6, you were somehow abnormal, just didn't sit right."

Getting his hands on Wunderlich's thermometer—which the museum let him borrow—only deepened his doubts. The huge thermometer was unwieldy and non-registering, meaning, Mackowiak explained, "that it has to be read while it's in place." Not only that, but Wunderlich had used the device to measure temperatures in the armpit, which is less reliable than temperatures taken in the mouth or rectum. The instrument itself also wasn't terribly precise: It measured up to 2 degrees Celsius (about 3.6℉) higher than both ancient and modern instruments.

In 1992, Mackowiak decided to test Wunderlich's average. Using normal-sized oral thermometers and a group of volunteers, he determined that the average human body temperature actually hovers around 98.2℉. Mackowiak found that body temperature tends to vary over the course of the day, with its lowest point around 6 a.m. and its highest in the early evening. Body temperature can also fluctuate monthly (with the menstrual cycle) and over a lifetime (declining decade by decade with age), and may even vary with sex and race. He concluded that normal body temperature is so unique to each person that it's almost like a fingerprint and, given that wide variation, not actually a very reliable indicator of illness.

As a result of his study, Mackowiak proposed setting the threshold for fever at 98.9℉ for temperatures taken in the morning (and 99.9℉ at other times). While that's only a fraction of a degree above the old 98.6℉ "normal," this fever threshold is actually lower than the CDC's, which is a temperature of 100.4℉ or higher.

There are potential real-life consequences to this gap, for everyone from students (who'd have to attend school with what Wunderlich's 98.6℉ standard would consider a low-grade fever) to employers and daycares (who use temperature to set attendance policies). What's more, anyone who is actually sick but ignores a low-grade fever—one that meets Mackowiak's threshold but still falls under the CDC's—could pose a risk to people with compromised immune systems trying to avoid unnecessary exposure to illness in public places.

THE BALANCING POINT

There's a reason the average trends near 98℉ instead of 92℉ or 106℉. As endotherms, mammals expend a great deal of energy maintaining body temperature when compared with cold-blooded creatures. To find and conserve a just-right body temperature, central nervous system sensors gather data (too warm? too cold? just right, Goldilocks?) and send that information to the pebble-sized hypothalamus near the base of the brain. There, the data is converted into action: releasing sweat and widening the blood vessels if too warm; raising metabolism, constricting the blood vessels, and inducing shivering if too cold.

According to a study by Aviv Bergman and Arturo Casadevall in the journal mBio, the precise balancing point for ideal body temperature is the sweet spot where the metabolic cost for all this thermoregulation balances with the evolutionary advantage of warding off fungal disease. (While warm-blooded animals are prone to bacterial or viral infections, they rarely experience fungal infections because most fungi can't withstand temperatures above 86℉. Cold-blooded animals, on the other hand, are prone to all three.) For Bergman and Casadevall, this benefit even explains what tipped Darwin's scales in favor of mammals, allowing them to edge out other vertebrates for dominance after the Cretaceous-Tertiary mass extinction wiped out the dinosaurs.

Of course, every rule has its exceptions, and the one place where human body temperature demonstrates sustained elevation is outer space. Astronauts on prolonged missions clock significantly higher average body temperatures than they do when terrestrial—even up to 104℉. This so-called "space fever" is probably a product of some combination of radiation exposure, psychological stress, and immune response to weightlessness. Researchers believe this phenomenon could yield crucial information about thermoregulation—and may even offer insight into how humans might adapt to climate change.

WHY THE MYTH PERSISTS

It's been 26 years since Mackowiak's study, yet the newer data has not taken hold among medical professionals or the public. What gives?

Mackowiak tells Mental Floss that he finds it a bit mystifying that the myth persists, especially since many people, when pressed, know that the so-called "average" temperature varies. Part of the problem may be psychological: We cling to beliefs despite evidence to the contrary—a phenomenon called belief perseverance. It's a significant force upholding a surprising number of medical myths. The idea that humans should drink eight glasses of water a day? Not science. Sugar causes hyperactive behavior? Nope. Reading in dim light harms eyesight? Not really.

Unlearning persistent myths—especially ones loaded with the weight of medical authority—is difficult. "Deep down, under it all," Mackowiak says, "people want simple answers for things."
