15 of History's Greatest Mad Scientists

When it comes to scientists, brilliance and eccentricity seem to go hand in hand. Some of the most innovative minds in human history have also been the strangest. From eccentric geniuses to the downright insane, here are some of history’s greatest mad scientists.

1. JOHANN CONRAD DIPPEL 

Born in Castle Frankenstein in 1673, Johann Conrad Dippel was a theologian, alchemist, and scientist who developed a popular dye called Prussian Blue that is still used to this day. But Dippel is better remembered for his more controversial experiments. He distilled animal bones and hides into a concoction he called “Dippel’s Oil,” which he claimed was an elixir that could extend the lifespan of anyone who consumed it. He also loved dissecting animals, and some believe he even stole human bodies for his experiments at Castle Frankenstein. Dippel is often cited as an inspiration for Mary Shelley’s Frankenstein, though the claim remains controversial.

2. GIOVANNI ALDINI 

Another possible Frankenstein inspiration was mad scientist Giovanni Aldini, who, among other strange experiments, was obsessed with the effects of electrical stimulation on the body. Aldini, who was something of a celebrity in the early 19th century, travelled Europe demonstrating the powers of electricity, often on animal and human corpses. He was also one of the first scientists to treat mental patients with electric shocks. Though his methods were unconventional, Aldini was well respected in his time, and the emperor of Austria even made him a Knight of the Iron Crown.

3. WILLIAM BUCKLAND 

Nineteenth-century theologian and paleontologist William Buckland was the first person to write a full description of a fossilized dinosaur, which he named Megalosaurus. But though his work was admired, the early paleontologist had some pretty strange appetites: Buckland was obsessed with trying to eat his way through the entire animal kingdom. He claimed to have consumed mice, porpoises, panthers, bluebottle flies, and even the preserved heart of King Louis XIV.

4. PYTHAGORAS

Anyone who took high school math knows about the Pythagorean theorem. But they might not know that, in addition to being a brilliant mathematician, Pythagoras really hated beans. If that sounds more like a personal preference than a mark of madness, consider that he not only avoided eating legumes himself but forbade his followers from eating them as well. It’s unclear where the aversion came from, though some believe Pythagoras saw beans as sacred. According to one legend, he died because, while being pursued by a group of ruffians, he refused to seek refuge in a nearby bean field.

5. BENJAMIN BANNEKER 

Eighteenth-century engineer, astronomer, and professional tinkerer Benjamin Banneker is believed to have made the first clock built entirely in America. Banneker helped survey the boundaries of the area that would become Washington, D.C., charted the stars and planets every night, predicted eclipses, and was one of America’s earliest African American scientists. How did he make time to do all that? By working all night and sleeping only in the early hours of the morning, of course. Rather than retiring to a lab or office, the quirky scientist was said to spend each night wrapped in a cloak beneath a pear tree, meditating on the revolutions of the heavenly bodies and dozing in the same spot where he worked.

6. ISAAC NEWTON 

One of the most influential scientists in history, Isaac Newton was also one of the quirkiest. The physicist and mathematician was known to experiment on himself while studying optics, even going so far as to poke himself in the eye with a needle. He was also obsessed with the apocalypse and believed the world would end sometime after the year 2060. 

7. LADY MARGARET CAVENDISH

One of England’s first female natural philosophers, Margaret Cavendish was a controversial figure in the 17th century. An outspoken intellectual and prolific writer, she ruffled a few feathers among those who believed women had no place in the scientific community, and as a result she was often called “Mad Madge.” Cavendish wasn’t truly insane, but she was more than a little socially inept. On one occasion, while “pondering upon the natures of Mankind,” she decided to write down all of a friend’s positive qualities on one piece of paper and all of the woman’s negative qualities on another. She meant to send her friend the flattering list, assuming it would be appreciated, but she accidentally mailed the wrong one and received an outraged response. Cavendish also acted as her own physician, and likely died as a result of her refusal to seek outside medical care.

8. SHEN KUO 

One of the most renowned scholars of the Northern Song Dynasty, Shen Kuo was a master of astronomy, physics, math, and geology, arguing, among other things, that tides are caused by the moon’s gravitational pull and that the Earth and the Sun are spherical, not flat. But he’s also credited with the first written description of a UFO sighting: in his writings, he documented the repeated appearance of a floating object “as bright as a pearl.” Contemporary UFO theorists have latched onto Shen’s account as the first written record of an alien spacecraft, though Shen himself never made that connection. Generally speaking, he was more interested in divination and the supernatural than in alien visitors.

9. TYCHO BRAHE

A great astronomer and an even greater partier, Tycho Brahe was born in Denmark in 1546 and lost part of his nose in a mathematical disagreement that escalated into a duel. The scientist spent the rest of his life wearing a copper prosthetic nose. Brahe also threw elaborate parties on his own private island, had a court jester who sat under the table at banquets, and kept a pet elk that loved to imbibe just as much as he did.

10. MARY ANNING 

Mary Anning was a mad fossil collector: Starting at age 12, Anning became obsessed with finding fossils and piecing them together. Driven by acute intellectual curiosity as well as economic incentives (the working-class Anning sold most of the fossils she discovered), she became famous among 19th-century British scientists. So many people would travel to her home in Lyme Regis to join her on fossil hunts that, after she died, locals actually noticed a drop in tourism to the region. But it’s not Anning’s passion for fossils that sets her apart as a slightly mad scientist; it’s the supposed origin of her intellectual curiosity: As an infant, the sickly young Mary was struck by lightning while watching a traveling circus. That lightning strike, according to Anning’s family, was at the root of the once-unexceptional Mary’s superior intelligence.

11. ATHANASIUS KIRCHER

Sometimes called the “Master of a Hundred Arts,” Athanasius Kircher was a polymath who studied everything from biology and medicine to religion. But Kircher didn’t just study everything; he seems to have believed in everything as well. At a time when thinkers like René Descartes were becoming increasingly skeptical of mythological phenomena, Kircher believed strongly in the existence of fictional beasts and beings like mermaids, giants, dragons, basilisks, and gryphons.

12. LUCRETIUS

In contrast to Athanasius Kircher, ancient Roman poet and philosopher Lucretius spent much of his life trying to disprove the existence of mythological beasts. But he employed some truly creative logic to do so. Lucretius is best known for being one of the earliest writers to describe atoms, but he also argued that centaurs and other mythological animal mash-ups were impossible because of the different rates at which animals age. A centaur, for instance, could never exist, according to Lucretius, because horses age much faster than humans. As a result, for much of its lifespan, a centaur would be running around with the head and torso of a human baby on top of a fully grown horse’s body.

13. STUBBINS FFIRTH 

While training to become a doctor at the University of Pennsylvania, Stubbins Ffirth became obsessed with proving that yellow fever was not contagious. To do so, the young researcher exposed himself to the bodily fluids of yellow fever patients. Ffirth never caught yellow fever, but not because he had proven his point: the disease is transmissible, though it spreads through mosquito bites rather than contact with bodily fluids, and most of the patients whose samples he used were in the late stages of the disease and thus no longer infectious.

14. PARACELSUS 

Renaissance-era scientist Paracelsus is sometimes called the “father of toxicology.” But he also thought he could create a homunculus (a living, miniature person) from the bodily fluids of full-sized people, and he believed in mythological beings like wood nymphs, giants, and succubae.

15. LEONARDO DA VINCI

Though he’s best known as an artist, Leonardo thought up some pretty amazing inventions. From an early version of the airplane to a primitive scuba suit, Leonardo sketched designs that anticipated technologies still in use today. But Leonardo wasn’t your average inventor: He had no formal schooling, dissected animals to learn about their anatomy, loved designing war machines, and recorded many of his best ideas backwards in mirror-image cursive, possibly to protect his work from plagiarism.

Why Our Brains Love Plot Twists

From the father-son reveal in The Empire Strikes Back to the shocking realization at the end of The Sixth Sense, everyone loves a good plot twist. It's not the element of surprise that makes them so enjoyable, though. It's largely the set-up, according to cognitive scientist Vera Tobin.

Tobin, a researcher at Case Western Reserve University, writes for The Conversation that one of the most enjoyable moments of a film or novel comes after the big reveal, when we get to go back and look at the clues we may have missed. "The most satisfying surprises get their power from giving us a fresh, better way of making sense of the material that came before," Tobin writes. "This is another opportunity for stories to turn the curse of knowledge to their advantage."

The curse of knowledge, Tobin explains, refers to a psychological effect in which knowledge affects our perception and "trips us up in a lot of ways." For instance, a puzzle always seems easier than it really is after we've learned how to solve it, and once we know which team won a baseball game, we tend to overestimate how likely that particular outcome was.

Good writers know this intuitively and use it to their advantage to craft narratives that will make audiences want to review key points of the story. The end of The Sixth Sense, for example, replays earlier scenes of the movie to clue viewers in to the fact that Bruce Willis's character has been dead the whole time—a fact which seems all too obvious in hindsight, thanks to the curse of knowledge.

This is also why writers often incorporate red herrings—or false clues—into their works. In light of this research, movie spoilers don't seem so terrible after all: according to one study, viewers still experience suspense even when the plot twist is known in advance. Indeed, several studies have shown that spoilers can even enhance enjoyment because they improve "fluency," or a viewer's ability to process and understand the story.

Still, spoilers are pretty universally hated—the Russo brothers even distributed fake drafts of Avengers: Infinity War to prevent key plot points from being leaked—so it's probably best not to go shouting the end of this summer's big blockbuster before your friends have seen it.

[h/t The Conversation]

The 98.6℉ Myth: Why Everything You Think You Know About Body Temperature Is a Lie

When you were a kid, you probably knew that to score a magical sick day home from school, you needed to have a fever. When the thermometer came out of your mouth, it had to read higher than 98.6℉—the long-accepted "normal" human body temperature. (If you wanted to really seal the deal, you may have hoped to hit 100℉.) Since then, you may have used a temperature above 98.6℉ as a metric to work from home (or call out sick entirely).

But here's the thing: The average body temperature isn't actually 98.6℉—a fact that we've known for more than 25 years. The myth originated in the 19th century with a single doctor, and despite evidence to the contrary, it's persisted ever since.

THE GIANT—AND FAULTY—ARMPIT THERMOMETER

In 1851, Carl Wunderlich, the director of the hospital at Leipzig University, began going from room to room with a comically large thermometer in tow. He wanted to understand how body temperature is affected by different diseases, so in each room, he would hold the foot-long device in patients' armpits for a full 20 minutes, waiting for a temperature to register. Once it did, he'd note the temperature on the patient's chart (Wunderlich is thought to be the first physician to do so). He and his staff did this for years, repeatedly taking the temperatures of some 25,000 patients and logging them on their charts, until he had millions of readings. In 1868, he finally published this data in Das Verhalten der Eigenwärme in Krankheiten (On the Temperature in Diseases: A Manual of Medical Thermometry). He concluded that the average human body temperature was 98.6℉, underscoring the idea that fever is a symptom of illness, not a cause.

No one questioned Wunderlich's methods, or his average, for about 140 years. Then, in the early 1990s, internist Philip Mackowiak—a professor of medicine at the University of Maryland, a medical historian, and, apparently, a clinical thermometer junkie—saw one of the physician's instruments at the Mütter Museum in Philadelphia. He told the Freakonomics podcast that he'd always had doubts about the 98.6℉ standard. "I am by nature a skeptic," he said. "And it occurred to me very early in my career that this idea that 98.6 was normal, and then if you didn't have a temperature of 98.6, you were somehow abnormal, just didn't sit right."

Getting his hands on Wunderlich's thermometer—which the museum let him borrow—only deepened his doubts. The huge thermometer was unwieldy and non-registering, meaning, Mackowiak explained, "that it has to be read while it's in place." Not only that, but Wunderlich had used the device to measure temperatures in the armpit, which is less reliable than temperatures taken in the mouth or rectum. The instrument itself also wasn't terribly precise: It measured up to 2 degrees Celsius (about 3.6℉) higher than both ancient and modern instruments.

In 1992, Mackowiak decided to test Wunderlich's average. Using normal-sized oral thermometers and a group of volunteers, he determined that the average human body temperature actually hovers around 98.2℉. Mackowiak found that body temperature tends to vary over the course of the day, with its lowest point around 6 a.m. and its highest in the early evening. Body temperature can also fluctuate monthly (with the menstrual cycle) and over a lifetime (declining decade by decade with age), and may even differ by sex and race. He concluded that normal body temperature is so unique to each person that it's almost like a fingerprint and, given that wide variation, not actually a very reliable indicator of illness.

As a result of his study, Mackowiak proposed a new fever threshold: 98.9℉ for temperatures taken in the morning, and 99.9℉ at other times. While it's a relatively minor change in terms of actual degrees, this threshold is actually lower than the CDC's, which defines a fever as a temperature of 100.4℉ or higher.

There are potential real-life consequences in this gap, for everyone from students (who'd have to attend school with what would be considered a low-grade fever by Wunderlich's 98.6℉ standard) to employers and daycares (who use temperature to set attendance policies). What's more, anyone who is actually sick but ignores a low-grade fever—one that meets Mackowiak's threshold but still falls under the CDC's—could pose a risk to people with compromised immune systems trying to avoid unnecessary exposure to illness in public places.

THE BALANCING POINT

There's a reason the average trends near 98℉ instead of 92℉ or 106℉. As endotherms, mammals expend a great deal of energy maintaining body temperature when compared with cold-blooded creatures. To find and conserve a just-right body temperature, central nervous system sensors gather data (too warm? too cold? just right, Goldilocks?) and send that information to the pebble-sized hypothalamus near the base of the brain. There, the data is converted into action: releasing sweat and widening the blood vessels if too warm; raising metabolism, constricting the blood vessels, and inducing shivering if too cold.

According to a study by Aviv Bergman and Arturo Casadevall in the journal mBio, the precise balancing point for ideal body temperature is the sweet spot where the metabolic cost for all this thermoregulation balances with the evolutionary advantage of warding off fungal disease. (While warm-blooded animals are prone to bacterial or viral infections, they rarely experience fungal infections because most fungi can't withstand temperatures above 86℉. Cold-blooded animals, on the other hand, are prone to all three.) For Bergman and Casadevall, this benefit even explains what tipped Darwin's scales in favor of mammals, allowing them to edge out other vertebrates for dominance after the Cretaceous-Tertiary mass extinction wiped out the dinosaurs.

Of course, rules call for exceptions, and the one place where human body temperature demonstrates sustained elevation is outer space. Astronauts on prolonged missions clock significantly higher average body temperatures than they do when terrestrial—even up to 104℉. This so-called "space fever" is probably a product of some combination of radiation exposure, psychological stress, and immune response to weightlessness. Researchers believe this phenomenon could yield crucial information about thermoregulation—and may even offer insight into how humans might adapt to climate change.

WHY THE MYTH PERSISTS

It's been 26 years since Mackowiak's study, yet the newer data has not taken hold among medical professionals or the public. What gives?

Mackowiak tells Mental Floss that he finds it a bit mystifying that the myth persists, especially since many people, when pressed, know that the so-called "average" temperature varies. Part of the problem may be psychological: We cling to beliefs despite evidence to the contrary—a phenomenon called belief perseverance [PDF]. It's a significant force upholding a surprising number of medical myths. The idea humans should drink eight glasses of water a day? Not science. Sugar causes hyperactive behavior? Nope. Reading in dim light harms eyesight? Not really.

Unlearning persistent myths—especially ones loaded with the weight of medical authority—is difficult. "Deep down, under it all," Mackowiak says, "people want simple answers for things."
