8 Brilliant Scientific Screw-ups

iStock

By Eric Elfman

Hard work and dedication have their time and place, but the values of failure and ineptitude have gone unappreciated for far too long. They say that patience is a virtue, but the following eight inventions prove that laziness, slovenliness, clumsiness and pure stupidity can be virtues, too.

1. Anesthesia (1844)

Mistake Leading to Discovery: Recreational drug use
Lesson Learned: Too much of a good thing can sometimes be, well, a good thing

Nitrous oxide was discovered in 1772, but for decades the gas was considered no more than a party toy. People knew that inhaling a little of it would make you laugh (hence the name "laughing gas"), and that inhaling a little more of it would knock you unconscious. But for some reason, it hadn't occurred to anyone that such a property might be useful in, say, surgical operations.

Finally, in 1844, a dentist in Hartford, Conn., named Horace Wells came upon the idea after witnessing a nitrous mishap at a party. High on the gas, a friend of Wells fell and suffered a deep gash in his leg, but he didn't feel a thing. In fact, he didn't know he'd been seriously injured until someone pointed out the blood pooling at his feet.

To test his theory, Wells arranged an experiment with himself as the guinea pig. He knocked himself out by inhaling a large dose of nitrous oxide, and then had a dentist extract a rotten tooth from his mouth. When Wells came to, his tooth had been pulled painlessly.

To share his discovery with the scientific world, he arranged to perform a similar demonstration with a willing patient in the amphitheatre of the Massachusetts General Hospital. But things didn't exactly go as planned. Not yet knowing enough about the time it took for the gas to kick in, Wells pulled out the man's tooth a little prematurely, and the patient screamed in pain. Wells was disgraced and soon left the profession. Later, after being jailed while high on chloroform, he committed suicide. It wasn't until 1864 that the American Dental Association formally recognized him for his discovery.

2. Iodine (1811)

Mistake Leading to Discovery: Industrial accident
Lesson Learned: Seaweed is worth its weight in salt

In the early 19th century, Bernard Courtois was the toast of Paris. He had a factory that produced saltpeter (potassium nitrate), which was a key ingredient in ammunition, and thus a hot commodity in Napoleon's France. On top of that, Courtois had figured out how to fatten his profits by getting the potassium for his saltpeter for next to nothing. He simply took it straight from the seaweed that washed up daily on the shores. All he had to do was collect it, burn it, and extract the potassium from the ashes.

One day, while his workers were cleaning the tanks used for extracting potassium, they accidentally used a stronger acid than usual. Before they could say "sacré bleu," mysterious clouds billowed from the tank. When the smoke cleared, Courtois noticed dark crystals on all the surfaces that had come into contact with the fumes. When he had them analyzed, they turned out to be a previously unknown element, which he named iodine, after the Greek word for "violet." Iodine, plentiful in saltwater, is concentrated in seaweed. It was soon discovered that goiters, enlargements of the thyroid gland, were caused by a lack of iodine in the diet. So, in addition to its other uses, iodine is now routinely added to table salt.

3. Penicillin (1928)

Mistake Leading to Discovery: Living like a pig
Lesson Learned: It helps to gripe to your friends about your job

Scottish scientist Alexander Fleming had a, shall we say, relaxed attitude toward a clean working environment. His desk was often littered with small glass dishes—a fact that is fairly alarming considering that they were filled with bacteria cultures scraped from boils, abscesses and infections. Fleming allowed the cultures to sit around for weeks, hoping something interesting would turn up, or perhaps that someone else would clear them away.

Finally one day, Fleming decided to clean the bacteria-filled dishes and dumped them into a tub of disinfectant. His discovery was about to be washed away when a friend happened to drop by the lab to chat with the scientist. During their discussion, Fleming griped good-naturedly about all the work he had to do and dramatized the point by grabbing the top dish in the tub, which was (fortunately) still above the surface of the water and cleaning agent. As he did, Fleming suddenly noticed a dab of fungus on one side of the dish, which had killed the bacteria nearby. The fungus turned out to be a rare strain of Penicillium mold that had drifted onto the dish from an open window.

Fleming began testing the fungus and found that it killed deadly bacteria, yet was harmless to human tissue. However, Fleming was unable to produce it in any significant quantity and didn't believe it would be effective in treating disease. Consequently, he downplayed its potential in a paper he presented to the scientific community. Penicillin might have ended there as little more than a medical footnote, but luckily, a decade later, another team of scientists followed up on Fleming's lead. Using more sophisticated techniques, they were able to successfully produce one of the most life-saving drugs in modern medicine.

4. The Telephone (1876)

Mistake Leading to Discovery: Poor foreign language skills
Lesson Learned: A little German is better than none

In the 1870s, engineers were working to find a way to send multiple messages over one telegraph wire at the same time. Intrigued by the challenge, Alexander Graham Bell began experimenting with possible solutions. After reading a book by Hermann von Helmholtz, Bell got the idea to send sounds simultaneously over a wire instead. But as it turns out, Bell's German was a little rusty, and the author had mentioned nothing about the transmission of sound via wire. It was too late for Bell, though; the inspiration was there, and he had already set out to do it.

The task proved much more difficult than Bell had imagined. He and his mechanic, Thomas Watson, struggled to build a device that could transmit sound. They finally succeeded, however, and came up with the telephone.

5. Photography (1835)

Mistake Leading to Discovery: Not doing the dishes
Lesson Learned: Put off today what you can do tomorrow

Between 1829 and 1835, Louis Jacques Mandé Daguerre was close to becoming the first person to develop a practical process for producing photographs. But he wasn't quite there yet.

Daguerre had figured out how to expose an image onto highly polished plates covered with silver iodide, a substance known to be sensitive to light. However, the images he was producing on these polished plates were barely visible, and he didn't know how to make them darker.

After producing yet another disappointing image one day, Daguerre tossed the silverized plate in his chemical cabinet, intending to clean it off later. But when he went back a few days later, the image had darkened to the point where it was perfectly visible. Daguerre realized that one of the chemicals in the cabinet had somehow reacted with the silver iodide, but he had no way of knowing which one it was, and there were a whole lot of chemicals in that cabinet.

For weeks, Daguerre took one chemical out of the cabinet every day and put in a newly exposed plate. But every day, he found a less-than-satisfactory image. Finally, as he was testing the very last chemical, he got the idea to put the plate in the now-empty cabinet, as he had done the first time. Sure enough, the image on the plate darkened. Daguerre carefully examined the shelves of the cabinet and found what he was looking for. Weeks earlier, a thermometer in the cabinet had broken, and Daguerre (being the slob that he was) didn't clean up the mess very well, leaving a few drops of mercury on the shelf. Turns out, it was the mercury vapor interacting with the silver iodide that produced the darker image. Daguerre incorporated mercury vapor into his process, and the Daguerreotype photograph was born.

6. Mauve Dye (1856)

Mistake Leading to Discovery: Delusions of grandeur
Lesson Learned: Real men wear mauve

In 1856, an 18-year-old British chemistry student named William Perkin attempted to develop a synthetic version of quinine, the drug commonly used to treat malaria. It was a noble cause, but the problem was, he had no idea what he was doing.

Perkin started by mixing aniline (a colorless, oily liquid derived from coal-tar, a waste product of the steel industry) with propylene gas and potassium dichromate. It's a wonder he didn't blow himself to bits, but the result was just a disappointing black mass stuck to the bottom of his flask. As Perkin started to wash out the container, he noticed that the black substance turned the water purple, and after playing with it some more, he discovered that the purple liquid could be used to dye cloth.

With financial backing from his wealthy father, Perkin began a dye-making business, and his synthetic mauve colorant soon became popular. Up until the time of Perkin's discovery, natural purple dye had to be extracted from Mediterranean mollusks, making it extremely expensive. Perkin's cheap coloring not only jumpstarted the synthetic dye industry (and gave birth to the colors used in J.Crew catalogs), it also sparked the growth of the entire field of organic chemistry.

7. Nylon (1934)

Mistake Leading to Discovery: Workplace procrastination
Lesson Learned: When the cat's away, the mice should play

In 1934, researchers at DuPont were charged with developing synthetic silk. But after months of hard work, they still hadn't found what they were looking for, and the head of the project, Wallace Hume Carothers, was considering calling it quits. The closest they had come was creating a liquid polymer that seemed chemically similar to silk, but in its liquid form wasn't very useful. Discouraged, the researchers began testing other, seemingly more promising substances called polyesters.

One day, a young (and apparently bored) scientist in the group noticed that if he gathered a small glob of polyester on a glass stirring rod, he could use it to pull thin strands of the material from the beaker. And for some reason (prolonged exposure to polyester fumes, perhaps?) he found this hilarious. So on a day when boss-man Carothers was out of the lab, the young researcher and his co-workers started horsing around and decided to have a competition to see who could draw the longest threads from the beaker. As they raced down the hallway with the stirring rods, it dawned on them: By stretching the substance into strands, they were actually re-orienting the molecules and making the liquid material solid.

Ultimately, they determined that the polyesters they were playing with couldn't be used in textiles, like DuPont wanted, so they turned to their previously unsuccessful silk-like polymer. Unlike the polyester, it could be drawn into solid strands that were strong enough to be woven. This was the first completely synthetic fiber, and they named the material Nylon.

8. Vulcanized Rubber (1844)

Mistake Leading to Discovery: Obsession combined with butterfingers
Lesson Learned: A little clumsiness can go a long way

In the early 19th century, natural rubber was relatively useless. It melted in hot weather and became brittle in the cold. Plenty of people had tried to "cure" rubber so it would be impervious to temperature changes, but no one had succeeded. That is, until Charles Goodyear stepped in (or so he claimed). According to his own version of the tale, the struggling businessman became obsessed with solving the riddle of rubber, and began mixing rubber with sulfur over a stove. One day, he accidentally spilled some of the mixture onto the hot surface, and when it charred like a piece of leather instead of melting, he knew he was onto something.

The truth, according to well-documented sources, is somewhat different. Apparently, Goodyear learned the secret of combining rubber and sulfur from another early experimenter. And it was one of his partners who accidentally dropped a piece of fabric impregnated with the rubber and sulfur mixture onto a hot stove. But it was Goodyear who recognized the significance of what happened, and he spent months trying to find the perfect combination of rubber, sulfur and high heat. (Goodyear also took credit for coining the term "vulcanization" for the process, but the word was actually first used by an English competitor.) Goodyear received a patent for the process in 1844, but spent the rest of his life defending his right to the discovery. Consequently, he never grew rich and, in fact, wound up in debtors' prison more than once. Ironically, rubber became a hugely profitable industry years later, with the Goodyear Tire & Rubber Co. at the forefront.

This article originally appeared in a 2009 issue of mental_floss magazine.

5 Signs Humans Are Still Evolving

Lealisa Westerhoff, AFP/Getty Images

When we think of human evolution, our minds wander back to the millions of years it took natural selection to produce modern-day man. Recent research suggests that, despite modern technology and industrialization, humans continue to evolve. "It is a common misunderstanding that evolution took place a long time ago, and that to understand ourselves we must look back to the hunter-gatherer days of humans," Dr. Virpi Lummaa, a professor at the University of Turku, told Gizmodo.

But not only are we still evolving, we're doing so even faster than before. In the last 10,000 years, the pace of our evolution has sped up, producing more mutations in our genes and giving natural selection more variation to act on. Here are some clues that show humans are continuing to evolve.

1. Humans drink milk.

Historically, the gene that regulated humans' ability to digest lactose shut down as we were weaned off our mothers' breast milk. But when we began domesticating cows, sheep, and goats, being able to drink milk became a nutritionally advantageous quality, and people with the genetic mutation that allowed them to digest lactose were better able to propagate their genes.

The gene was first identified in 2002 in a population of northern Europeans that lived between 6000 and 5000 years ago. The genetic mutation for digesting milk is now carried by more than 95 percent of people of northern European descent. In addition, a 2006 study suggests this tolerance for lactose developed again, independently of the European population, 3000 years ago in East Africa.

2. We're losing our wisdom teeth.

Our ancestors had much bigger jaws than we do, which helped them chew a tough diet of roots, nuts, and leaves. And what meat they ate they tore apart with their teeth, all of which led to worn-down chompers that needed replacing. Enter the wisdom teeth: A third set of molars is believed to be the evolutionary answer to accommodate our ancestors' eating habits.

Today, we have utensils to cut our food. Our meals are softer and easier to chew, and our jaws are much smaller, which is why wisdom teeth are often impacted when they come in — there just isn't room for them. Unlike the appendix, wisdom teeth have become vestigial organs. One estimate says 35 percent of the population is born without wisdom teeth, and some say they may disappear altogether.

3. We're resisting infectious diseases.

In 2007, a group of researchers looking for signs of recent evolution identified 1800 genes that have only become prevalent in humans in the last 40,000 years, many of which are devoted to fighting infectious diseases like malaria. More than a dozen new genetic variants for fighting malaria are spreading rapidly among Africans. Another study found that natural selection has favored city-dwellers. Living in cities has produced a genetic variant that allows us to be more resistant to diseases like tuberculosis and leprosy. "This seems to be an elegant example of evolution in action," Dr. Ian Barnes, an evolutionary biologist at London's Natural History Museum, said in a 2010 statement. "It flags up the importance of a very recent aspect of our evolution as a species, the development of cities as a selective force."

4. Our brains are shrinking.

While we may like to believe our big brains make us smarter than the rest of the animal world, our brains have actually been shrinking over the last 30,000 years. The average volume of the human brain has decreased from 1500 cubic centimeters to 1350 cubic centimeters, a loss of about 150 cubic centimeters, roughly the volume of a tennis ball.

There are several competing explanations for why: One group of researchers suspects our shrinking brains mean we are in fact getting dumber. Historically, brain size decreased as societies became larger and more complex, suggesting that the safety net of modern society negated the correlation between intelligence and survival. But another, more encouraging theory says our brains are shrinking not because we're getting dumber, but because smaller brains are more efficient. This theory suggests that, as they shrink, our brains are being rewired to work faster but take up less room. There's also a theory that smaller brains are an evolutionary advantage because they make us less aggressive beings, allowing us to work together to solve problems, rather than tear each other to shreds.

5. Some of us have blue eyes.

Originally, we all had brown eyes. But about 10,000 years ago, someone who lived near the Black Sea developed a genetic mutation that turned brown eyes blue. While the reason blue eyes have persisted remains a bit of a mystery, one theory is that they act as a sort of paternity test. “There is strong evolutionary pressure for a man not to invest his paternal resources in another man’s child,” Bruno Laeng, lead author of a 2006 study on the development of blue eyes, told The New York Times. Because it is virtually impossible for two blue-eyed mates to create a brown-eyed baby, our blue-eyed male ancestors may have sought out blue-eyed mates as a way of ensuring fidelity. This would partially explain why, in a recent study, blue-eyed men rated blue-eyed women as more attractive compared to brown-eyed women, whereas females and brown-eyed men expressed no preference.

Now Ear This: A New App Can Detect a Child's Ear Infection

iStock.com/Techin24

Generally speaking, using an internet connection to diagnose a medical condition is rarely recommended. But technology is steadily chipping away at skepticism about handheld devices that guide health care decisions and suggest treatment. The most recent example is an app that promises to identify one of the key symptoms of ear infections in kids.

The Associated Press reports that researchers at the University of Washington are close to finalizing an app that would allow a parent to assess whether or not their child has an ear infection using their phone, some paper, and some soft noises. A small piece of paper is folded into a funnel shape and inserted into the ear canal to focus the app's sounds (which resemble bird chirps) toward the child’s ear. The app measures sound waves bouncing off the eardrum. If pus or fluid is present, the sound waves will be altered, indicating a possible infection. The parent would then receive a text from the app notifying them of the presence of buildup in the middle ear.
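For readers curious how this kind of chirp-and-listen screening could work in principle, here is a minimal, hypothetical Python sketch. It assumes an already-recorded reflection and a healthy-ear baseline, uses made-up frequency bands and an arbitrary threshold, and is not the University of Washington team's actual algorithm; it only illustrates the idea described above of flagging a reflected chirp that deviates from a baseline.

```python
# Hypothetical sketch of the chirp-and-listen idea described above.
# NOT the University of Washington algorithm -- just an illustration of
# comparing a reflected chirp's energy against a healthy-ear baseline.
import numpy as np
from scipy.signal import chirp, welch

FS = 44_100  # sample rate in Hz (assumed value)

def make_probe_chirp(duration=0.15, f0=1_000, f1=4_000):
    """Generate a short rising chirp, like the app's bird-chirp sounds."""
    t = np.linspace(0, duration, int(FS * duration), endpoint=False)
    return chirp(t, f0=f0, t1=duration, f1=f1)

def band_energy(signal, lo=1_000, hi=4_000):
    """Energy of the recorded signal inside the probe's frequency band."""
    freqs, power = welch(signal, fs=FS, nperseg=1024)
    mask = (freqs >= lo) & (freqs <= hi)
    return float(power[mask].sum())

def fluid_suspected(reflection, baseline, threshold=0.25):
    """Flag a possible effusion if the reflected energy deviates from the
    healthy-ear baseline by more than `threshold` (arbitrary cutoff)."""
    ref, base = band_energy(reflection), band_energy(baseline)
    return abs(ref - base) / base > threshold

# Toy usage: a "healthy" echo vs. a damped one standing in for fluid.
rng = np.random.default_rng(0)
probe = make_probe_chirp()
healthy_echo = 0.8 * probe + 0.01 * rng.standard_normal(probe.size)
damped_echo = 0.3 * probe + 0.01 * rng.standard_normal(probe.size)
print(fluid_suspected(damped_echo, baseline=healthy_echo))  # likely True
```

In practice the real system also has to play the chirp, record it through the paper funnel, and account for ear-to-ear variation, which is exactly why the researchers validated it against surgical patients rather than a simple threshold like the one above.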

The University of Washington tested the efficacy of the app by evaluating roughly 50 patients scheduled to undergo ear surgery at Seattle Children’s Hospital. The app was able to identify fluid in patients' ears about 85 percent of the time, roughly on par with traditional exams, which involve visual identification as well as specialized acoustic devices.

While the system looks promising, not all cases of fluid in the ear are the result of infections or require medical attention. Parents would need to evaluate other symptoms, such as fever, if they intend to use the app to decide whether or not to seek medical attention. It may prove most beneficial in children with persistent fluid accumulation, a condition that needs to be monitored over the course of months when deciding whether a drain tube needs to be placed. Checking for fluid at home would save both time and money compared to repeated visits to a physician.

The app does not yet have Food and Drug Administration (FDA) approval and there is no timetable for when it might be commercially available. If it passes muster, it would join a number of FDA-approved “smart” medical diagnostic tools, including the AliveCor KardiaBand for the Apple Watch, which conducts EKG monitoring for heart irregularities.

[h/t WGRZ]
