
6 Remarkable Medical Gadgets


Martha Mason of Lattimore, North Carolina, recently passed away at the age of 71. What makes her obituary different from the thousands of others that appear in newspapers each day? It's the fact that she spent 60 of those 71 years in an iron lung, after a 1948 polio attack left her paralyzed from the neck down. Mason, who graduated from Wake Forest University in 1960, used a voice-recognition computer to chronicle her life story in the 1994 autobiography Breath: Life in the Rhythm of an Iron Lung. Technology gave her the option to use a portable ventilator many years ago, but Mason preferred the protection of the metal cylinder that had been home to her for so many years. She didn't like the idea of tubes in her throat, incisions into her body, or the frequent hospital visits that would accompany the "improvement." mental_floss invites you to peek into the history of the iron lung and five other medical gadgets and gizmos that have aided both doctors and patients over the last century.

1. The Iron Lung

Dr. Philip Drinker of the Harvard School of Public Health developed the first "thoracic cage" that used vacuum cleaner blowers to alternate between atmospheric and sub-atmospheric pressure to force a patient to breathe. The machine, known as a Drinker Respirator, was originally intended as a pediatric-ward device to assist premature babies born with under-developed lungs. But when the dreaded disease known as polio began to spread in the United States, doctors found a second use for the device. Polio frequently paralyzed patients' diaphragms, rendering them unable to breathe on their own. The Drinker Respirator was first used on a polio patient in 1928. Following its initial success, and with the disease affecting tens of thousands of Americans, demand quickly grew. The Warren Collins Corporation fine-tuned Drinker's design and mass-produced a similar device at a more affordable price; it was dubbed the Iron Lung. Cost and availability became pertinent factors in the early 1950s, when every American neighborhood seemed to have at least one polio patient in residence.

2. The Stethoscope

As a young medical-school student in 19th-century Paris, Rene Theophile Hyacinthe Laennec developed a knack for hearing and interpreting the different sounds made by the heart and lungs when he placed his ear on patients' chests. This method only worked if the patient was sufficiently slender, of course. One afternoon, Laennec saw some children playing with wooden boards. One tyke would scratch or tap softly on one end, while another put his ear on the other end of the board to hear the sound. Laennec went back to his office - presumably after removing a splinter from the tyke's ear - and constructed a long tube out of several pieces of rolled-up paper. By placing the end of the cylinder directly on a patient's chest or back, he discovered that he could hear sounds much more clearly than before. After experimenting with different materials and designs, he came up with the stethoscope. In 1819, the medical community began to recognize the use of the gadget as a valuable diagnostic tool.

3. The Blood Pressure Cuff

Human blood pressure was first recorded in 1847 by Dr. Carl Ludwig. Unfortunately, his method required the insertion of a catheter into an artery, which was hardly the most convenient procedure. Eight years later, Karl Vierordt discovered that the arterial pulse could be measured non-invasively by wrapping an inflatable cuff tightly around the upper arm and slowly releasing the pressure. The device was subject to regular improvements over the years, and in 1896, Scipione Riva-Rocci devised the first modern sphygmomanometer. He attached the inflatable cuff to a mercury-filled manometer (a device that measures liquid pressure), which provided an accurate account of the force of the blood as the heart tried to pump it past the restricting cuff and into the arm.

4. The Internal Thermometer

Daniel Gabriel Fahrenheit developed the first mercury thermometer back in 1720. Before his invention, thermometers relied on a mixture of alcohol and water. Unfortunately, these were too susceptible to air pressure to be of much use. Fahrenheit discovered that not only did mercury expand at a more constant rate than alcohol (providing more accurate results), but it also allowed for readings at much higher and lower temperature extremes. When first used for medical purposes, the typical thermometer was over a foot long and had to be held in place for 20 minutes to accurately determine a patient's temperature. In 1866, British physician Sir Thomas Allbutt invented a six-inch bulb thermometer that could record a temperature in only five minutes.

5. The X-Ray Machine

German physics professor Wilhelm Conrad Roentgen was experimenting with cathode rays in his laboratory in November 1895 when he noticed that certain objects in the room began to glow. The humble scientist wasn't quite sure what his findings meant, and his only comment at the time was "I have discovered something interesting, but I do not know whether or not my observations are correct." Roentgen continued his experiments, and a month later, he presented an X-ray of his wife's hand to the Wurzburg Physical-Medical Society. (He'd named his new technology with an X, a variable scientists use to represent an unknown factor.) Roentgen won a Nobel Prize for his discovery, and "X-ray mania" became a fad. Doctors and scientists joined in to take endless "pictures" of human bone structure. Department stores even took X-rays of customers' feet to fit them with the best possible shoes. The dangers of the technology weren't recognized and addressed until Thomas Edison's assistant, Clarence Dally, suffered the one-two punch of serious X-ray burns and cancer.

6. The Pacemaker

Toronto surgeon Dr. Wilfred Bigelow spent years conducting extensive studies on the treatment of frostbite. In 1949, using techniques he had culled from his research, Bigelow demonstrated that "controlled hypothermia" could be used to slow down the rhythm of the human heart. This tactic would reduce blood flow in the human body, making certain procedures (like open-heart surgery) possible. The main problem with his technique was finding a way to jump-start the heart if it slowed down too far or came to a complete stop. Luckily, electrical engineer John Hopps was in the midst of his own research, hoping to use radio frequencies to restore body temperature in hypothermia patients. During his experiments, Hopps discovered that the application of a gentle electrical charge could restart the heart without damaging its muscle tissue. Building on Bigelow's work, Hopps constructed the first artificial cardiac pacemaker in 1950, an external device that paved the way for the implantable models that followed.

Naturally, there are dozens of medical devices and procedures that we didn't cover in this article. Which ones have you always wondered about? Like who invented that torturous tongue depressor? Or the name of that shiny round thing that old-time TV doctors always wore on headbands? Or even why, despite a 1 p.m. appointment, you have to wait until 2:30 to see your GP? Please drop a comment, and perhaps we'll revisit this topic in a future installment. Thanks!

Bill Gates is Spending $100 Million to Find a Cure for Alzheimer's

Not everyone who's blessed with a long life will remember it. Individuals who live into their mid-80s have a nearly 50 percent chance of developing Alzheimer's, and scientists still haven't discovered any groundbreaking treatments for the neurodegenerative disease. To pave the way for a cure, Microsoft co-founder and philanthropist Bill Gates has announced that he's donating $100 million to dementia research, according to Newsweek.

On his blog, Gates explained that Alzheimer's disease places a financial burden on both families and healthcare systems alike. "This is something that governments all over the world need to be thinking about," he wrote, "including in low- and middle-income countries where life expectancies are catching up to the global average and the number of people with dementia is on the rise."

Gates's interest in Alzheimer's is both pragmatic and personal. "This is something I know a lot about, because men in my family have suffered from Alzheimer’s," he said. "I know how awful it is to watch people you love struggle as the disease robs them of their mental capacity, and there is nothing you can do about it. It feels a lot like you're experiencing a gradual death of the person that you knew."

Experts still haven't figured out quite what causes Alzheimer's, how it progresses, and why certain people are more prone to it than others. Gates believes that important breakthroughs will occur if scientists can understand the condition's etiology (or cause), create better drugs, develop techniques for early detection and diagnosis, and make it easier for patients to enroll in clinical trials.

Gates plans to donate $50 million to the Dementia Discovery Fund, a venture capital fund that supports Alzheimer's research and treatment developments. The rest will go to research startups, Reuters reports.

[h/t Newsweek]

Eye Doctors Still Use This 100-Year-Old Test for Color Blindness

You may have seen them at your ophthalmologist's office: large circular diagrams made up of colored dots. People with normal vision are able to discern a number among the dots of contrasting colors. People who are color blind might see only a field of spots.

These elegant, deceptively modern drawings were published 100 years ago by a Japanese ophthalmologist, Shinobu Ishihara. Thanks to the designs' simplicity and diagnostic accuracy, the Ishihara test is still the most popular and efficient way to identify patients with color vision deficiencies.

Born in Tokyo in 1879, Ishihara studied medicine at the prestigious Tokyo Imperial University on a military scholarship, which required him to serve in the armed forces. After graduating in 1905, he worked for three years as a physician specializing in surgery in the Imperial Japanese Army, and then returned to the university for postgraduate studies in ophthalmology. In his research, Ishihara focused on identifying and recruiting soldiers with superior vision, thereby increasing the overall effectiveness of the military. And that became of prime importance to Japan beginning in 1914.

As World War I spread across Europe, Asia, and the Pacific, the Japanese army asked Ishihara to develop a better way to screen draftees for color vision problems. The most popular method at the time was the Stilling test, invented by German ophthalmologist Jakob Stilling in 1878 as the first clinical color vision test. (Previous tools had asked patients to identify the colors of wool skeins or illuminated lanterns—useful skills for sailors and railway conductors, but an imprecise method for diagnosing vision issues.)

"Though popular, 'the Stilling' retained a distinctly 19th-century flavor, more treatise-like and less diagnostically incisive," according to Eye magazine.



Japanese army officials requested a new diagnostic tool that was easier to administer and interpret. The test Ishihara began to develop was based, like Stilling's, on the principle of pseudo-isochromatism—a phenomenon in which two or more colors are seen as the same (or isochromatic) when they're actually different. A person with normal vision could easily see the difference, while people with red-green deficiency, the most common form of color blindness, would have difficulty distinguishing those two opposing colors. Those with blue-yellow color blindness, a less common type, would have a hard time discerning reds, greens, blues, or yellows.

Ishihara hand-painted circular designs composed of small dots of different sizes and colors, arranged so that variations in the design could be discerned only by color and not by shape, size, or pattern. Hidden in the field of dots was a figure of a contrasting color that people with normal vision could see, while those with deficiencies could not. Other plates in the series were designed to show figures that would be visible only to people with deficiencies. When physicians displayed the diagrams, patients said or traced the visible figure within the circle without needing to use ambiguous color names, which standardized the possible results.

The earliest sets of Ishihara plates, produced in 1916, were reserved exclusively for the army's use and featured Japanese characters within the diagrams. In 1917, in an effort to sell the series internationally, Ishihara redesigned it with the now-familiar Arabic numerals and published a set of 16 plates as Tests for Colour Deficiency.

The tests were adopted throughout the world beginning in the early 1920s, and eventually grew into a set of 38 plates. But their popularity almost led to their undoing. Unauthorized publishers printed their own version of the plates to meet demand, throwing the accuracy of the diagnostic colors into doubt. "The plates have been duplicated along with an easily memorized key by cheap color processes in the tabloid press, and exposed in public places, reducing the fifth edition [of the collection] to a parlor game," one psychologist warned in the Journal of the Optical Society of America in 1943.

Despite those obstacles, the tests proved indispensable for both practicing physicians and researchers. Ishihara continued to refine the designs and improve the color accuracy of the images into the late 1950s, while he also served as the chair of the ophthalmology department and then dean of the medical school at Tokyo Imperial University. In addition to Tests for Colour Deficiency, he also published an atlas, textbook, lectures, and research studies on eye diseases. But he is remembered most for the iconic charts that seamlessly blend art and science.
