The Men Who Volunteered to Be Poisoned by the Government

Harvey Washington Wiley, the brusque and determined leader of the Department of Agriculture's Bureau of Chemistry in Washington, D.C., had good news and bad news for the 12 young men who had answered his call for volunteers. First, Wiley promised them three ample, freshly prepared meals every day for at least six months. Since the majority of the men were Department clerks living on modest wages, this was a tempting offer. The volunteers would also be under exceptional medical care, with weekly physicals and daily recordings of their weight, temperature, and pulse rate.

This was, Wiley explained, because he’d be slowly poisoning them.

Wiley’s staff would put borax in their butter, milk, or coffee. Formaldehyde would lurk in their meats, copper sulfate and saltpeter in their fruit pies. Wiley would begin at low doses and then ratchet up the amount until one or more of the men complained of debilitating symptoms, like vomiting or dizziness. Those men would then be excused from the program until they felt well enough to resume. Each volunteer also agreed in advance to waive the right to pursue legal remedy against the government in the event he became seriously ill or died.

The year was 1902. With funding and consent from Congress, Wiley was about to embark on an experiment he dubbed the “hygienic table trials,” but it was the Washington news media that came up with the nickname that would stick: They called his volunteers "the Poison Squad."

The Poison Squad dining area. Image credit: FDA History Office [PDF] // Public Domain

At the turn of the last century, food manufacturers and distributors were untouched by government oversight. There were no federal requirements for labeling, which meant ingredients didn't need to be listed, and there were no explicit consequences for tampering with or adulterating consumer goods. Parents would unwittingly give their babies cough syrup containing morphine to calm them down. Olive oil might actually be cottonseed oil, which was cheaper for makers to source; glucose could be passed off as honey.

A former professor of chemistry at Purdue University, Wiley was aghast at the freewheeling nature of the food industry. He was especially concerned with the use of preservatives, intended to ward off spoilage but poorly understood when consumed in consistent amounts over time. Taking a post as chief chemist at the Department of Agriculture in 1883, Wiley repeatedly petitioned for money and resources to quantify how these substances impacted the human body. Time and again, food lobbyists would thwart his attempts.

In 1902, Congress finally agreed to Wiley’s persistent requests, offering him $5000 to subsidize an experiment on the effects of food additives with a group of men who would spend at least six months, and eventually up to a year, in his service. In the basement of the Bureau’s Washington office, Wiley set up a kitchen, dining room, and lab; he installed a chef, known only as “Perry,” to prepare a variety of welcoming dishes for his volunteers. Roast chicken and braised beef would be served alongside borax and formaldehyde.

The ethics of the study were debatable then and remain so now, but Wiley disclosed his intentions to the 12 men who signed up for the program. Mostly young, they were selected for having durable constitutions that might more easily withstand the accumulation of foreign chemicals. Wiley believed that if the doses sickened these hardy volunteers, then children and older members of the public were in even greater danger.

In exchange for free food and the sense of contributing to the betterment of society, the volunteers agreed to eat their three daily meals only in the test kitchen. No snacking between meals would be permitted, and only water could be ingested away from the table. Their weight, pulse, and temperature would be recorded before sitting down. Wiley also had each man carry a satchel with him at all times to collect urine and feces for laboratory analysis. “Every particle of their secreta,” Wiley said, was necessary to the trial.

The first treat was borax, a ground mineral commonly used to preserve meats and other perishables. Wiley allowed the men a period of 10 to 20 days of eating normally to establish baseline readings of their health and symptoms before Chef Perry began adding a half-gram of the powder to their butter. Although the men knew borax would be served, they didn’t know how—yet almost all of them instinctively began avoiding the butter once they had gotten a taste of it.

Wiley next tried slipping it into their milk, but the same thing happened: They stopped drinking the milk. Having failed to account for the body’s natural resistance to being contaminated with the metallic-tasting substance, he began offering borax-filled capsules with each meal. The men dutifully swallowed them as a kind of dessert following the main course.

Wiley’s squad tolerated the borax—7.5 grains, or about half a gram, daily—for several weeks. But after a few months, headaches, stomach aches, and depression began to materialize, and the summer heat seemed to exacerbate their ailments. At six months, the men threatened to go on strike unless the slow drip of poison stopped.

By then, Wiley had gotten enough data on borax. He moved on to salicylic acid, sulfuric acid, sodium benzoate, and other additives, administering them one at a time across the menu to assess the response. Sometimes, the progression was so uneventful that the men took it upon themselves to liven up the proceedings. One laced a colleague’s drink with quinine, which can cause headaches and profuse sweating. Not long after, the man went out on a date; he later recounted that when he began to feel the symptoms of the quinine, he "went home prepared to die in the interest of science." (He was fine.)

Other times, the experiments were as dangerous as advertised. Owing to excruciating symptoms, the trial with formaldehyde was terminated early.

A sign posted in the Poison Squad's dining room. Image credit: FDA via Flickr // U.S. Government Works

Rotating members of the Poison Squad convened for roughly five years between 1902 and 1907. All along, lobbyists fought to suppress Wiley’s findings. His 477-page report on the effects of borax was well received, but supervisors—and even the Secretary of Agriculture—tried to stifle his review of benzoic acid, a widely used preservative, because of its damaging findings and persistent pressure from food lobbyists. The report was released only because the Secretary was away on vacation and a staffer misunderstood his instructions, ordering it printed by mistake.

In 1906, Congress passed the Pure Food and Drug Act and the Meat Inspection Act, both designed to restrict the kinds of preservatives and additives used by food companies. They were the first federal laws to regulate food. The former was known as the “Wiley Act,” because Wiley had been the one to demonstrate the need for it. By the 1930s, Wiley's Bureau of Chemistry had morphed into the Food and Drug Administration—and almost all of the additives Wiley trialed had been excised from the commercial food industry.

Wiley himself remained with the Department of Agriculture until 1912, when he began a 19-year position as a consumer advocate for Good Housekeeping magazine. The public, which had come to know Wiley through the extensive media coverage of the Poison Squad, looked upon him as a reliable source for information.

In 1927, Wiley used his position to notify readers of a toxic substance that was widespread, commonly absorbed, and had underestimated potential to cause cancer. The American public, he warned, should be very wary of tobacco. While Good Housekeeping stopped accepting cigarette ads in 1952, the Surgeon General didn't issue a formal warning until 1964.

Meanwhile, the dozens of men who consented to the regulated poisonings were said to have suffered no lasting effects, save perhaps for one. In 1906, the family of Poison Squad member Robert Vance Freeman used the press to blame the man’s tuberculosis and subsequent death on the borax he was made to consume. Although Wiley had discharged Freeman in 1903 because his symptoms had rendered him “disabled,” the chemist dismissed any suggestion that borax was at fault in his death. No charges or lawsuits were ever filed.

Although an experiment involving purposeful and deliberate doses of poison could never be described as "safe," Freeman's fate was an anomaly. Wiley made certain to limit a volunteer's service to one 12-month term, with the chemist correctly observing that “one year of this kind of life is as much as a young man wants.”

Additional Sources: "The Poison Squad and the Advent of Food and Drug Regulation" [PDF]

5 Signs Humans Are Still Evolving

Lealisa Westerhoff, AFP/Getty Images

When we think of human evolution, our minds wander back to the millions of years it took natural selection to produce modern-day man. Recent research suggests that, despite modern technology and industrialization, humans continue to evolve. "It is a common misunderstanding that evolution took place a long time ago, and that to understand ourselves we must look back to the hunter-gatherer days of humans," Dr. Virpi Lummaa, a professor at the University of Turku, told Gizmodo.

But not only are we still evolving, we're doing so even faster than before. In the last 10,000 years, the pace of our evolution has sped up, producing more mutations in our genes and giving natural selection more new variants to act on. Here are some clues that show humans are continuing to evolve.

1. Humans drink milk.

Historically, the gene that regulated humans' ability to digest lactose shut down as we were weaned off our mothers' breast milk. But when we began domesticating cows, sheep, and goats, being able to drink milk became a nutritionally advantageous quality, and people with the genetic mutation that allowed them to digest lactose were better able to propagate their genes.

The mutation was first identified in 2002 and traced to a population of northern Europeans that lived between 6000 and 5000 years ago. It is now carried by more than 95 percent of people of northern European descent. In addition, a 2006 study suggests that lactose tolerance developed a second time, independently of the European mutation, about 3000 years ago in East Africa.

2. We're losing our wisdom teeth.

Our ancestors had much bigger jaws than we do, which helped them chew a tough diet of roots, nuts, and leaves. And what meat they ate they tore apart with their teeth, all of which led to worn-down chompers that needed replacing. Enter the wisdom teeth: A third set of molars is believed to be the evolutionary answer to accommodate our ancestors' eating habits.

Today, we have utensils to cut our food. Our meals are softer and easier to chew, and our jaws are much smaller, which is why wisdom teeth are often impacted when they come in: there just isn't room for them. Like the appendix, wisdom teeth have become vestigial. One estimate says 35 percent of the population is born without wisdom teeth, and some say they may disappear altogether.

3. We're resisting infectious diseases.

In 2007, a group of researchers looking for signs of recent evolution identified 1800 genes that have only become prevalent in humans in the last 40,000 years, many of which are devoted to fighting infectious diseases like malaria. More than a dozen new genetic variants for fighting malaria are spreading rapidly among Africans. Another study found that natural selection has favored city-dwellers: Living in cities has produced a genetic variant that makes us more resistant to diseases like tuberculosis and leprosy. "This seems to be an elegant example of evolution in action," Dr. Ian Barnes, an evolutionary biologist at London's Natural History Museum, said in a 2010 statement. "It flags up the importance of a very recent aspect of our evolution as a species, the development of cities as a selective force."

4. Our brains are shrinking.

While we may like to believe our big brains make us smarter than the rest of the animal world, our brains have actually been shrinking over the last 30,000 years. The average volume of the human brain has decreased from 1500 cubic centimeters to 1350 cubic centimeters, a drop of 150 cubic centimeters: roughly the volume of a tennis ball.

There are several competing explanations: One group of researchers suspects our shrinking brains mean we are in fact getting dumber. Historically, brain size decreased as societies became larger and more complex, suggesting that the safety net of modern society weakened the link between intelligence and survival. But another, more encouraging theory says our brains are shrinking not because we're getting dumber, but because smaller brains are more efficient: as they shrink, our brains are being rewired to work faster while taking up less room. There's also a theory that smaller brains are an evolutionary advantage because they make us less aggressive, allowing us to work together to solve problems rather than tear each other to shreds.

5. Some of us have blue eyes.

Originally, we all had brown eyes. But about 10,000 years ago, someone who lived near the Black Sea developed a genetic mutation that turned brown eyes blue. While the reason blue eyes have persisted remains a bit of a mystery, one theory is that they act as a sort of paternity test. “There is strong evolutionary pressure for a man not to invest his paternal resources in another man’s child,” Bruno Laeng, lead author of a 2006 study on the development of blue eyes, told The New York Times. Because it is virtually impossible for two blue-eyed mates to create a brown-eyed baby, our blue-eyed male ancestors may have sought out blue-eyed mates as a way of ensuring fidelity. This would partially explain why, in a recent study, blue-eyed men rated blue-eyed women as more attractive than brown-eyed women, whereas women and brown-eyed men expressed no preference.
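That "virtually impossible" claim follows from blue eyes behaving as a recessive trait. Real eye-color genetics is polygenic, so treating it as a single gene is a simplification, but a toy one-gene model is enough to show the inheritance logic. The Python sketch below, with its B/b allele labels, is purely illustrative and not drawn from the studies cited above.

```python
# Toy single-gene model of eye color: brown allele (B) is dominant,
# blue allele (b) is recessive. Real eye color involves several genes,
# so this only illustrates the article's simplified reasoning.
from itertools import product

def child_genotypes(parent1, parent2):
    # Each parent passes on one of their two alleles; enumerate the
    # four equally likely combinations.
    return [a1 + a2 for a1, a2 in product(parent1, parent2)]

def phenotype(genotype):
    # One dominant B allele is enough for brown eyes.
    return "brown" if "B" in genotype else "blue"

# Two blue-eyed parents are both bb in this model: every child is bb.
print({phenotype(g) for g in child_genotypes("bb", "bb")})
# -> {'blue'}

# Two brown-eyed carriers (Bb), by contrast, can have a blue-eyed child.
print(sorted(phenotype(g) for g in child_genotypes("Bb", "Bb")))
# -> ['blue', 'brown', 'brown', 'brown']
```

Under this model, a brown-eyed child is evidence against a blue-eyed man's paternity, which is exactly the signal the paternity-test theory relies on.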

Now Ear This: A New App Can Detect a Child's Ear Infection

iStock.com/Techin24

Generally speaking, relying on an internet connection to diagnose a medical condition is rarely recommended. But technology is steadily outpacing skepticism about handheld devices that guide decisions and suggest treatments in health care. The most recent example is an app that promises to identify one of the key symptoms of ear infections in kids.

The Associated Press reports that researchers at the University of Washington are close to finalizing an app that would allow a parent to assess whether their child has an ear infection using a phone, some paper, and some soft noises. A small piece of paper is folded into a funnel shape and inserted into the ear canal to focus the app's sounds (which resemble bird chirps) toward the eardrum. The app then measures the sound waves bouncing back off the eardrum. If pus or fluid is present behind it, the reflected waves will be altered, indicating a possible infection. The parent would then receive a text from the app notifying them of the presence of buildup in the middle ear.
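For readers curious about the physics, here is a minimal sketch of the general principle, assuming a deliberately simplified model: a chirp played into the ear canal interferes with its own reflection, and an eardrum with fluid behind it reflects more energy, leaving a different fingerprint on the recorded spectrum. The frequencies, the single-echo ear model, and the "ripple depth" measure below are illustrative assumptions, not the research team's actual signal processing.

```python
# Simulate chirp-and-echo sensing with synthetic signals (no microphone needed).
import numpy as np

SAMPLE_RATE = 44_100  # samples per second

def make_chirp(f0=1_800.0, f1=4_400.0, duration=0.15):
    """A linear frequency sweep standing in for the app's bird-like chirp."""
    t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE
    phase = 2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * duration))
    return np.sin(phase)

def simulate_recording(chirp, reflectance, delay_samples=6):
    """Toy ear model: the phone hears the outgoing chirp plus a delayed,
    attenuated echo off the eardrum. Six samples roughly matches the
    round trip along a 2.5 cm ear canal; higher reflectance mimics a
    fluid-backed eardrum bouncing back more energy."""
    echo = np.zeros_like(chirp)
    echo[delay_samples:] = reflectance * chirp[:-delay_samples]
    return chirp + echo

def ripple_depth(recording, chirp):
    """Interference between the outgoing and reflected waves carves peaks
    and dips into the received spectrum; a stronger reflection makes the
    ripple deeper, which is the cue this sketch classifies on."""
    rec = np.abs(np.fft.rfft(recording))
    ref = np.abs(np.fft.rfft(chirp))
    band = ref > 0.2 * ref.max()  # only compare frequencies the chirp excites
    ratio = rec[band] / ref[band]
    return ratio.max() - ratio.min()

chirp = make_chirp()
for label, r in [("air behind eardrum (healthy)", 0.2), ("fluid behind eardrum", 0.8)]:
    depth = ripple_depth(simulate_recording(chirp, r), chirp)
    print(f"{label}: spectral ripple depth = {depth:.2f}")
```

In this toy model the fluid case produces a markedly deeper ripple, and thresholding a feature like that is one plausible way a detector could flag fluid; the real app's classifier, validated against surgical findings, is what produced the accuracy figures below.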

The University of Washington tested the efficacy of the app by evaluating roughly 50 patients scheduled to undergo ear surgery at Seattle Children’s Hospital. The app was able to identify fluid in patients' ears about 85 percent of the time, roughly on par with traditional exams, which involve visual inspection as well as specialized acoustic devices.

While the system looks promising, not all cases of fluid in the ear are the result of infection or require medical attention. Parents would need to weigh other symptoms, such as fever, if they intend to use the app to decide whether to seek care. It may prove most beneficial for children with persistent fluid accumulation, a condition that must be monitored over a period of months when deciding whether a drain tube should be placed. Checking for fluid at home would save both time and money compared to repeated visits to a physician.

The app does not yet have Food and Drug Administration (FDA) approval, and there is no timetable for when it might be commercially available. If it passes muster, it would join a number of FDA-approved “smart” medical diagnostic tools, including the AliveCor KardiaBand for the Apple Watch, which conducts EKG monitoring for heart irregularities.

[h/t WGRZ]
