Ten Days in a Madhouse: The Woman Who Got Herself Committed

In 1887, intrepid reporter Nellie Bly pretended she was crazy and got herself committed, all to help improve conditions in a New York City mental institution.

“The insane asylum on Blackwell’s Island is a human rat-trap. It is easy to get in, but once there it is impossible to get out.”

Those words, describing New York City’s most notorious mental institution, were written by journalist Nellie Bly in 1887. It was no mere armchair observation, because Bly got herself committed to Blackwell’s and wrote a shocking exposé called Ten Days In A Madhouse. The series of articles became a best-selling book, launching Bly’s career as a world-famous investigative reporter and also helping bring reform to the asylum.

In the late 1880s, New York newspapers were full of chilling tales about brutality and patient abuse at the city’s various mental institutions. Into the fray came the plucky 23-year-old Nellie Bly (born Elizabeth Cochrane, she renamed herself after a popular Stephen Foster song). At a time when most female writers were confined to newspapers’ society pages, she was determined to play with the big boys. The editor at The World liked Bly’s moxie and challenged her to come up with an outlandish stunt to attract readers and prove her mettle as a “detective reporter.”

The stylish and petite Bly, who had a perpetual smile, set about her crazy-eye makeover. She dressed in tattered second-hand clothes. She stopped bathing and brushing her teeth. And for hours, she practiced looking like a lunatic in front of the mirror. “Faraway expressions look crazy,” she wrote. Soon she was wandering the streets in a daze. Posing as Nellie Moreno, a Cuban immigrant, she checked herself into a temporary boarding house for women. Within twenty-four hours, her irrational, hostile rants had all of the other residents fearing for their lives. “It was the greatest night of my life,” Bly later wrote.

The police hauled Bly off, and within a matter of days, she bounced from court to Bellevue Hospital’s psychiatric ward. When she professed not to remember how she had ended up in New York, the chief doctor diagnosed her as “delusional and undoubtedly insane.” Meanwhile, several of the city’s other newspapers took an interest in what one called the “mysterious waif with the wild, hunted look in her eyes.” Bly had everyone hoodwinked, and soon enough, she was aboard the “filthy ferry” to Blackwell’s Island.

The Lonely Island

Opened as America’s first municipal mental hospital in 1839, Blackwell’s Island (known today as Roosevelt Island) was meant to be a state-of-the-art institution committed to the moral, humane rehabilitation of its patients. But when funding got cut, the progressive plans went out the window. It ended up a grim, overcrowded asylum, staffed in part by inmates of a nearby penitentiary.

Although other writers had reported on conditions at the asylum (notably Charles Dickens, in 1842, who described its “listless, madhouse air” as “very painful”), Bly was the first reporter to go undercover. What she found exceeded her worst expectations. There were “oblivious doctors” and “coarse, massive” orderlies who “choked, beat and harassed” patients, and “expectorated tobacco juice about on the floor in a manner more skillful than charming.” There were foreign women, completely sane, who were committed simply because they couldn’t make themselves understood. Add to that rancid food, dirty linens, no warm clothing and ice-cold baths that were like a precursor to waterboarding. Bly described the latter:

“My teeth chattered and my limbs were goose-fleshed and blue with cold. Suddenly I got, one after the other, three buckets of water over my head – ice-cold water, too – into my eyes, my ears, my nose and my mouth. I think I experienced the sensation of a drowning person as they dragged me, gasping, shivering and quaking, from the tub. For once I did look insane.”

And worst of all, there was the endless, enforced isolation:

“What, excepting torture, would produce insanity quicker than this treatment? . . . Take a perfectly sane and healthy woman, shut her up and make her sit from 6 a.m. to 8 p.m. on straight-back benches, do not allow her to talk or move during these hours, give her no reading and let her know nothing of the world or its doings, give her bad food and harsh treatment, and see how long it will take to make her insane. Two months would make her a mental and physical wreck.”

As soon as Bly arrived at Blackwell’s Island, she dropped her crazy act. But to her horror, she found that it only confirmed her diagnosis. “Strange to say, the more sanely I talked and acted, the crazier I was thought to be,” she wrote.

Near the end of her stay, her cover was almost blown. A fellow reporter she’d known for years was sent by another newspaper to write about the mysterious patient, posing as a man in search of a lost loved one. Bly begged her friend not to give her away. He didn’t. Finally, after ten days, The World sent an attorney to arrange for Nellie Moreno’s release.

Going Public

Two days later, the paper ran the first installment of Bly’s story, entitled “Behind Asylum Bars.” The psychiatric doctors who’d been fooled offered apologies, excuses and defenses. The story traveled across the country, with papers lauding Bly’s courageous achievement. Almost overnight, she became a star journalist.

But for Bly, it wasn’t about the fame. “I have one consolation for my work,” she wrote. “On the strength of my story, the committee of appropriation provides $1,000,000 more than was ever before given, for the benefit of the insane.”

Actually, the city had already been considering increasing the budget for asylums, but Bly’s article certainly pushed things along.

A month after her series ran, Bly returned to Blackwell’s with a grand jury panel. In her book, she says that when they made their tour, many of the abuses she reported had been corrected: the food services and sanitary conditions were improved, the foreign patients had been transferred, and the tyrannical nurses had disappeared. Her mission was accomplished.

Bly would go on to more sensational exploits, most notably, in 1889, circling the globe in a record-setting seventy-two days (she meant to beat out Jules Verne’s fictional trip in Around the World in Eighty Days). In later years, she retired from journalism to run a manufacturing company, designing and marketing steel barrels, milk cans, and boilers. She died in 1922. Bly’s amazing life has since been the subject of a Broadway musical, a movie, and a children’s book.

Why the Filet-O-Fish Sandwich Has Been on the McDonald's Menu for Nearly 60 Years

McDonald's has introduced and quietly killed many dishes over the years (remember McDonald's pizza?), but there's a core group of items that have held their spot on the menu for decades. Listed alongside the Big Mac and McNuggets is the Filet-O-Fish—a McDonald's staple you may have forgotten about if you're not the type of person who orders seafood from fast food restaurants. But the classic sandwich, consisting of a fried fish filet, tartar sauce, and American cheese on a bun, didn't get on the menu by mistake—and thanks to its popularity around Lent, it's likely to stick around.

According to Taste of Home, the inception of the Filet-O-Fish can be traced back to a McDonald's franchise that opened near Cincinnati, Ohio, in 1959. Back then the restaurant offered beef burgers as its only main dish, and for most of the year, diners couldn't get enough of them. Things changed during Lent: Many Catholics abstain from eating meat and poultry on Fridays during the holy season as a form of fasting, and in the early 1960s, Cincinnati was more than 85 percent Catholic. Fridays are typically among the busiest days of the week for restaurants, but sales at the Ohio McDonald's took a nosedive every Friday leading up to Easter.

Franchise owner Lou Groen went to McDonald's founder Ray Kroc with the plan of adding a meat alternative to the menu to lure back Catholic customers. He proposed a fried halibut sandwich with tartar sauce (though meat is off-limits for Catholics on Fridays during Lent, seafood doesn't count as meat). Kroc didn't love the idea, citing his fears of stores smelling like fish, and suggested a "Hula Burger" made from a pineapple slice with cheese instead. To decide which item would earn a permanent place on the menu, they put the two sandwiches head to head at Groen's McDonald's one Friday during Lent.

The restaurant sold 350 Filet-O-Fish sandwiches that day—clearly beating the Hula Burger (though Kroc never said exactly how many pineapple burgers sold). The basic recipe has received a few tweaks, switching from halibut to the cheaper cod, and from cod to the more sustainable Alaskan pollock, but the Filet-O-Fish has remained part of the McDonald's lineup in some form ever since. Today 300 million of the sandwiches are sold annually, and about a quarter of those sales are made during Lent.

Other seafood products McDonald's has introduced haven't had the same staying power as the Filet-O-Fish. In 2013, the chain rolled out Fish McBites, a chickenless take on McNuggets, only to pull them from menus that same year.

[h/t Taste of Home]

The Disturbing Reason Schools Tattooed Their Students in the 1950s

Kurt Hutton, Hulton Archive/Getty Images

When Paul Bailey was born at Beaver County Hospital in Milford, Utah on May 9, 1955, it took less than two hours for the staff to give him a tattoo. Located on his torso under his left arm, the tiny marking was rendered in indelible ink with a needle gun and indicated Bailey’s blood type: O-Positive.

“It is believed to be the youngest baby ever to have his blood type tattooed on his chest,” reported the Beaver County News, coolly referring to the infant as an “it.” A hospital employee was quick to note parental consent had been obtained first.

The permanent tattooing of a child who was only hours old was not met with any hysteria. Just the opposite: In parts of Utah and Indiana, local health officials had long been hard at work instituting a program that would facilitate potentially life-saving blood transfusions in the event of a nuclear attack. By marking children and adults alike with their blood type, officials could immediately identify donors and press them into service as “walking blood banks” for the critically injured.

Taken out of context, it seems unimaginable. But in the 1950s, when the Cold War was at its apex and atomic warfare appeared not only possible but likely, children willingly lined up at schools to perform their civic duty. They raised their arm, gritted their teeth, and held still while the tattoo needle began piercing their flesh.

 

The practice of subjecting children to tattoos for blood-typing has appropriately morbid roots. Testifying at the Nuremberg war crimes tribunal in the 1940s, American Medical Association physician Andrew Ivy observed that members of the Nazi Waffen-SS carried body markings indicating their blood type [PDF]. When he returned to his hometown of Chicago, Ivy brought with him a solution for quickly identifying blood donors—a growing concern due to the outbreak of the Korean War in 1950. The conflict was depleting blood banks of inventory, and it was clear that reserves would be necessary.

School children sit next to one another circa the 1950s
Reg Speller, Fox Photos/Getty Images

If the Soviet Union targeted areas of the United States for destruction, it would be vital to have a protocol for blood transfusions to treat radiation poisoning. Matches would need to be found quickly. (Transfusions depend on matching blood to avoid the adverse reactions that come from mixing different types. When a person receives blood different from their own, the body will create antibodies to destroy the red blood cells.)

In 1950, the Department of Defense placed the American Red Cross in charge of blood donor banks for the armed forces. In 1952, the Red Cross was the coordinating agency [PDF] for obtaining blood from civilians for the National Blood Program, which was meant to replenish donor supply during wartime. Those were both measures for soldiers. Meanwhile, local medical societies were left to determine how best to prepare their civilian communities for a nuclear event and its aftermath.

As part of the Chicago Medical Civil Defense Committee, Ivy promoted the use of the tattoos, declaring them as painless as a vaccination. Residents would get blood-typed by having their finger pricked and a tiny droplet smeared on a card. From there, they would be tattooed with their ABO blood group and Rhesus factor (or Rh factor), which denotes whether or not a person’s red blood cells carry a certain blood protein.
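The rule those two tattooed values encode is simple enough to sketch in a few lines. Below is a minimal, illustrative Python snippet (our own, not part of any 1950s protocol) of the standard ABO/Rh red-cell compatibility rule: a donor is safe only if the recipient’s blood already carries every antigen found on the donor’s cells.

```python
# Illustrative only: the standard ABO/Rh red-cell compatibility rule.
# A recipient's immune system attacks any ABO antigen it doesn't have,
# and Rh-negative recipients react to Rh-positive blood.

ABO_ANTIGENS = {"O": set(), "A": {"A"}, "B": {"B"}, "AB": {"A", "B"}}

def compatible(donor: str, recipient: str) -> bool:
    """True if red cells from `donor` (e.g. 'O-') can go to `recipient`."""
    d_abo, d_rh = donor[:-1], donor[-1]
    r_abo, r_rh = recipient[:-1], recipient[-1]
    # Every antigen on the donor's cells must already be present in the
    # recipient, or the recipient's antibodies destroy the transfused cells.
    abo_ok = ABO_ANTIGENS[d_abo] <= ABO_ANTIGENS[r_abo]
    # Rh-negative recipients can only safely receive Rh-negative blood.
    rh_ok = d_rh == "-" or r_rh == "+"
    return abo_ok and rh_ok

# O- is the universal red-cell donor; AB+ the universal recipient.
assert compatible("O-", "AB+")
assert not compatible("A+", "O-")
```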

The Chicago Medical Society and the Board of Health endorsed the program and citizens voiced a measure of support for it. One letter to the editor of The Plainfield Courier-News in New Jersey speculated it might even be a good idea to tattoo Social Security numbers on people's bodies to make identification easier.

Despite such marked enthusiasm, the project never entered into a pilot testing stage in Chicago.

Officials with the Lake County Medical Society in nearby Lake County, Indiana were more receptive to the idea. In the spring of 1951, 5,000 residents were blood-typed using the card method. But, officials cautioned, the cards could be lost in the chaos of war or even the relative quiet of everyday life. Tattoos and dog tags were encouraged instead. When 1,000 people lined up for blood-typing at a county fair, two-thirds agreed to be tattooed as part of what the county had dubbed "Operation Tat-Type." By December 1951, 15,000 Lake County residents had been blood-typed. Roughly 60 percent opted for a permanent marking.

The program was so well-received that the Lake County Medical Society quickly moved toward making children into mobile blood bags. In January 1952, five elementary schools in Hobart, Indiana enrolled in the pilot testing stage. Children were sent home with permission slips explaining the effort. If parents consented, students would line up on appointed tattoo days to get their blood typed with a finger prick. From there, they’d file into a room—often the school library—set up with makeshift curtains behind which they could hear a curious buzzing noise.

When a child stepped inside, they were greeted by a school administrator armed with indelible ink and wielding a Burgess Vibrotool, a medical tattoo gun featuring 30 to 50 needles. The child would raise their left arm to expose their torso (since arms and legs might be blown off in an attack) and was told the process would only take seconds.

A child raises his hand in class circa the 1950s
Vecchio/Three Lions/Getty Images

Some children were stoic. Some cried before, during, or after. One 11-year-old recounting her experience with the program said a classmate emerged from the session and promptly fainted. All were left with a tattoo less than an inch in diameter on their left side, intentionally pale so it would be as unobtrusive as possible.

At the same time that grade schoolers—and subsequently high school students—were being imprinted in Indiana, kids in Cache and Rich counties in Utah were also submitting to the program, despite potential religious obstacles for the region's substantial Mormon population. In fact, Bruce McConkie, a representative of the Church of Jesus Christ of Latter-day Saints, declared that blood-type tattoos were exempt from the typical prohibitions on Mormons defacing their bodies, giving the program a boost among the devout. The experiment would not last much longer, though.

 

By 1955, 60,000 adults and children had gotten tattooed with their blood types in Lake County. In Milford, health officials persisted in promoting the program widely, offering the tattoos for free during routine vaccination appointments. But despite the cooperation exhibited by communities in Indiana and Utah, the programs never spread beyond their borders.

The Korean conflict had come to an end in 1953, reducing the strain put on blood supplies and, along with it, the need for citizens to double as walking blood banks. More importantly, outside of the program's avid boosters, most physicians were extremely reluctant to rely solely on a tattoo for blood-typing. They preferred to do their own testing to make certain a donor was a match with a patient.

There were other logistical challenges that made the program less than useful. In a post-nuclear landscape, bodies might be charred, burning off tattoos and rendering the entire operation largely pointless. And with the Soviet Union’s growing nuclear arsenal—1,600 warheads were ready to take to the skies by 1960—the idea of civil defense became outmoded. Ducking and covering under desks, which might have shielded some from the immediate effects of a nuclear blast, would be meaningless in the face of such mass destruction.

Programs like tat-typing eventually fell out of favor, yet tens of thousands of adults consented to participate even after the flaws in the program were publicized, and a portion allowed their young children to be marked, too. Their motivation? According to Carol Fischler, who spoke with the podcast 99% Invisible about being tattooed as a young girl in Indiana, the paranoia over the Cold War in the 1950s drowned out any thought of the practice being outrageous or harmful. Kids wanted to do their part. Many nervously bit their lip but still lined up with the attitude that the tattoo was part of being a proud American.

Perhaps equally important, children who complained of the tattoo leaving them particularly sore received another benefit: They got the rest of the afternoon off.
