The Feminine Mystique

Thomas Allen

By Brittany Shoot

Betty Friedan was always cold. Cooped up in a rented stone house, the onetime newspaper reporter wore gloves at her typewriter, laboring over freelance articles in the quiet moments she could catch between tending to her two grade-school boys.

Her husband, Carl, was more than unsupportive—he was abusive, a cheat who flew into a rage whenever dinner was delayed. But Friedan, who was pregnant with their third child, knew that escaping the marriage would be difficult. Cut off from Manhattan and even from the nearest library, she found that the freelance work she attracted didn’t pay well enough to make leaving an option. Mostly, she wrote for other reasons. Once a brilliant academic with a promising career, Friedan was stuck in housewife hell, bored out of her mind. She needed the escape.

In 1957, Friedan picked up an assignment from her college alumni magazine. It seemed fun. What she didn’t know was that the project would not only make her a household name—it would change the fate of American women.

"Just Be A Woman"

Born and raised in Peoria, Ill., Bettye Goldstein was a gifted student. She skipped second grade and eventually graduated with honors from Smith College, where she was an outspoken war critic and the editor in chief of the school newspaper. From there, her academic dreams took her to the University of California, Berkeley, where she studied under the renowned developmental psychologist Erik Erikson.

But even in the Bay Area’s liberal atmosphere, the pressure to conform to the era’s strict gender roles was palpable. Threatened by her success, Friedan’s boyfriend pushed her to turn down a prestigious science fellowship. As she’d later write in her autobiography, Life So Far, “I had given up any idea of a ‘career’, I would ‘just be a woman.’ ” Friedan abandoned her academic pursuits and took a newspaper job. But as her relationship with her boyfriend fizzled, Friedan’s love of reporting grew. When a colleague at UE News, the labor paper she was working for, set her up with his childhood friend, theater director Carl Friedan, they fell for each other. The couple married in 1947 and settled in New York City’s Greenwich Village.

It wasn’t long before the marriage soured. Betty kept up with household chores. She got pregnant. But nothing she did was good enough for Carl. She managed to finagle more than a year of maternity leave from her job after giving birth, but when she became pregnant again two years later, the union refused her additional leave. Instead, she was fired on the spot.

Meanwhile, the Friedans needed more space for their expanding family. They rented a stone barn–turned-house in Rockland County, 30 miles outside Manhattan. Shortly after their move, Carl became abusive. Isolated in the suburbs, Betty continued to squeeze in time for freelance work. As tension escalated, Betty stood her ground—if she was going to free herself from her husband, she’d need to earn more money.

With her 15-year college reunion approaching, Friedan was asked to conduct a survey of her Smith classmates. How had her fellow alumnae used their education? How satisfied were they with their lives? Collaborating with two friends, she crafted open-ended questions to elicit honest reactions from the more than 200 women to whom she sent surveys.

Friedan hoped the data might refute the findings in Ferdinand Lundberg and Dr. Marynia Farnham’s popular book Modern Woman: The Lost Sex, which made arguments like “The more educated the woman is, the greater chance there is of sexual disorder.” She knew education didn’t cause women’s sexual dysfunction, but how could she prove it?

As the completed surveys poured in, Friedan got her answer: The forms were filled with heartbreak and honesty. Women from all over the country confided the abject misery of their everyday lives, and the answers betrayed widespread feelings of resentment and isolation. Many women said they were undergoing psychoanalysis but said the treatments were only making their symptoms worse. Most male doctors were telling their female patients that the complaints were unwarranted or expected. Indeed, Lundberg and Farnham considered these complaints part of “a deep illness that encouraged women to assume the male traits of aggression, dominance, independence, and power.” Many doctors even urged patients to dive deeper into domesticity and to more fully embrace chores as a source of self-actualization. And yet, in their answers, none of the women extolled the virtues of vacuuming.

As Friedan read the reports, she thought about the ads that bombarded women on a daily basis: Be a supportive wife! Cook better meals! Scrub that tub! The messaging in women’s magazines was as biased as the doctors’. No wonder women felt trapped. Each was convinced that she was the only woman in the world who couldn’t find joy hiding beneath a stack of dirty dishes.

Armed with the survey results and her own media analysis, Friedan headed to Smith for the 1957 reunion. There, she planned to report her findings and speak in depth with her former classmates about their collective ennui. But she was startled by the scene on campus: None of the current students she spoke with seemed keen to pursue interests or careers outside of suburbia. Perhaps they were buying into the arguments that magazines like Look were promoting at the time, stating that the modern housewife “marries younger than ever, bears more babies, looks and acts more feminine than the emancipated girl of the Twenties or Thirties.” The young women at Smith seemed more accepting of “their place” than when Friedan had graduated, a decade and a half earlier.

Place Holders

It was clear to Friedan that she had uncovered a major crisis facing middle-class American women, but you wouldn’t have known it from the reaction she received. Academics were skeptical and outright dismissive of her survey results. Magazine editors (most of whom were men) were uninterested in challenging the status quo—or sacrificing advertising revenue for the sake of a story. A handful of editors initially bought her pitches, only to deem the finished pieces too scandalous to publish. At Ladies’ Home Journal, editors reframed one of her articles to say the exact opposite of what Friedan had found, so she killed the story.

Friedan soldiered on. She conducted more interviews with alumni groups and students at other schools, as well as with neighbors, counselors, and doctors. She published where she could. Eventually, she persuaded Good Housekeeping to give her a platform by agreeing to play by its rules: Every column had to be presented with an optimistic slant. But as she continued to write, it became clear that only a book could adequately describe “the problem that [had] no name.”

In late 1957, Friedan managed to land a $3000 book advance from W. W. Norton. She hired a baby-sitter three days a week and annexed a desk in the New York Public Library’s Allen Room, assuming the book would take a year to complete. She couldn’t have predicted how long her manuscript would hold her hostage.

Five years later, her dogged determination paid off. In 1963, The Feminine Mystique, the now-classic treatise on the pervasive unhappiness of American housewives, made its debut on the New York Times bestseller list. It was the definition of irony. The writer who previously couldn’t publish an article had a book that kept falling off the bestseller list because printers couldn’t keep up with demand. But what was it about the book that made it so compelling? It’s hard to see now, but The Feminine Mystique came out well before psychology was a hip way of examining social phenomena. And even though Friedan leaned heavily on academic research, hers was the first popular examination of women’s depressing post-WWII private lives. Friedan forced America to confront a problem it had all too happily ignored, and, as the New York Times put it, “the portrait she painted was chilling.”

The book turned Friedan into an instant celebrity. She went on a nationwide publicity tour, appearing in televised press conferences and doing talk shows. But what the camera didn’t catch was all the heavy makeup Friedan wore to conceal her bruises and black eyes. Life at home had not gotten easier.

Leading the Revolution

Buoyed by her success, Friedan moved back to Manhattan and distanced herself from her husband. Her move coincided with a larger cultural shift, as the women’s movement began to coalesce around the country. Focusing on many of the issues raised in The Feminine Mystique, including sex discrimination, pay equity, and reproductive rights, second-wave feminists won major battles in courtrooms and offices over the next several decades. Sexual discrimination in the workplace was outlawed. Title IX was passed to ensure that girls and women would not be excluded from school athletic programs. Marital rape became a punishable crime. Domestic violence shelters were established for the first time. Contraceptives were made widely available. Abortion was legalized in the United States. As second-wave leaders bulldozed their way through the 1970s, women finally won the right to sit on courtroom juries in all 50 states and to establish credit without relying on a male relative, and the enlistment qualifications for the Armed Forces became the same for men and women.

Friedan’s leadership was vital in the transformative years that followed her book’s publication. In 1966, she helped found the National Organization for Women (NOW) and campaigned vigorously for Congress to pass the Equal Rights Amendment. And in 1969, a year history remembers as explosive and pop culture considers transcendent, Betty Friedan finally took her own words to heart—freeing herself from her loveless and abusive marriage.

In the ensuing years, Friedan remained involved in the women’s rights movement. She led the 50,000-person Women’s Strike for Equality in 1970. In the following decades, she helped found other notable women’s rights organizations, including the National Women’s Political Caucus. She wrote five more books. And by 2000, The Feminine Mystique had sold more than three million copies and been translated into numerous languages.

When Betty Friedan passed away on her 85th birthday, she was eulogized by NOW cofounder Muriel Fox, who said, “I truly believe that Betty Friedan was the most influential woman, not only of the 20th century, but of the second millennium.” Friedan had started a revolution by asking her friends and contemporaries the simple question no one had been bold enough to ask: Are you happy? And as she worked to answer the question for herself, she freed generations of women to come.

This article originally appeared in mental_floss magazine in our ongoing "101 Masterpieces" series.

Why the Filet-O-Fish Sandwich Has Been on the McDonald's Menu for Nearly 60 Years

McDonald's has introduced and quietly killed many dishes over the years (remember McDonald's pizza?), but there's a core group of items that have held their spot on the menu for decades. Listed alongside the Big Mac and McNuggets is the Filet-O-Fish—a McDonald's staple you may have forgotten about if you're not the type of person who orders seafood from fast food restaurants. But the classic sandwich, consisting of a fried fish filet, tartar sauce, and American cheese on a bun, didn't get on the menu by mistake—and thanks to its popularity around Lent, it's likely to stick around.

According to Taste of Home, the inception of the Filet-O-Fish can be traced back to a McDonald's franchise that opened near Cincinnati, Ohio, in 1959. Back then, the restaurant offered beef burgers as its only main dish, and for most of the year, diners couldn't get enough of them. Things changed during Lent: Many Catholics abstain from eating meat and poultry on Fridays during the holy season as a form of fasting, and in the early 1960s, Cincinnati was more than 85 percent Catholic. Fridays are typically among the busiest days of the week for restaurants, but sales at the Ohio McDonald's took a nosedive every Friday leading up to Easter.

Franchise owner Lou Groen went to McDonald's founder Ray Kroc with the plan of adding a meat alternative to the menu to lure back Catholic customers. He proposed a fried halibut sandwich with tartar sauce (though meat is off-limits for Catholics on Fridays during Lent, seafood doesn't count as meat). Kroc didn't love the idea, citing his fears of stores smelling like fish, and suggested a "Hula Burger" made from a pineapple slice with cheese instead. To decide which item would earn a permanent place on the menu, they put the two sandwiches head to head at Groen's McDonald's one Friday during Lent.

The restaurant sold 350 Filet-O-Fish sandwiches that day—clearly beating the Hula Burger (though exactly how many pineapple burgers sold, Kroc wouldn't say). The basic recipe has received a few tweaks, switching from halibut to the cheaper cod and from cod to the more sustainable Alaskan pollock, but the Filet-O-Fish has remained part of the McDonald's lineup in some form ever since. Today 300 million of the sandwiches are sold annually, and about a quarter of those sales are made during Lent.

Other seafood products McDonald's has introduced haven't had the same staying power as the Filet-O-Fish. In 2013, the chain rolled out Fish McBites, a chickenless take on McNuggets, only to pull them from menus that same year.

[h/t Taste of Home]

The Disturbing Reason Schools Tattooed Their Students in the 1950s

Kurt Hutton, Hulton Archive/Getty Images

When Paul Bailey was born at Beaver County Hospital in Milford, Utah, on May 9, 1955, it took less than two hours for the staff to give him a tattoo. Located on his torso under his left arm, the tiny marking was rendered in indelible ink with a needle gun and indicated Bailey’s blood type: O-positive.

“It is believed to be the youngest baby ever to have his blood type tattooed on his chest,” reported the Beaver County News, coolly referring to the infant as an “it.” A hospital employee was quick to note parental consent had been obtained first.

The permanent tattooing of a child who was only hours old was not met with any hysteria. Just the opposite: In parts of Utah and Indiana, local health officials had long been hard at work instituting a program that would facilitate potentially life-saving blood transfusions in the event of a nuclear attack. By branding children and adults alike with their blood type, donors could be immediately identified and used as “walking blood banks” for the critically injured.

Taken out of context, it seems unimaginable. But in the 1950s, when the Cold War was at its apex and atomic warfare appeared not only possible but likely, children willingly lined up at schools to perform their civic duty. They raised their arms, gritted their teeth, and held still while the tattoo needle began piercing their flesh.

 

The practice of subjecting children to tattoos for blood-typing has appropriately morbid roots. Testifying at the Nuremberg Tribunal on War Crimes in the 1940s, American Medical Association physician Andrew Ivy observed that members of the Nazi Waffen-SS carried body markings indicating their blood type [PDF]. When he returned to his hometown of Chicago, Ivy carried with him a solution for quickly identifying blood donors—a growing concern due to the outbreak of the Korean War in 1950. The conflict was depleting blood banks of inventory, and it was clear that reserves would be necessary.

School children sit next to one another circa the 1950s
Reg Speller, Fox Photos/Getty Images

If the Soviet Union targeted areas of the United States for destruction, it would be vital to have a protocol for blood transfusions to treat radiation poisoning. Matches would need to be found quickly. (Transfusions depend on matching blood to avoid the adverse reactions that come from mixing different types. When a person receives blood different from their own, the body will create antibodies to destroy the red blood cells.)

In 1950, the Department of Defense placed the American Red Cross in charge of blood donor banks for the armed forces. In 1952, the Red Cross was the coordinating agency [PDF] for obtaining blood from civilians for the National Blood Program, which was meant to replenish donor supply during wartime. Those were both measures for soldiers. Meanwhile, local medical societies were left to determine how best to prepare their civilian communities for a nuclear event and its aftermath.

As part of the Chicago Medical Civil Defense Committee, Ivy promoted the use of the tattoos, declaring them as painless as a vaccination. Residents would get blood-typed by having their finger pricked and a tiny droplet smeared on a card. From there, they would be tattooed with the ABO blood group and Rhesus factor (or Rh factor), which denotes whether or not a person has a certain type of blood protein present.

The Chicago Medical Society and the Board of Health endorsed the program and citizens voiced a measure of support for it. One letter to the editor of The Plainfield Courier-News in New Jersey speculated it might even be a good idea to tattoo Social Security numbers on people's bodies to make identification easier.

Despite such marked enthusiasm, the project never entered into a pilot testing stage in Chicago.

Officials with the Lake County Medical Society in nearby Lake County, Indiana were more receptive to the idea. In the spring of 1951, 5000 residents were blood-typed using the card method. But, officials cautioned, the cards could be lost in the chaos of war or even the relative quiet of everyday life. Tattoos and dog tags were encouraged instead. When 1000 people lined up for blood-typing at a county fair, two-thirds agreed to be tattooed as part of what the county had dubbed "Operation Tat-Type." By December 1951, 15,000 Lake County residents had been blood-typed. Roughly 60 percent opted for a permanent marking.

The program was so well-received that the Lake County Medical Society quickly moved toward making children into mobile blood bags. In January 1952, five elementary schools in Hobart, Indiana enrolled in the pilot testing stage. Children were sent home with permission slips explaining the effort. If parents consented, students would line up on appointed tattoo days to get their blood typed with a finger prick. From there, they’d file into a room—often the school library—set up with makeshift curtains behind which they could hear a curious buzzing noise.

When a child stepped inside, they were greeted by a school administrator armed with indelible ink and wielding a Burgess Vibrotool, a medical tattoo gun featuring 30 to 50 needles. The child would raise their left arm to expose their torso (since arms and legs might be blown off in an attack) and were told the process would only take seconds.

A child raises his hand in class circa the 1950s
Vecchio/Three Lions/Getty Images

Some children were stoic. Some cried before, during, or after. One 11-year-old recounting her experience with the program said a classmate emerged from the session and promptly fainted. All were left with a tattoo less than an inch in diameter on their left side, intentionally pale so it would be as unobtrusive as possible.

At the same time that grade schoolers—and subsequently high school students—were being imprinted in Indiana, kids in Cache and Rich counties in Utah were also submitting to the program, despite potential religious obstacles for the region's substantial Mormon population. In fact, Bruce McConkie, a representative of the Church of Jesus Christ of Latter-day Saints, declared that blood-type tattoos were exempt from the typical prohibitions on Mormons defacing their bodies, giving the program a boost among the devout. The experiment would not last much longer, though.

 

By 1955, 60,000 adults and children had gotten tattooed with their blood types in Lake County. In Milford, health officials persisted in promoting the program widely, offering the tattoos for free during routine vaccination appointments. But despite the cooperation exhibited by communities in Indiana and Utah, the programs never spread beyond their borders.

The Korean conflict had come to an end in 1953, reducing the strain put on blood supplies and along with it, the need for citizens to double as walking blood banks. More importantly, outside of the program's avid boosters, most physicians were extremely reluctant to rely solely on a tattoo for blood-typing. They preferred to do their own testing to make certain a donor was a match with a patient.

There were other logistical challenges that made the program less than useful. The climate of a post-nuclear landscape meant that bodies might be charred, burning off tattoos and rendering the entire operation largely pointless. With the Soviet Union’s growing nuclear arsenal—1600 warheads were ready to take to the skies by 1960—the idea of civil defense became outmoded. Ducking and covering under desks, which might have shielded some from the immediate effects of a nuclear blast, would be meaningless in the face of such mass destruction.

Programs like tat-typing eventually fell out of favor, yet tens of thousands of adults consented to participate even after the flaws in the program were publicized, and a portion allowed their young children to be marked, too. Their motivation? According to Carol Fischler, who spoke with the podcast 99% Invisible about being tattooed as a young girl in Indiana, the paranoia over the Cold War in the 1950s drowned out any thought of the practice being outrageous or harmful. Kids wanted to do their part. Many nervously bit their lip but still lined up with the attitude that the tattoo was part of being a proud American.

Perhaps equally important, children who complained of the tattoo leaving them particularly sore received another benefit: They got the rest of the afternoon off.
