He's Also a Client: The Saga of Sy Sperling's Hair Club

Upper Playground, YouTube

Divorced, depressed, and with his midsection growing, Sy Sperling stood in front of a mirror at his home on Long Island in the late 1960s and adjusted his hair. It wasn’t his hair, exactly, but a toupee purchased for the express purpose of obscuring his prematurely shiny crown.

Though he was only 26, Sperling had been losing his hair for years. Now that he was newly single, he felt self-conscious about his receding hairline, believing it would diminish his chances with the opposite sex. He tried combing tufts of hair from the side over to the front. He tried the toupee, which looked like a road-flattened beaver. He tried weaving, which knitted human locks to his existing strands; the first time he shampooed it, it collapsed into a ball of knotted hair.

Like many pioneering spirits before him, Sperling imagined that there had to be a better way—a path back to his lost self-confidence and the life he desired.

In the coming years, Sperling and his second wife, a hairstylist, would perfect an existing approach and pair it with irresistible marketing, providing a solution for millions of follicle-deprived individuals everywhere. And much of that success came from Sperling admitting that he was not just the president. He was also a client.

 
 

Baldness “cures” date back to the most ancient civilizations. Egyptians used hippopotamus and crocodile fat as hair growth stimulants. In Rome, burning donkey genitals and mixing the ashes with urine was believed to help grow luscious locks. Various concoctions involving poop were believed to work, too.

In more enlightened times, thinning hair could be addressed with transplantation surgery. In 1939, a Japanese dermatologist extracted hair-bearing skin and replanted it by punching small holes into sites affected by burn injuries. The practice was mirrored by Norman Orentreich, a New York dermatologist who successfully planted hairs into a patient with male pattern baldness in the 1950s. Orentreich was the first to observe that hairs on the sides of the head were largely resistant to shedding and would therefore remain in place when transplanted to the top or front of the head.

For decades, this remained a crude surgical practice: hair was often transplanted sparsely, earning the procedure a reputation for producing heads that appeared to be implanted with “plugs.” It wasn’t until the 1990s that transplants could be more densely packed, offering a convincing restoration of the hairline.

For Sperling, who was born in 1942 and in his 20s when his hair loss became apparent, invasive surgery that was still years away from being refined wasn’t an option. After his sister admonished him to “do something” about the thinning hair that was causing him such grief, he went to a hairstylist who recommended weaving. While somewhat effective, weaving only seemed practical if some hair remained on top. Toupees were out, as Sperling had a particular concern about solutions that could fall off or become dislodged during more intimate moments.

"If you're dating and going to be having special moments, how do you explain, 'I got to take my hair off now?'" he asked.

Even with its drawbacks, weaving seemed like the best option. After learning the technique from his stylist, Sperling left his job in swimming pool sales and opened his own salon on New York City's Madison Avenue in 1968. Using $10,000 in capital from credit cards, he leased a vacant business that already had barber-style chairs. Soon, he and his new wife, Amy—who, it turned out, was indifferent to his hair shortage—perfected a technique in which they used a nylon mesh fitted to the scalp. The net-like fabric allowed the head to breathe and hairs to grow out from under it. It also acted as a base for human hair strands to be woven on top and secured with a polymer adhesive. The entire “system” was anchored to the client by weaving the mesh into the hair on the sides. The result was a relatively natural-looking addition that would remain in place through showering, exercising, and—key for Sperling—sexual activity.

The approach took off, enticing New Yorkers and celebrities alike. (Sperling later insisted Jimi Hendrix came in for a fitting in 1969.) Sperling’s business grew steadily throughout the 1970s, but by 1979, sales were leveling off. The problem was that even though he had happy customers, they were reluctant to tell friends about their hair-replacement efforts, so word-of-mouth was unreliable. That’s when Sperling decided to advertise.

 
 

Sperling’s business, then known as the Hair Club for Men, debuted on national television in 1982. One early campaign featured testimonials from actual customers, but the response was minimal. Producers had also shot a second spot featuring Sperling himself, holding it in reserve in case the first approach failed. That infomercial aired late at night, when advertising time was cheapest.

Though Sperling was no trained actor or orator, he was genuine. “I’m not just the president,” he said. “I’m also a client.”

When it aired, the reaction was immediate. The Hair Club got 10,000 calls in a month. Interested parties received a brochure discussing various hair-system options and why Sperling’s approach worked. By 1991, there were 40 franchise locations, where clients paid between $2,000 and $3,500 for a custom mesh that used colored and textured hair to match their natural growth. A maintenance appointment every two months cost $65.

By 1993, the commercial was airing 400 times a day, costing Sperling $12 million annually in advertising expenses. But it was drawing up to $100 million annually in sales. In admitting what most men wouldn't, Sperling engendered trust—and profit.

 
 

Later, the Hair Club for Men would undergo several cosmetic alterations to its business model. Sperling moved his clinics out of strip malls and into commercial office spaces to give clients more discretion. He even used initials—HCM—on signage to promote privacy.

The “For Men” was dropped as more women suffering from hair loss due to genetics or illness came looking for assistance. Sperling also helped kids with cancer diagnoses. Through it all, he sold something more than polymers and mesh: Hair Club trafficked in confidence and self-esteem. He allowed reporters to tug on his own hair as a demonstration of quality. It would barely move. "Not bad, eh?" he asked a Spy journalist in 1991. "It really is an amazing transformation."

The hair stayed in place, but Sperling didn’t. In 2000, he sold Hair Club for $45 million to a group of investors who turned around and sold it in 2005 to the Regis hair company for $210 million. Today, Hair Club still offers solutions similar to what Sperling marketed, as well as proven topical treatments like Rogaine (minoxidil), laser combs purported to stimulate growth, and transplantation surgery.

The initial sale came with an impressive 15-year non-compete clause, and Sperling hasn’t yet announced any plans to get back into the hair-boosting business. Still, photographs of Sperling from earlier this year show that the septuagenarian retains a full head of hair.

When Y2K Sent Us Into a Digital Depression

iStock.com/Laspi

It's hard to pinpoint the exact moment when the paranoia first began to creep in. Sometime during the late 1990s, consumers noticed that their credit cards with expiration dates in the year 2000 were being declined by merchants. Shortly thereafter, people began stocking up on shelf-stable food and water, potentially condemning themselves to months of all-SPAM diets. A number of concerned citizens outside Toronto, Canada, flocked to the Ark Two Survival Community, a nuclear fallout shelter-turned-bunker made up of dozens of decommissioned school buses buried several feet underground and protected by a layer of reinforced concrete.

In the months leading into New Year's Day 2000, millions of people steeled themselves for a worst-case scenario of computers succumbing to a programming glitch that would render them useless. Banking institutions might collapse; power grids could shut down. Anarchy would take over. The media had the perfect shorthand for the potential catastrophe: Y2K, for Year 2000. The term was used exhaustively in coverage of a situation some believed had the potential to become one of the worst man-made disasters in history—if not the collapse of modern civilization as we knew it.

In the end, it was neither. But that doesn't mean it didn't have some far-reaching consequences.

John Koskinen of the President's Council on Y2K Conversion makes a public address
Michael Smith, Getty Images

The anticipatory anxiety of Y2K was rooted in the programs that had been written for the ginormous computers of the late 1960s. In an effort to conserve memory and speed up software, programmers truncated the date system to use two digits for the year instead of four. When the calendar rolled over to the year 2000, the fear was that "00" would be a proverbial wrench in the system, leaving computers unable to distinguish 2000 from 1900 and throwing off their calculations: date arithmetic that produced a sensible positive number with "98" standing in for 1998 would suddenly produce negative results once "00" entered the equation. How computers would actually react was based mostly on theories.
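
The failure mode is easy to reproduce. Here is a minimal Python sketch (a hypothetical illustration, not code from any real mainframe) of how two-digit date arithmetic breaks at the century rollover, along with date "windowing," one of the remediation techniques programmers actually used:

```python
def years_elapsed(start_yy: int, end_yy: int) -> int:
    """Subtract two-digit years the way a memory-starved 1960s program might."""
    return end_yy - start_yy

# An account opened in '98 and checked in '99 computes correctly...
print(years_elapsed(98, 99))  # 1

# ...but checked in '00, elapsed time goes negative, as if a
# century of time had run backward.
print(years_elapsed(98, 0))   # -98

def expand_year(yy: int, pivot: int = 50) -> int:
    """Date windowing: interpret yy below the pivot as 20yy, otherwise as 19yy."""
    return 2000 + yy if yy < pivot else 1900 + yy

# With windowing applied, the same stored digits compute correctly.
print(expand_year(0) - expand_year(98))  # 2
```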

That ambiguity was quickly seized upon by two factions: third-party software consultants and doomsday preppers. For the former, rewriting code became a cottage industry, with corporations large and small racing to revise antiquated systems and spending significant amounts of money and manpower in doing so. General Motors estimated the cost of upgrading their systems would be about $626 million. The federal government, which began preparing for possible doom in 1995, ended up with an $8.4 billion bill.

Some of that cost was eaten up by soliciting analyses of the potential problems. The U.S. Department of Energy commissioned a study looking at the potential for problems with the nation's energy supply if computers went haywire. The North American Electric Reliability Council thought the risks were manageable, but cautioned that a single outage could have a domino effect on connected power grids.

As a result, many newspaper stories mixed practical advice with a disclaimer: More than likely nothing will happen … but if something does happen, we're all screwed.

"Figuring out how seriously to take the Y2K problem is a problem in itself," wrote Leslie Nicholson in the January 17, 1999 edition of the Philadelphia Inquirer. "There is simply no precedent."

The specter of economic and societal collapse fueled the second pop-up industry: survivalist suppliers. As people stocked up on canned goods, bottled water, flashlights, and generators, miniature societies like Ark Two began to spring up.

While the panic surrounding Y2K was dismissed by some as unwarranted, there was always fuel to add to the fire. The United States and Russia convened to monitor ballistic missile activity in the event a glitch inadvertently launched a devastating weapon. People were warned checks might bounce and banking institutions could freeze. The Federal Reserve printed $70 billion in cash in case people began hoarding currency. Even the Red Cross chimed in, advising Americans to stock up on supplies. Y2K was being treated like a moderate-category storm.

Adding to the concern was the fact that credible sources were sounding alarms. Edward E. Yardeni, then-chief economist at Deutsche Morgan Grenfell/C.J. Lawrence, predicted that there was a 60 percent chance of a major worldwide recession.

As New Year's Eve 1999 approached, it became clear that Y2K had evolved beyond a software hiccup. Outside of war and natural disasters, it represented one of the few times society seemed poised for a dystopian future. People watched their televisions as clocks hovered close to midnight, waiting to see if their lights would flicker or their landline phones would continue to ring.

A software program is represented by a series of ones and zeroes
iStock.com/alengo

Of course, nothing happened. So many resources had been devoted to the problem that the majority of software-reliant businesses and infrastructures were prepared. There were no power outages, no looting, and no hazards. The only notable event of January 1, 2000 was the reported resignation of Boris Yeltsin and the ascension of Vladimir Putin as Russia's acting president.

With the benefit of hindsight, pundits would later observe that much of the Y2K concern was an expression of a more deeply rooted fear of technology. Subconsciously, we may have been primed to recoil at the thought of computers dominating our society to the extent that their failure could have catastrophic consequences.

All told, it's estimated that approximately $100 billion was spent making upgrades to offset any potential issues. To put that into context: South Florida spent $15.5 billion rebuilding after the mass destruction caused by Hurricane Andrew in 1992.

Was it all worth it? Experts seem to think so, citing the expedited upgrades of old software and hardware in federal and corporate environments.

That may be some small comfort to Japan, which could be facing its own version of Y2K in April 2019. That's when Emperor Akihito is expected to abdicate the throne to his son, Naruhito, the first such transition since the dawn of the information age. (Akihito has been in power since January 1989, following the death of his father.) That's significant because the Japanese calendar counts up from the accession of a new emperor and uses the name of each emperor's era. Akihito's is known as the Heisei era. Naruhito's is not yet named, which means that things could get tricky as the change in leadership—and the need for a calendar update—comes closer.
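
To see why that worries programmers, consider a minimal Python sketch of era-based date conversion (the era table and function below are illustrative assumptions, not any real library): software with a hard-coded era table simply has no correct answer for dates after the abdication.

```python
import datetime

# Era years count up from 1 at each emperor's accession (illustrative
# table; a real system would need every era it might encounter).
ERAS = [
    (datetime.date(1989, 1, 8), "Heisei"),   # Akihito's accession
    (datetime.date(1926, 12, 25), "Showa"),  # Hirohito's accession
]

def to_era(date: datetime.date) -> str:
    """Convert a Gregorian date to a Japanese era-based year."""
    for start, name in ERAS:
        if date >= start:
            return f"{name} {date.year - start.year + 1}"
    raise ValueError("date precedes the known eras")

print(to_era(datetime.date(2018, 6, 1)))  # Heisei 30

# A date after the expected 2019 abdication still resolves to Heisei
# here -- exactly the stale assumption that will need patching.
print(to_era(datetime.date(2019, 6, 1)))  # Heisei 31 (wrong era)
```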

It's hard to predict what the extent of the country's problems will be as Akihito steps down. If history is any guide, though, it's likely to mean a lot of software upgrades, and possibly some SPAM.

When Mr. Rogers Taught Kids About Mutually Assured Nuclear Destruction

Focus Features

After months of hype, the ABC television network premiered a made-for-TV film titled The Day After on November 20, 1983. Presented with minimal commercial interruption, the two-hour feature illustrated a world in which both the United States and the Soviet Union made the cataclysmic decision to launch nuclear missiles. The blasts wiped a small town off the face of the Earth; the few who survived writhed in pain, their skin hanging off in clumps.

The imagery was graphic and unsettling, and it was supposed to be. Director Nicholas Meyer wanted to portray the fallout in sober detail. The Day After drew a sizable viewership and was hailed as a responsible use of television in order to educate audiences about the reality of the tension between the world’s superpowers.

In the weeks before the film premiered, though, another prominent broadcast was exploring the same themes. It was intended for young audiences and explored—via the use of puppets—the consequences of international aggression. For five episodes across one week, the threat of nuclear annihilation was looming in Mister Rogers’ Neighborhood.

A nuclear explosion creates a mushroom cloud
iStock.com/RomoloTavani

Since its inception on Pittsburgh's WQED in 1968, Mister Rogers’ Neighborhood had informed its young audience about topical issues in subversive and disarming ways. When civil rights were discussed, host Fred Rogers didn’t deliver a lecture about tolerance. Instead, he invited a black friend, Officer Clemmons, to cool off in his inflatable pool, a subtle nod to desegregation. In 1981, Rogers, the subject of this year's critically acclaimed documentary Won't You Be My Neighbor?, explored the topic of divorce with the puppet Patty Barcadi, whose parents had separated; Rogers comforted Prince Tuesday, who fretted that his own parents might split. Famously, Rogers also explored the subject of individuals with disabilities with the introduction of Jeff Erlanger, who became a quadriplegic at a young age after undergoing spinal surgery to remove a tumor. (Decades later, the two were reunited when Erlanger made a surprise appearance as Rogers was being inducted into the Television Academy Hall of Fame.)

Despite Rogers's history of tackling tough topics, there was perhaps no hotter-button issue for a children’s show to take on than nuclear war. Rogers wanted to address what he felt was a growing concern among schoolchildren, who processed Cold War headlines and interpreted tensions between the Soviet Union and the U.S. as potentially disastrous. (In one survey of classrooms across several major cities, students labeled the possibility of nuclear war “likely.”)

Rogers conceived and taped a five-episode storyline on the subject in the summer of 1983, which wound up being prescient. In October 1983, President Ronald Reagan ordered the invasion of Grenada to topple a Marxist regime.

“Little did I know we would be involved in a worldwide conflict now,” Rogers told the Associated Press. “But that’s all the better because our shows give families an opportunity for communication. If children should hear the news of war, at least they have a handle here, to assist in family communications.”

In the five-part series titled “Conflict,” Rogers again turned to the puppets that populated his Neighborhood of Make-Believe. Provincial ruler King Friday (voiced by Rogers) is handed a “computer read-out” that tips him off to some counterintelligence: Cornflake S. Pecially, ruler of the neighboring land of Southwood, is allegedly making bombs. In a panic, King Friday orders his underlings to do the same, mobilizing efforts to make certain they can match Southwood’s fiery super weapons—even if it means not having the financial resources to care for his people in other ways.

Lady Elaine Fairchilde and Lady Aberlin aren’t quite convinced. Rather than succumb to paranoia, they decide to travel to Southwood to see for themselves. They find its citizens building a bridge, not a bomb. A misunderstanding had almost led to unnecessary violence.

Of course, no mushroom clouds envelop the Neighborhood of Make-Believe, and none of the puppets suffer the devastating effects of radiation poisoning. Rogers wasn’t even claiming the story was about war so much as about preventing it.

“This show gives us a chance to talk about war, and about how it’s essential that people learn to deal with their feelings and to talk about things and resolve conflicts,” he said.

A publicity photo of Fred Rogers for 'Mr Rogers' Neighborhood'
Getty Images

The episodes sparked conversation in classrooms, where some teachers used the footage to broach the subject. At an elementary school in Venetia, Pennsylvania, students in a third-grade social studies class discussed the consequences of war. “No water” was one response. “Injuries” was another.

Unlike The Day After, which one psychiatrist declared inappropriate for children under 12, Rogers proved it was possible to provoke conversation without rattling any nerves.

Following their initial run in 1983, the five-part “Conflict” episodes were never rebroadcast. The close of the 1980s saw a reduction in concerns over nuclear attacks, and it’s possible the producers of Mister Rogers’ Neighborhood regarded the shows as dated.

They resurfaced briefly on YouTube in 2017 before vanishing. The series was subsequently uploaded to a Dailymotion video account in 2018. Like The Day After, the shows are an interesting time capsule of an era when the fear of devastating conflict was palpable. For a number of kids who experienced that concern, Mr. Rogers helped frame it in a way they could understand.

“I don’t want this to be a frightening thing,” Rogers said. “I want children to know that war is something we can talk about. Whatever is mentionable is manageable.”
