When Y2K Sent Us Into a Digital Depression

iStock.com/Laspi

It's hard to pinpoint the exact moment when the paranoia first began to creep in. Sometime during the late 1990s, consumers noticed that their credit cards with expiration dates in the year 2000 were being declined by merchants. Shortly thereafter, people began stocking up on shelf-stable food and water, potentially condemning themselves to months of all-SPAM diets. A number of concerned citizens outside of Toronto, Canada, flocked to the Ark Two Survival Community, a nuclear fallout shelter-turned-bunker made up of dozens of decommissioned school buses buried several feet underground and protected by a layer of reinforced concrete.

In the months leading into New Year's Day 2000, millions of people steeled themselves for a worst-case scenario of computers succumbing to a programming glitch that would render them useless. Banking institutions might collapse; power grids could shut down. Anarchy would take over. The media had the perfect shorthand for the potential catastrophe: Y2K, for Year 2000. The term was used exhaustively in their coverage of a situation some believed had the potential to become one of the worst man-made disasters in history—if not the collapse of modern civilization as we knew it.

In the end, it was neither. But that doesn't mean it didn't have some far-reaching consequences.

John Koskinen of the President's Council on Y2K Conversion makes a public address
Michael Smith, Getty Images

The anticipatory anxiety of Y2K was rooted in the programs that had been written for the ginormous computers of the late 1960s. In an effort to conserve memory and speed up software, programmers truncated the date system to use two digits for the year instead of four. When the calendar rolled over to the year 2000, the fear was that "00" would be a proverbial wrench in the system, with computers unable to distinguish 2000 from 1900 and their calculations thrown off as a result. Subtracting a two-digit year like "98" from a later date yields a sensible positive number; subtracting it from "00" turns the result negative. How computers would actually react was based mostly on theories.
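The arithmetic failure is easy to sketch. The toy Python function below (an illustration, not any real system's code) stores years as two digits, the way those legacy programs did:

```python
def age_in_years(birth_yy: int, current_yy: int) -> int:
    """Subtract two-digit years, the way much legacy code did."""
    return current_yy - birth_yy

# In 1999, someone born in 1960 comes out right:
print(age_in_years(60, 99))  # 39

# After the rollover, "00" sends the same math negative:
print(age_in_years(60, 0))   # -60 instead of 40
```

Any downstream code expecting a positive age, interest period, or expiration window would then misbehave in ways that were hard to predict in advance.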

That ambiguity was quickly seized upon by two factions: third-party software consultants and doomsday preppers. For the former, rewriting code became a cottage industry, with corporations large and small racing to revise antiquated systems and spending significant amounts of money and manpower in doing so. General Motors estimated the cost of upgrading their systems would be about $626 million. The federal government, which began preparing for possible doom in 1995, ended up with an $8.4 billion bill.

Some of that cost was eaten up by soliciting analyses of the potential problems. The U.S. Department of Energy commissioned a study looking at the potential for problems with the nation's energy supply if computers went haywire. The North American Electric Reliability Council thought the risks were manageable, but cautioned that a single outage could have a domino effect on connected power grids.

As a result, many newspaper stories were a mixture of practical thinking with a disclaimer: More than likely nothing will happen … but if something does happen, we're all screwed.

"Figuring out how seriously to take the Y2K problem is a problem in itself," wrote Leslie Nicholson in the January 17, 1999 edition of the Philadelphia Inquirer. "There is simply no precedent."

Pending economic and societal collapse fueled the second pop-up industry: survivalist suppliers. As people stocked up on canned goods, bottled water, flashlights, and generators, miniature societies like Ark Two began to spring up.

While the panic surrounding Y2K was dismissed by some as unwarranted, there was always fuel to add to the fire. The United States and Russia convened to monitor ballistic missile activity in the event a glitch inadvertently launched a devastating weapon. People were warned checks might bounce and banking institutions could freeze. The Federal Reserve printed $70 billion in cash in case people began hoarding currency. Even the Red Cross chimed in, advising Americans to stock up on supplies. Y2K was being treated like a moderate-category storm.

Adding to the concern was the fact that credible sources were sounding alarms. Edward E. Yardeni, then-chief economist at Deutsche Morgan Grenfell/C.J. Lawrence, predicted that there was a 60 percent chance of a major worldwide recession.

As New Year's Eve 1999 approached, it became clear that Y2K had evolved beyond a software hiccup. Outside of war and natural disasters, it represented one of the few times society seemed poised for a dystopian future. People watched their televisions as clocks hovered close to midnight, waiting to see if their lights would flicker or their landline phones would continue to ring.

A software program is represented by a series of ones and zeroes
iStock.com/alengo

Of course, nothing happened. So many resources had been devoted to the problem that the majority of software-reliant businesses and infrastructures were prepared. There were no power outages, no looting, and no hazards. The most notable news story of January 1, 2000 came from Russia: Boris Yeltsin had resigned the day before, leaving Vladimir Putin as the country's acting president.

With the benefit of hindsight, pundits would later observe that much of the Y2K concern was an expression of a more deeply rooted fear of technology. Subconsciously, we may have been primed to recoil at the thought of computers dominating our society to the extent that their failure could have catastrophic consequences.

All told, it's estimated that approximately $100 billion was spent making upgrades to offset any potential issues. To put that into context: South Florida spent $15.5 billion rebuilding after the mass destruction caused by Hurricane Andrew in 1992.

Was it all worth it? Experts seem to think so, citing the expedited upgrades of old software and hardware in federal and corporate environments.

That may be some small comfort to Japan, which could be facing its own version of Y2K in April 2019. That's when Emperor Akihito is expected to abdicate the throne to his son, Naruhito, the first such transition since the dawn of the information age. (Akihito has been in power since January 1989, following the death of his father.) That's significant because the Japanese calendar counts up from the accession of a new emperor and uses the name of each emperor's era. Akihito's is known as the Heisei era. Naruhito's is not yet named, which means that things could get tricky as the change in leadership—and the need for a calendar update—comes closer.
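The brittleness is easy to illustrate. In this Python sketch (the era names and start years are real; the lookup-table design is an assumption for illustration, not how any particular Japanese system works), software with no entry for the post-2019 era simply has no way to label dates after the transition:

```python
# Known eras and their Gregorian start years; era year 1 is the start year.
ERAS = [
    ("Showa", 1926),
    ("Heisei", 1989),
    # No entry yet for the era beginning in 2019 -- code that
    # hard-codes this table breaks the moment the era changes.
]

def to_era_year(year: int) -> str:
    """Convert a Gregorian year to an 'EraName N' string."""
    for name, start in reversed(ERAS):
        if year >= start:
            return f"{name} {year - start + 1}"
    raise ValueError("year predates the known eras")

print(to_era_year(2018))  # Heisei 30
```

Updating every such table (and every printed form, stamp, and screen that shows an era name) is the calendar scramble the country is bracing for.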

It's hard to predict what the extent of the country's problems will be as Akihito steps down. If history is any guide, though, it's likely to mean a lot of software upgrades, and possibly some SPAM.

The One Where Jennifer Aniston's 'Rachel' Haircut on Friends Became a Phenomenon

NBC Television/Getty Images

The legacy of NBC's Friends isn't one of ratings records or piles of awards—it's about the way the show managed to impact popular culture by showing life at its most mundane. This is a series that turned sipping coffee into an art form, still prompts philosophical debates over the morality of being "on a break," and made it impossible not to shout pivot! when moving furniture. But Friends reached its cultural zenith when it managed to transform a simple hairstyle into a global talking point, as untold millions of women in the ’90s flocked to salons all wanting one thing: “The Rachel.”

“The Rachel” hairstyle, which was the creation of stylist Chris McMillan, was first worn by Jennifer Aniston’s Friends character Rachel Green in the April 1995 episode “The One With the Evil Orthodontist." It has its roots as a shag cut, layered and highlighted to TV perfection. It may have been a bit too Hollywood-looking for a twenty-something working for tips, but it fit in the world of Friends, where spacious Manhattan apartments could easily be afforded by waitresses and struggling actors.

The Birth of "The Rachel"


Aniston in 1996, during the height of the style.
NBC Universal/Getty Images

The style itself wasn’t designed to grab headlines; McMillan simply gave Aniston this new look to be “a bit different,” as he later told The Telegraph. In hindsight, the ingredients for a style trend were all there: The cut was seen on the show’s breakout star as the series hit its ratings peak; an average of more than 25 million viewers tuned in each week during Friends's first three seasons. You can’t have that many eyeballs on you without fans wanting to get closer to you, and the easiest way to do that is to copy your style.

During the show’s second and third seasons in the mid-1990s, stories began to appear in newspapers and magazines about salons from Los Angeles to New York City and (literally) everywhere in-between being inundated with requests for Aniston's haircut. Some women would come in with their copy of TV Guide in hand for reference; others would record an episode of the show and play it at the salon to ensure accuracy. For these stylists, a good hair day for Rachel on a Thursday night meant big business over the weekend.

"That show has made us a bunch of money," Lisa Pressley, an Alabama hairstylist, said back in 1996. Pressley was giving around four "Rachels" per week to women ages 13 to 30, and she was touching up even more than that. Another hairdresser estimated that, during that time, 40 percent of her business from female clients came from the "Rachel." During the early days of the trend, McMillan even had people flying to his Los Angeles salon to get the hairdo from the man himself—a service that he charged a modest $60 for at the time.

A Finicky 'Do

What many clients learned, though, was that unless you had a trained stylist at your side, “The Rachel” required some real maintenance.

"People don't realize the style is set by her hairdresser," stylist Trevor Tobin told The Kansas City Star in 1995. “She doesn't just wake up, blow it dry, and it just turns out like that."

That was a warning Aniston knew all too well. In recent years, she has expressed her frustration at not being able to do the style on her own; to get it just right, she needed McMillan on hand to go through painstaking styling before shoots. And beyond being nearly impossible to maintain, the cut wasn't even one she liked: in a 2011 Allure interview, Aniston called it the “ugliest haircut I've ever seen." In 2015, the actress told Glamour that she found the look itself “cringey."

Though Aniston had grown to loathe the look, it soon became the 1990s' go-to style for other stars like Meg Ryan and Tyra Banks, and it was later adopted by actresses and musicians like Kelly Clarkson and Jessica Alba. Debra Messing had an ill-fated run-in with it when she was told to mimic the style for her role on Will & Grace. Messing and her stylists soon realized that trying it without McMillan was a fool's errand.

“[It] was a whole debacle when we tried to do it on the show,” Messing recalled. “They literally tried for three hours to straighten my hair like [Aniston's]. It was so full and poofy that it looked like a mushroom.”

A Style That Sticks Around

A picture of Jennifer Aniston from 1999.
Aniston sporting her post-"Rachel" hair during the show's sixth season.
NBC Universal/Getty Images

Aniston’s personal preference for longer hair soon made its way on-screen, replacing the shorter, choppier “Rachel” by season 4. The once-iconic look was officially ditched, the last remnants of which were washed away in a flowing sea of ever-growing locks doused in blonde, pin-straight highlights. And once a haircut’s namesake turns their back on the style, it’s likely only a matter of time before the rest of the world moves on, too, right?

Wrong. “The Rachel” endured.

Unlike Farrah Fawcett's showstopping feathered hair, which stayed largely in the ’70s, “The Rachel” was still being worn by celebrities, news anchors, and the average salon-goer well into the 2000s. Even now, fashion websites will run the occasional “Is ‘The Rachel’ Making a Comeback?” article, complete with the latest Hollywood star to sport the familiar shag.

It’s a testament to McMillan’s skill, Aniston’s charm, and Friends’s cultural sway over audiences that people are still discussing, and donning, the hairstyle some 25 years later. And in a lot of ways, the haircut's success mimicked the show's: it spawned plenty of imitators, but no one could outdo the original.

A Quick History of Hidden Camera TV Commercials

Consumer Time Capsule, YouTube

At restaurants like Tavern on the Green in New York and Arnaud’s in New Orleans, diners sitting down for formal meals are seen complimenting the waiter on their coffee. Just a few moments later, they’re informed it wasn’t the “gourmet” brew typically served, but a cup of Folgers Instant coffee that had been “secretly switched.” The surprised patrons then heap praise on their duplicitous waitstaff.

This scene and others like it played out hundreds of times in television commercials throughout the late 1970s and early 1980s. Variations date as far back as the 1950s, and some commercials—like Chevrolet's now-infamous 2017 spot that depicted amazed onlookers marveling at the car company's numerous J.D. Power and Associates Awards—still air with regularity. Instead of using actors, the spots purport to highlight the reaction of genuine consumers to products, often with the use of hidden cameras positioned outside the unsuspecting customers' field of vision.


Despite skepticism, the people in these ads are often members of the general public offering their unrehearsed response to beverages, laundry detergents, and automobiles. That doesn’t mean, however, that there’s not a little bit of premeditation going on.

The idea of recording spontaneous reactions for advertising purposes dates back to the 1950s, when Procter & Gamble arranged for housewives to compare the whiteness of laundry washed in their Cheer detergent against the comparatively dingier load that resulted after a soak in the competition. The camera wasn’t “hidden” and the spokesman made no secret of his intentions—he was holding a microphone—but the women were approached in a laundromat and not a casting office. Those who appeared in such spots would receive a $108 fee, along with residuals that could add up to thousands if the commercial aired repeatedly.

This approach was refined by Bob Schwartz, a former director of the prank series Candid Camera. In 1969, Schwartz formed Eyeview Films and worked with ad agencies to capture spontaneous reactions to products. An early spot for the floor cleaner Spic and Span was a hit, and other companies and agencies followed the template. For a 1982 spot, Schwartz set up his crew in a supermarket and invited customers to try Oven Fry, a new frozen chicken product from General Mills. Shoppers with the most expressive reactions (“mmm-mmm!”) were then asked to consent to appearing in the commercial.

In more controlled settings, advertisers needed to make sure the pool of potential testimonials was suited to the product. Before filming spots like the Folgers tasting, a team of market research employees typically recruited people by inviting them to take part in polls on the street. Participants were asked about their coffee preferences (the better to establish whether they even liked the beverage) and were then invited to a nearby restaurant for a free meal. Out of two dozen couples selected for a Folgers spot in San Francisco in 1980, only two or three made it into the commercial.


The Folgers spots aired for years and were memorable for how surprised people seemed to be that they had just consumed granulated crystals instead of fresh-brewed coffee. But that doesn't mean viewers necessarily believed the reactions. In a 1982 consumer survey, respondents often judged the endorsements either too stiff, which suggested the speakers had been coached, or too natural, which hinted that they might be actors. Though ad agencies went to great lengths to ensure authenticity, the glowing praise left audiences dubious.

Why would non-actors shower products with compliments? It takes a bit of psychology on the part of the ad agencies. For Chevrolet's 2017 spot that was ridiculed for people overreacting to the mere sight of a car, one of the participants—who asked to remain anonymous due to a non-disclosure agreement—told The A.V. Club that the upbeat environment and surreal exposure to a new car after agreeing to take part in a market research survey left his group feeling like it would be rude to say anything negative.

“We never retook a take, but you felt really bad about saying something negative about Chevy because there were 50 cameras on you, and it was just this one [host],” he said. “He did this magic trick of making it seem like you were hurting his feelings if you said anything bad about Chevy. You didn’t want to see this guy stop smiling. It was really bizarre.”

Candid? Sure. As candid as if they were among friends and not a squad of marketing executives? That's a different story.
