When Y2K Sent Us Into a Digital Depression

iStock.com/Laspi

It's hard to pinpoint the exact moment when the paranoia first began to creep in. Sometime during the late 1990s, consumers noticed that their credit cards with expiration dates in the year 2000 were being declined by merchants. Shortly thereafter, people began stocking up on shelf-stable food and water, potentially condemning themselves to months of all-SPAM diets. A number of concerned citizens outside of Toronto, Canada, flocked to the Ark Two Survival Community, a nuclear fallout shelter-turned-bunker made up of dozens of decommissioned school buses buried several feet underground and protected by a layer of reinforced concrete.

In the months leading into New Year's Day 2000, millions of people steeled themselves for a worst-case scenario of computers succumbing to a programming glitch that would render them useless. Banking institutions might collapse; power grids could shut down. Anarchy would take over. The media had the perfect shorthand for the potential catastrophe: Y2K, for Year 2000. The term was used exhaustively in their coverage of a situation some believed had the potential to become one of the worst man-made disasters in history—if not the collapse of modern civilization as we knew it.

In the end, it was neither. But that doesn't mean it didn't have some far-reaching consequences.

John Koskinen of the President's Council on Y2K Conversion makes a public address
Michael Smith, Getty Images

The anticipatory anxiety of Y2K was rooted in the programs that had been written for the ginormous computers of the late 1960s. In an effort to conserve memory and speed up software, programmers truncated the date system to use two digits for the year instead of four. When the calendar rolled over to the year 2000, the fear was that "00" would be a proverbial wrench in the system, with computers unable to distinguish 2000 from 1900 and their calculations thrown off as a result. Subtracting a two-digit birth year like "58" from "98" gives a sensible positive age; subtracting it from "00" gives a negative one. How computers would actually react was based mostly on theories.
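To make the arithmetic concrete, here's a minimal sketch in Python of how two-digit year math goes wrong (the systems actually at risk were typically written in languages like COBOL, and the pivot value in the "windowing" fix shown below is purely illustrative):

```python
def age(birth_yy: int, current_yy: int) -> int:
    """Compute an age from two-digit years, as many legacy systems did."""
    return current_yy - birth_yy

print(age(58, 99))  # 41 -- sensible in 1999
print(age(58, 0))   # -58 -- after the rollover, "00" reads as 1900

def widen(yy: int, pivot: int = 30) -> int:
    """'Windowing,' a common Y2K remediation: map a two-digit year to a
    four-digit one by assuming anything below the pivot belongs to 20xx."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(widen(99) - widen(58))  # 41
print(widen(0) - widen(58))   # 42 -- correct even across the rollover
```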

That ambiguity was quickly seized upon by two factions: third-party software consultants and doomsday preppers. For the former, rewriting code became a cottage industry, with corporations large and small racing to revise antiquated systems and pouring significant amounts of money and manpower into the effort. General Motors estimated the cost of upgrading their systems would be about $626 million. The federal government, which began preparing for possible doom in 1995, ended up with an $8.4 billion bill.

Some of that cost was eaten up by soliciting analyses of the potential problems. The U.S. Department of Energy commissioned a study looking at the potential for problems with the nation's energy supply if computers went haywire. The North American Electric Reliability Council thought the risks were manageable, but cautioned that a single outage could have a domino effect on connected power grids.

As a result, many newspaper stories were a mixture of practical thinking with a disclaimer: More than likely nothing will happen … but if something does happen, we're all screwed.

"Figuring out how seriously to take the Y2K problem is a problem in itself," wrote Leslie Nicholson in the January 17, 1999 edition of the Philadelphia Inquirer. "There is simply no precedent."

Pending economic and societal collapse fueled the second pop-up industry: survivalist suppliers. As people stocked up on canned goods, bottled water, flashlights, and generators, miniature societies like Ark Two began to spring up.

While the panic surrounding Y2K was dismissed by some as unwarranted, there was always fuel to add to the fire. The United States and Russia convened to monitor ballistic missile activity in the event a glitch inadvertently launched a devastating weapon. People were warned checks might bounce and banking institutions could freeze. The Federal Reserve printed $70 billion in cash in case people began hoarding currency. Even the Red Cross chimed in, advising Americans to stock up on supplies. Y2K was being treated like a moderate-category storm.

Adding to the concern was the fact that credible sources were sounding alarms. Edward E. Yardeni, then-chief economist at Deutsche Morgan Grenfell/C.J. Lawrence, predicted that there was a 60 percent chance of a major worldwide recession.

As New Year's Eve 1999 approached, it became clear that Y2K had evolved beyond a software hiccup. Outside of war and natural disasters, it represented one of the few times society seemed poised for a dystopian future. People watched their televisions as clocks hovered close to midnight, waiting to see if their lights would flicker or their landline phones would continue to ring.

A software program is represented by a series of ones and zeroes
iStock.com/alengo

Of course, nothing happened. So many resources had been extended toward the problem that the majority of software-reliant businesses and infrastructures were prepared. There were no power outages, no looting, and no hazards. The only notable event of January 1, 2000 was the news that Boris Yeltsin had resigned, with Vladimir Putin taking over as Russia's acting president.

With the benefit of hindsight, pundits would later observe that much of the Y2K concern was an expression of a more deeply rooted fear of technology. Subconsciously, we may have been primed to recoil at the thought of computers dominating our society to the extent that their failure could have catastrophic consequences.

All told, it's estimated that approximately $100 billion was spent making upgrades to offset any potential issues. To put that into context: South Florida spent $15.5 billion rebuilding after the mass destruction caused by Hurricane Andrew in 1992.

Was it all worth it? Experts seem to think so, citing the expedited upgrades of old software and hardware in federal and corporate environments.

That may be some small comfort to Japan, which could be facing its own version of Y2K in April 2019. That's when Emperor Akihito is expected to abdicate the throne to his son, Naruhito, in the first such transition since the dawn of the information age. (Akihito has been in power since January 1989, following the death of his father.) That's significant because the Japanese calendar counts years from the accession of each new emperor and names them for that emperor's era. Akihito's is known as the Heisei era. Naruhito's is not yet named, which means that things could get tricky as the change in leadership—and the need for a calendar update—comes closer.
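For the curious, here's a minimal sketch in Python of the kind of era-based date handling Japanese software has to get right (the era start dates are historical fact; how a program copes with the still-unnamed era is the illustrative assumption):

```python
from datetime import date

# Era start dates are real; the post-Heisei era had not yet been named
# when this piece was written, so it appears as a placeholder below.
ERAS = [
    (date(1926, 12, 25), "Showa"),
    (date(1989, 1, 8), "Heisei"),
    (date(2019, 5, 1), None),  # planned start of the yet-unnamed era
]

def to_era(d: date) -> str:
    """Convert a Gregorian date to a Japanese era year, e.g. 'Heisei 31'."""
    for start, name in reversed(ERAS):
        if d >= start:
            if name is None:
                raise ValueError("era not yet named; software update required")
            # Year 1 of an era is the calendar year in which it begins
            return f"{name} {d.year - start.year + 1}"
    raise ValueError("date precedes the eras listed here")

print(to_era(date(2019, 1, 17)))  # Heisei 31
# to_era(date(2019, 6, 1)) raises until the new era's name is patched in.
```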

It's hard to predict what the extent of the country's problems will be as Akihito steps down. If history is any guide, though, it's likely to mean a lot of software upgrades, and possibly some SPAM.

Traumatic Episodes: A History of the ABC Afterschool Special

BCI / Sunset Home Visual Entertainment via Amazon

My Dad Lives in a Downtown Hotel. The Toothpaste Millionaire. Me and Dad’s New Wife. She Drinks a Little. Please Don’t Hit Me, Mom. High School Narc. Don’t Touch. From 1972 to 1996, no topic was too taboo for the ABC Afterschool Special, an anthology series that aired every other Wednesday at 4 p.m. Each of the standalone, hour-long installments highlighted issues facing teens and young adults, from underage drinking to the stress of living in a foster home. For the millions of viewers tuning in, it might have been their first exposure to a difficult topic—or the first indication that they weren’t alone in their struggle.

The Afterschool Special originated in the early 1970s, when programming executives at ABC had an epiphany: While there was a lot of content for families and adults during primetime, soap operas for adults in the daytime, and cartoons for children on Saturday mornings, there was relatively little programming directed specifically at teenagers and pre-teens. The network saw an opportunity to fill that gap by airing topical specials midweek, when parents watching General Hospital might leave the television on and stick around to watch with their adolescent children.

Initially, the network solicited a mix of fanciful stories and serious, issue-based melodramas. In the animated Incredible, Indelible, Magical Physical Mystery Trip, two kids are shrunk down to the size of a cell to travel through their uncle’s body. In Follow the Northern Star, a boy ushers a friend through the Underground Railroad to escape slavery.


Not long after the series debuted in the fall of 1972, ABC executives—including Brandon Stoddard, who was initially in charge of the show and was later responsible for getting the landmark 1977 miniseries Roots and David Lynch's quirky Twin Peaks onto the air—realized that the more puerile stories may have been working against them.

According to Martin Tahse, a producer on dozens of these specials, it was rare for older teens to watch programming intended for younger children. Pre-teens, on the other hand, would watch content meant for an older audience. By season three, the specials were largely made up of topical content. In The Skating Rink, a teen skater overcomes shyness born of a stutter. In The Bridge of Adam Rush, a teen copes with a cross-country move after his mother remarries.

The ABC Afterschool Special was an immediate hit, drawing an average of 9.4 million viewers between 1972 and 1974. Many episodes were based on young adult novels, like Rookie of the Year, which stars Jodie Foster as a girl struggling to find acceptance on a boys’ Little League team, or Sara’s Summer of the Swans, about a young woman searching for her missing, mentally challenged brother.

The series also sourced material from magazine articles, short stories, and other venues. 1983’s The Wave, which originally aired on ABC in primetime in 1981, told the story of a high school teacher who demonstrates fascism and Hitler’s rise to power by successfully convincing his students to submit to dictatorial rule; it was based on the real experiences of Palo Alto teacher Ron Jones.

The effect of the topical episodes could be potent. For a 1985 special titled One Too Many, which starred Val Kilmer as an underage drinker and Michelle Pfeiffer as his girlfriend, one viewer wrote in to the Los Angeles Times to explain how the show had impacted her:

After watching the ABC Afterschool Special titled One Too Many, a story of drinking and driving, I realized I have taken too many chances with my life. I always think I can handle myself and my car after I’ve had something to drink. Nothing has happened to me … yet. I’d like to thank ABC for showing a program that could possibly save the lives of my friends and me. I’ve realized that drinking and driving is not worth the price of life.


As Tahse explained to interviewer Kier-La Janisse, the specials resonated with kids because they rarely indulged in what could be considered a fairy tale ending. “It had to be real,” he said. “If kids watched any of my three specials dealing with alcoholic parents, they were never given a fairy tale ending. I saw to that, because I came from an alcoholic father and knew all the tricks and I wanted the kids who watched—many dealing with the same problem or having friends who had alcoholic parents—to know how it really is.”

The shows also picked up their share of awards. One installment, the self-explanatory Andrea’s Story: A Hitchhiking Tragedy, won five Daytime Emmys in 1984, a third of all the Daytime Emmys ABC won that year. A Special Gift, a 1979 show about a basketball player who takes up ballet, won a Peabody Award.

By the mid-1980s, the specials attempted to strike more of a balance between morality plays and lighthearted fare. The 1984-1985 season consisted of seven episodes, including three comedies and one musical. In The Almost Royal Family, Sarah Jessica Parker stars as a teen whose family buys a home outside the jurisdiction of Canada and the U.S. In Mom’s on Strike, an overworked mother decides to suspend her duties until her family can appreciate her contributions.

Gradually, the specials began leaning back toward hot-button topics. Oprah Winfrey’s Harpo Productions took over producing the series in 1991. That season, Winfrey introduced the episodes, including two panel discussions about relationships and race relations. Though the series reverted to fictional narratives, it gradually lost its footing to primetime shows with a more adolescent bent: a “Very Special Episode” of Beverly Hills, 90210 or Family Matters was essentially a stealth afterschool special. The series was canceled in 1996.

That the show endured for nearly a quarter of a century is a testament to the craftsmanship of producers like Tahse and the support of ABC, which rarely shied away from difficult topics. Still, Tahse—who died in 2014—believed that the series' broad appeal went beyond that.

“The only rule of storytelling that ABC required we follow was … the kid always had to figure out what to do and do it,” he said. “No finger-waving by parents, no lectures by parents. It was a kid who was in a situation and found, through his or her own efforts, a solution.”

Batmania: When Batman Ruled the Summer of 1989

JD Hancock, Flickr // CC BY 2.0

“Flop” is how marketing research group Marketing Evaluation Inc. assessed the box office potential of the 1989 Warner Bros. film Batman. The big-budget production, directed by Tim Burton and co-starring Michael Keaton as Batman and Jack Nicholson as the Joker, was expected to be one of the rare times a major Hollywood studio took a comic book adaptation seriously. But according to the marketing data, the character of Batman was not as popular as the Incredible Hulk, who was then appearing in a slate of made-for-television movies. And he was only a quarter as appealing as the California Raisins, the claymation stars of advertising.

That prediction was made in 1988. The film was released on June 23, 1989, and went on to gross $253.4 million, making it the fifth most successful motion picture up to that point.

While Marketing Evaluation may have miscalculated the movie’s potential, they did hedge their bet. By the time profits from the movie’s merchandising—hats, shirts, posters, toys, bed sheets, etc.—were tallied, the company said, Warner Bros. could be looking at a sizable haul.

When the cash registers stopped ringing, the studio had sold $500 million in tie-in products, which was double the gross of the film itself.

In 1989, people didn’t merely want to see Batman—they wanted to wear the shirts, eat the cereal, and contemplate, if only for a moment, putting down $499.95 for a black denim jacket studded with rhinestones.

Batmania was in full swing. Which made it even more unusual when the studio later claimed the film had failed to turn a profit.


The merchandising blitz of Star Wars in 1977 gave studios hope that ambitious science-fiction and adventure movies would forever be intertwined with elaborate licensing strategies. George Lucas's space opera had driven audiences into a frenzy, leading retailers to stock up on everything from R2-D2 coffee mugs to plastic lightsabers. It was expected that other “toyetic” properties would follow suit.

They didn’t. Aside from 1982’s E.T., there was no direct correlation between a film’s success and demand for ancillary product. In 1984 alone, Gremlins, Ghostbusters, and Indiana Jones and the Temple of Doom were smash hits. None of them motivated people to flock to stores and buy Gizmo plush animals or toy proton packs. (Ghostbusters toys eventually caught on, but only after an animated series helped nudge kids in their direction.)

Warner Bros. saw Batman differently. When the script was being developed, producers Jon Peters and Peter Guber urged writers to make sure scenes aligned with planned merchandising. They scribbled notes insisting that no onscreen harm come to the Batmobile: It should remain pristine so that kids would want to grab the toy version. As Batman, millionaire Bruce Wayne had a collection of vehicles and gadgets at his disposal—all props that could be replicated in plastic. Batman's comic book origins gave him a unique iconography that lent itself to flashy graphic apparel.

In March 1989, just three months before the film's release, Warner Bros. announced that it was merging with Time Inc. to create the mega-conglomerate Time-Warner, which would allow the film studio to capitalize on a deep bench of talent to help drive the “event” feel of the film.

Prince was signed to Warner's record label and agreed to compose an album of concept music tied to the characters; “Batdance” was among the songs and became a #1 hit. Warner's licensing arm, Licensing Corporation of America, contracted with 300 licensees to create more than 100 products, some of which were featured in an expansive brochure that resembled a bat-eared Neiman Marcus catalog. The sheer glut of product became a story in itself, with Entertainment Tonight devoting a segment to the film's licensing push.

In addition to the rhinestone jacket, fans could opt for the Batman watch ($34.95), a baseball cap ($7.95), bicycle shorts ($26.95), a matching top ($24.95), a model Batwing ($29.95), action figures ($5.95), and a satin jacket modeled by Batman co-creator Bob Kane ($49.95).

The Batman logo became a way of communicating anticipation for the film. The virtually textless teaser poster, which had only the June 23 opening date printed on it, was snapped up and taped to walls. (Roughly 1200 of the posters sized for bus stops and subways were stolen, a crude but effective form of market research.) In barber shops, people began asking to have the logo sheared into the sides of their heads. The Batman symbol was omnipresent. If you had forgotten about the movie for even five minutes, someone would eventually walk by sporting a pair of Batman earrings to remind you.

At Golden Apple Comics in Los Angeles, 7000 packs of Batman trading cards flew out the door. Management hired additional staff and a security guard to handle the crowds. The store carried 36 different kinds of Batman T-shirts. Observers compared the hysteria to the hula hoop craze of the 1950s.

One retailer made a more contemporary comparison. “There’s no question Batman is the hottest thing this year,” Marie Strong, manager of It’s a Small World at a mall in La Crosse, Wisconsin, told the La Crosse Tribune. “[It’s] the hottest [thing] since Spuds McKenzie toward the end of last year.”


By the time Batman was in theaters and breaking records—it became the first film to gross $100 million in just 10 days, alerting studios to how much money could be made in a movie’s opening weeks—the merchandising had become an avalanche. Stores that didn’t normally carry licensed goods, like Macy’s, set up displays.

Not everyone opted for officially-licensed apparel: U.S. marshals conducted raids across the country, seizing more than 40,000 counterfeit Batman shirts and other bogus items.

Collectively, Warner raked in $500 million from legitimate products. In 1991, the Los Angeles Times reported that the studio claimed only $2.9 million in profit had been realized from merchandising and that the movie itself was in a $35.8 million financial hole owing to excessive promotional and production costs. It was a tale typical of creative studio accounting, long a method for avoiding payouts to net profit participants. (Nicholson, whose contract entitled him to a percentage of the grosses rather than net profits, earned $50 million.)

Whatever financial sleight-of-hand was implemented, Warner clearly counted on Batman to be a money-printing operation. Merchandising plans for the sequel, 1992’s Batman Returns, were even more strategic, including a tie-in agreement with McDonald’s for Happy Meals. In a meta moment, one deleted script passage even had Batman’s enemies attacking a toy store in Gotham full of Batman merchandise. The set was built but the scene never made it onscreen.

The studio was willing to give Burton more control over the film, which was decidedly darker and more sexualized than the original. Batman Returns was hardly a failure, but merchandising was no longer as hot as it was in the summer of 1989. Instead of selling out of shirts, stores ended up marking down excess inventory. McDonald’s, unhappy with the content of the film, enacted a policy of screening movies they planned to partner with before making any agreements. By the time Warner released 1995’s Batman Forever, the franchise was essentially a feature-length toy commercial.

It paid off. Licensing for the film topped $1 billion. Today, given the choice between a film with Oscar-level prestige or one with the potential to have its logo emblazoned on a rhinestone jacket that people would actually want to buy, studios would probably choose the latter. In that sense, the Batmania of 1989 endures.
