When Y2K Sent Us Into a Digital Depression

iStock.com/Laspi

It's hard to pinpoint the exact moment when the paranoia first began to creep in. Sometime during the late 1990s, consumers noticed that their credit cards with expiration dates in the year 2000 were being declined by merchants. Shortly thereafter, people began stocking up on shelf-stable food and water, potentially condemning themselves to months of all-SPAM diets. A number of concerned citizens outside of Toronto, Canada, flocked to the Ark Two Survival Community, a nuclear fallout shelter comprising dozens of decommissioned school buses buried several feet underground and protected by a layer of reinforced concrete.

In the months leading into New Year's Day 2000, millions of people steeled themselves for a worst-case scenario of computers succumbing to a programming glitch that would render them useless. Banking institutions might collapse; power grids could shut down. Anarchy would take over. The media had the perfect shorthand for the potential catastrophe: Y2K, for Year 2000. The term was used exhaustively in their coverage of a situation some believed had the potential to become one of the worst man-made disasters in history—if not the collapse of modern civilization as we knew it.

In the end, it was neither. But that doesn't mean it didn't have some far-reaching consequences.

John Koskinen of the President's Council on Y2K Conversion makes a public address
Michael Smith, Getty Images

The anticipatory anxiety of Y2K was rooted in the programs that had been written for the mammoth computers of the late 1960s. In an effort to conserve memory and speed up software, programmers truncated the date system to use two digits for the year instead of four. When the calendar was set to roll over to the year 2000, the fear was that "00" would be a proverbial wrench in the system, with computers unable to distinguish 2000 from 1900. Their calculations would be thrown off: subtracting "98" for 1998 from a later two-digit year produced a sensible positive value, but any arithmetic involving "00" could suddenly turn negative. How computers would actually react was based mostly on theories.
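
To make the failure mode concrete, here is a minimal sketch in Python (standing in for the COBOL-era languages actually involved) of how two-digit year arithmetic breaks at the rollover; the function and values are illustrative, not drawn from any real Y2K-era program:

    # Illustrative only: a program that stores years as two digits,
    # as many memory-starved legacy systems did.
    def account_age_years(opened_yy, current_yy):
        # Naive two-digit subtraction, the way a pre-Y2K system might do it.
        return current_yy - opened_yy

    # An account opened in "98" and checked in "99" computes correctly:
    print(account_age_years(98, 99))  # 1

    # Checked in "00" (the year 2000), the same arithmetic goes negative,
    # because nothing distinguishes 2000 from 1900:
    print(account_age_years(98, 0))   # -98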

That ambiguity was quickly seized upon by two factions: third-party software consultants and doomsday preppers. For the former, rewriting code became a cottage industry, with corporations large and small racing to revise antiquated systems and spending significant amounts of money and manpower in doing so. General Motors estimated the cost of upgrading their systems would be about $626 million. The federal government, which began preparing for possible doom in 1995, ended up with an $8.4 billion bill.

Some of that cost was eaten up by soliciting analyses of the potential problems. The U.S. Department of Energy commissioned a study looking at the potential for problems with the nation's energy supply if computers went haywire. The North American Electric Reliability Council thought the risks were manageable, but cautioned that a single outage could have a domino effect on connected power grids.

As a result, many newspaper stories were a mixture of practical thinking with a disclaimer: More than likely nothing will happen … but if something does happen, we're all screwed.

"Figuring out how seriously to take the Y2K problem is a problem in itself," wrote Leslie Nicholson in the January 17, 1999 edition of the Philadelphia Inquirer. "There is simply no precedent."

The prospect of economic and societal collapse fueled the second pop-up industry: survivalist suppliers. As people stocked up on canned goods, bottled water, flashlights, and generators, miniature societies like Ark Two began to spring up.

While the panic surrounding Y2K was dismissed by some as unwarranted, there was always fuel to add to the fire. The United States and Russia convened to monitor ballistic missile activity in the event a glitch inadvertently launched a devastating weapon. People were warned checks might bounce and banking institutions could freeze. The Federal Reserve printed $70 billion in cash in case people began hoarding currency. Even the Red Cross chimed in, advising Americans to stock up on supplies. Y2K was being treated like a moderate-category storm.

Adding to the concern was the fact that credible sources were sounding alarms. Edward E. Yardeni, then-chief economist at Deutsche Morgan Grenfell/C.J. Lawrence, predicted that there was a 60 percent chance of a major worldwide recession.

As December 31, 1999 approached, it became clear that Y2K had evolved beyond a software hiccup. Outside of war and natural disasters, it represented one of the few times society seemed poised for a dystopian future. People watched their televisions as clocks crept toward midnight, waiting to see whether their lights would flicker out or their landline phones would go dead.

A software program is represented by a series of ones and zeroes
iStock.com/alengo

Of course, nothing happened. So many resources had been devoted to the problem that the majority of software-reliant businesses and infrastructure were prepared. There were no power outages, no looting, and no hazards. The most notable event of January 1, 2000 was the news that Boris Yeltsin had resigned the day before, leaving Vladimir Putin as Russia's acting president.

With the benefit of hindsight, pundits would later observe that much of the Y2K concern was an expression of a more deeply rooted fear of technology. Subconsciously, we may have been primed to recoil at the thought of computers dominating our society to the extent that their failure could have catastrophic consequences.

All told, it's estimated that approximately $100 billion was spent making upgrades to offset any potential issues. To put that into context: South Florida spent $15.5 billion rebuilding after the mass destruction caused by Hurricane Andrew in 1992.

Was it all worth it? Experts seem to think so, citing the expedited upgrades of old software and hardware in federal and corporate environments.

That may be some small comfort to Japan, which could be facing its own version of Y2K in April 2019. That's when Emperor Akihito is expected to abdicate the throne to his son, Naruhito, the first such transition since the dawn of the information age. (Akihito has been in power since January 1989, following the death of his father.) That's significant because the Japanese calendar counts years from the coronation of each new emperor and labels them with the name of that emperor's era. Akihito's is known as the Heisei era. Naruhito's is not yet named, which means that things could get tricky as the change in leadership—and the need for a calendar update—draws closer.
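
For a sense of why programmers worry, here is a minimal Python sketch of era-based date conversion, assuming a hard-coded era table of the sort such software might carry; the table and function are hypothetical illustrations, not any real system's code:

    # Hypothetical era table of the kind date-handling software might hard-code.
    ERAS = [
        ("Showa", 1926),   # Hirohito's era
        ("Heisei", 1989),  # Akihito's era; nothing defined after it yet
    ]

    def to_japanese_era(year):
        # Era year 1 is the year the era begins.
        for name, start in reversed(ERAS):
            if year >= start:
                return "%s %d" % (name, year - start + 1)
        raise ValueError("year predates the table")

    print(to_japanese_era(1999))  # "Heisei 11"
    # After the 2019 abdication, a new entry is needed; until the table
    # is updated, post-transition dates are mislabeled:
    print(to_japanese_era(2019))  # "Heisei 31"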

It's hard to predict what the extent of the country's problems will be as Akihito steps down. If history is any guide, though, it's likely to mean a lot of software upgrades, and possibly some SPAM.

Spit Take: The Story of Big League Chew

Amazon

Rob Nelson watched the kid's ritual with curiosity. It was the mid-1970s, and he and the kid were in Civic Stadium in Portland, Oregon, both working in the service of the Portland Mavericks, a rogue baseball team operating outside the purview of Major League Baseball. Nelson was a fledgling player who sometimes got on the field but mostly stuck to selling tickets and coaching youth baseball camps. The kid, Todd Field, was the batboy. And what Field was doing fascinated Nelson.

Field, who couldn't have been older than 11 or 12, took a Redman chewing tobacco pouch from his pocket, scooped out a bunch of gunk, and stuffed it between his cheek and gumline. Then he let the black goo dribble down his chin or hocked it in the dirt.

Chewing tobacco was a common sight among the athletes, but Nelson hadn’t seen many kids take up the habit so early. He approached Field and asked if he was dipping, the common parlance for stuffing tobacco in one’s cheek pockets.

Field hocked another glob of brown discharge at the ground. He showed Nelson the tobacco pouch, which was full of black licorice. Field had minced it up so that he could replicate the muddy color of the real thing.

The exchange planted a seed in Nelson's brain. As a kid, he had done something vaguely similar, stuffing his mouth with bubblegum to resemble his idol, Chicago White Sox second baseman Nellie Fox. What if, he wondered, kids could emulate their heroes without the health consequences or parental scorn that accompanied real tobacco?

The package for Big League Chew shredded bubble gum is pictured
Amazon

Not long after, Nelson found himself in the team’s dugout with Jim Bouton, a onetime New York Yankee who had been ostracized for writing a tell-all memoir, Ball Four. Nelson shared his idea for a novelty faux-tobacco product with Bouton, but with something of a twist: Instead of licorice, he would use shredded bubblegum. He might, he said, call it Maverick Chew, or All-Star Chew.

Bouton was intrigued. As the two watched the Mavericks players jog around the field and dip real tobacco (neither man had ever taken up the habit), they agreed it was an idea worth pursuing. Nelson would develop the product and Bouton would try to get it distributed. Bouton would also be the sole investor, sinking $10,000 into Nelson's idea.

The Mavericks disbanded in 1977, but the partnership between Nelson and Bouton endured. Nelson, who worked for a pitching machine company, visited Bouton after the pitcher signed with the Atlanta Braves in 1978, and the two conspired further on Nelson's shredded gum idea. Nelson purchased an at-home gum-making kit he'd seen advertised in the pages of People magazine and got to work producing a batch of the stuff in Field's parents' kitchen. Hoping to mimic the tar-like color of Field's concoction, Nelson used brown food coloring, maple extract, and root beer extract in the gum. The result was predictably terrible.

Despite the lack of a viable prototype, Bouton did his part by pitching the idea to several baseball-affiliated companies. (The former Yankee put his own likeness on the mock-up pouch.) Topps and Fleer, which produced bubblegum cards, politely rejected him. He eventually ended up at Amurol, a subsidiary of the Wrigley Company, one of the largest chewing gum conglomerates in the world. By coincidence, Amurol engineer Ron Ream had been working on a shredded-gum project for several years. Rather than brush Bouton off, the company embraced the idea of a kid-friendly play on chewing tobacco sold in a pouch. They even liked the name Nelson had settled on: Big League Chew.

Ream had developed a formula that solved the problem of the tiny ribbons of gum sticking together, using enough glycerin to keep them from fusing into a useless clump in the package. Amurol, however, didn't take to Nelson's other big idea, which was to make the gum brown. While the chewing tobacco homage was obvious, the company didn't want to replicate the experience completely. The gum would remain pink.

In 1980, Amurol conducted a sample rollout at a 7-Eleven store in Naperville, Illinois. When executives came back from lunch, the 2.1-ounce pouches had sold out.

That first year, Big League Chew rang up $18 million in sales, capturing 8 percent of the bubblegum market. Amurol's other products altogether hadn't totaled more than $8 million. (Nelson and Bouton received a percentage of sales.)

Nelson’s hunch had been correct: Kids loved the facsimile chew, which sold for between 59 and 79 cents a pack. Candy distributors in Orlando reported selling 25,000 pouches a week. Copycat products like Chaw came and went. Little Leaguers and amateur ballplayers could take out as much gum as they wanted and stuff the rest in their pockets. But the association with tobacco, which wasn’t meant to be taken literally, upset some parents. They feared Big League Chew could become a "gateway" gum—bubblegum one day, tobacco and oral cancer the next.

Nelson and Amurol took the criticism in stride. Nelson was often quoted as saying he personally detested chewing tobacco and considered this a solution to, not the cause of, a tobacco habit. A California bill that would have banned the gum, candy cigarettes, and other products meant to resemble tobacco died in the state’s Senate Judiciary Committee in 1992. Kids continued to dribble grape, strawberry, and other fruit-flavored gum on their shirts. Amurol experimented with gum branded with Popeye’s likeness, colored green and meant to resemble spinach. It did not enjoy the same success.

Nelson bought out Bouton's interest in Big League Chew in 2000 and has remained with the brand ever since, including a move from Wrigley—which was sold to Mars Inc. in 2008 for $23 billion—to Ford Gum in 2010. Sales have hovered around $10 to $13 million annually, and there have been no confirmed reports of children graduating to a chewing tobacco habit as a result.

In February 2019, the package depicted its first female player. In the past, it has featured a variety of artwork and the likenesses of several retired players. In 2013, two active players—Matt Kemp of the Los Angeles Dodgers and Cole Hamels of the Philadelphia Phillies (now with the Chicago Cubs)—were pictured. But despite its name, Big League Chew has never had any formal affiliation with Major League Baseball, which has instead maintained relationships with Bazooka and Dubble Bubble.

The lack of any official MLB endorsement hasn’t hurt. At last count, more than 800 million pouches of Big League Chew have been sold.

Traumatic Episodes: A History of the ABC Afterschool Special

BCI / Sunset Home Visual Entertainment via Amazon

My Dad Lives in a Downtown Hotel. The Toothpaste Millionaire. Me and Dad’s New Wife. She Drinks a Little. Please Don’t Hit Me, Mom. High School Narc. Don’t Touch. From 1972 to 1996, no topic was too taboo for the ABC Afterschool Special, an anthology series that aired every other Wednesday at 4 p.m. Each of the standalone, hour-long installments highlighted issues facing teens and young adults, from underage drinking to the stress of living in a foster home. For the millions of viewers tuning in, it might have been their first exposure to a difficult topic—or the first indication that they weren’t alone in their struggle.

The Afterschool Special originated in the early 1970s, when programming executives at ABC had an epiphany: While there was plenty of content for families and adults in primetime, soap operas for adults in the daytime, and cartoons for children on Saturday mornings, there was relatively little programming directed specifically at teenagers and pre-teens. The network saw an opportunity to fill that gap by airing topical specials midweek, when parents watching General Hospital might leave the television on and stick around to watch with their adolescent children.

Initially, the network solicited a mix of fanciful stories and serious, issue-based melodramas. In the animated Incredible, Indelible, Magical Physical Mystery Trip, two kids are shrunk down to the size of a cell to travel through their uncle's body. In Follow the Northern Star, a boy ushers a friend through the Underground Railroad to escape slavery.

Not long after the series debuted in the fall of 1972, ABC executives—including Brandon Stoddard, who was initially in charge of the show and was later responsible for getting the landmark 1977 miniseries Roots and David Lynch's quirky Twin Peaks on the air—realized that the more fanciful stories may have been working against them.

According to Martin Tahse, a producer on dozens of these specials, it was rare for older teens to watch programming intended for younger children. Pre-teens, on the other hand, would watch content meant for an older audience. By season three, the specials were largely made up of topical content. In The Skating Rink, a teen skater overcomes shyness born of a stutter. In The Bridge of Adam Rush, a teen copes with a cross-country move after his mother remarries.

The ABC Afterschool Special was an immediate hit, drawing an average of 9.4 million viewers between 1972 and 1974. Many episodes were based on young adult novels, like Rookie of the Year, which stars Jodie Foster as a girl struggling to find acceptance on a boys’ Little League team, or Sara’s Summer of the Swans, about a young woman searching for her missing, mentally challenged brother.

The series also sourced material from magazine articles, short stories, and other venues. The Wave, broadcast as a special in 1983 after originally airing on ABC in primetime in 1981, told the story of a high school teacher who demonstrates fascism and Hitler's rise to power by convincing his students to submit to dictatorial rule; it was based on the real experiences of Palo Alto teacher Ron Jones.

The effect of the topical episodes could be potent. For a 1985 special titled One Too Many, which starred Val Kilmer as an underage drinker and Michelle Pfeiffer as his girlfriend, one viewer wrote in to the Los Angeles Times to explain how the show had impacted her:

After watching the ABC Afterschool Special titled One Too Many, a story of drinking and driving, I realized I have taken too many chances with my life. I always think I can handle myself and my car after I’ve had something to drink. Nothing has happened to me … yet. I’d like to thank ABC for showing a program that could possibly save the lives of my friends and me. I’ve realized that drinking and driving is not worth the price of life.

As Tahse explained to interviewer Kier-La Janisse, the specials resonated with kids because they rarely indulged in what could be considered a fairy tale ending. “It had to be real,” he said. “If kids watched any of my three specials dealing with alcoholic parents, they were never given a fairy tale ending. I saw to that, because I came from an alcoholic father and knew all the tricks and I wanted the kids who watched—many dealing with the same problem or having friends who had alcoholic parents—to know how it really is.”

The shows also picked up their share of awards. One installment, the self-explanatory Andrea’s Story: A Hitchhiking Tragedy, won five Daytime Emmys in 1984, a third of all the Daytime Emmys ABC won that year. A Special Gift, a 1979 show about a basketball player who takes up ballet, won a Peabody Award.

By the mid-1980s, the specials attempted to strike more of a balance between morality plays and lighthearted fare. The 1984-1985 season consisted of seven episodes, including three comedies and one musical. In The Almost Royal Family, Sarah Jessica Parker stars as a teen whose family buys a home outside the jurisdiction of Canada and the U.S. In Mom’s on Strike, an overworked mother decides to suspend her duties until her family can appreciate her contributions.

Gradually, the specials began leaning back toward hot-button topics. Oprah Winfrey's Harpo Productions took over producing the series in 1991. That season, Winfrey introduced the episodes, which included two panel discussions about relationships and race relations. Though the series reverted to fictional narratives, it gradually lost its footing as primetime shows took on the same adolescent concerns: a "Very Special Episode" of Beverly Hills, 90210 or Family Matters was essentially a stealth afterschool special. The series was canceled in 1996.

That the show endured for nearly a quarter of a century is a testament to the craftsmanship of producers like Tahse and the support of ABC, which rarely shied away from difficult topics. Still, Tahse—who died in 2014—believed that the series' broad appeal went beyond that.

“The only rule of storytelling that ABC required we follow was … the kid always had to figure out what to do and do it,” he said. “No finger-waving by parents, no lectures by parents. It was a kid who was in a situation and found, through his or her own efforts, a solution.”
