When Y2K Sent Us Into a Digital Depression

iStock.com/Laspi

It's hard to pinpoint the exact moment when the paranoia first began to creep in. Sometime during the late 1990s, consumers noticed that their credit cards with expiration dates in the year 2000 were being declined by merchants. Shortly thereafter, people began stocking up on shelf-stable food and water, potentially condemning themselves to months of all-SPAM diets. A number of concerned citizens outside of Toronto, Canada, flocked to the Ark Two Survival Community, a nuclear fallout shelter-turned-bunker made up of dozens of decommissioned school buses buried several feet underground and protected by a layer of reinforced concrete.

In the months leading into New Year's Day 2000, millions of people steeled themselves for a worst-case scenario of computers succumbing to a programming glitch that would render them useless. Banking institutions might collapse; power grids could shut down. Anarchy would take over. The media had the perfect shorthand for the potential catastrophe: Y2K, for Year 2000. The term was used exhaustively in their coverage of a situation some believed had the potential to become one of the worst man-made disasters in history—if not the collapse of modern civilization as we knew it.

In the end, it was neither. But that doesn't mean it didn't have some far-reaching consequences.

John Koskinen of the President's Council on Y2K Conversion makes a public address
Michael Smith, Getty Images

The anticipatory anxiety of Y2K was rooted in the programs that had been written for the room-sized mainframe computers of the late 1960s. In an effort to conserve memory and speed up software, programmers truncated the date system to use two digits for the year instead of four. When the calendar was set to roll over to the year 2000, the fear was that "00" would be a proverbial wrench in the system, leaving computers unable to distinguish 2000 from 1900 and throwing off their calculations. A program subtracting two-digit years to measure elapsed time would get a sensible answer from "99" minus "98"; "00" minus "98" would return a negative number. How computers would actually react was based mostly on theories.
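To make the failure mode concrete, here is a minimal sketch in Python of the kind of two-digit date arithmetic that worried programmers; the function name is invented for illustration and doesn't come from any particular legacy system.

```python
# A minimal sketch (not any real system's code) of how two-digit
# year arithmetic breaks at the century rollover.

def years_elapsed_two_digit(start_yy: int, end_yy: int) -> int:
    """Compute elapsed years the way many legacy programs did:
    by subtracting two-digit years directly."""
    return end_yy - start_yy

# An account opened in 1998 ("98") and checked in 1999 ("99"):
print(years_elapsed_two_digit(98, 99))  # 1 -- correct

# The same account checked in 2000 ("00"):
print(years_elapsed_two_digit(98, 0))   # -98 -- the negative result
                                        # programmers feared
```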

That ambiguity was quickly seized upon by two factions: third-party software consultants and doomsday preppers. For the former, rewriting code became a cottage industry, with corporations large and small racing to revise antiquated systems and spending significant amounts of money and manpower in doing so. General Motors estimated the cost of upgrading their systems would be about $626 million. The federal government, which began preparing for possible doom in 1995, ended up with an $8.4 billion bill.

Some of that cost was eaten up by soliciting analyses of the potential problems. The U.S. Department of Energy commissioned a study looking at the potential for problems with the nation's energy supply if computers went haywire. The North American Electric Reliability Council thought the risks were manageable, but cautioned that a single outage could have a domino effect on connected power grids.

As a result, many newspaper stories were a mixture of practical advice and a disclaimer: More than likely nothing will happen … but if something does happen, we're all screwed.

"Figuring out how seriously to take the Y2K problem is a problem in itself," wrote Leslie Nicholson in the January 17, 1999 edition of the Philadelphia Inquirer. "There is simply no precedent."

The prospect of economic and societal collapse fueled the second pop-up industry: survivalist suppliers. As people stocked up on canned goods, bottled water, flashlights, and generators, miniature societies like Ark Two began to spring up.

While the panic surrounding Y2K was dismissed by some as unwarranted, there was always fuel to add to the fire. The United States and Russia convened to monitor ballistic missile activity in the event a glitch inadvertently launched a devastating weapon. People were warned checks might bounce and banking institutions could freeze. The Federal Reserve printed $70 billion in cash in case people began hoarding currency. Even the Red Cross chimed in, advising Americans to stock up on supplies. Y2K was being treated like a moderate-category storm.

Adding to the concern was the fact that credible sources were sounding alarms. Edward E. Yardeni, then-chief economist at Deutsche Morgan Grenfell/C.J. Lawrence, predicted that there was a 60 percent chance of a major worldwide recession.

As New Year's Eve 1999 approached, it became clear that Y2K had evolved beyond a software hiccup. Outside of war and natural disasters, it represented one of the few times society seemed poised for a dystopian future. People watched their televisions as clocks crept toward midnight, waiting to see whether the lights would flicker or the landline phones would go dead.

A software program is represented by a series of ones and zeroes
iStock.com/alengo

Of course, nothing happened. So many resources had been expended on the problem that the majority of software-reliant businesses and infrastructures were prepared. There were no power outages, no looting, and no hazards. The only notable news of January 1, 2000, was the reporting of Boris Yeltsin's resignation and the arrival of Vladimir Putin as Russia's acting president.

With the benefit of hindsight, pundits would later observe that much of the Y2K concern was an expression of a more deeply rooted fear of technology. Subconsciously, we may have been primed to recoil at the thought of computers dominating our society to the extent that their failure could have catastrophic consequences.

All told, it's estimated that approximately $100 billion was spent making upgrades to offset any potential issues. To put that into context: South Florida spent $15.5 billion rebuilding after the mass destruction caused by Hurricane Andrew in 1992.

Was it all worth it? Experts seem to think so, citing the expedited upgrades of old software and hardware in federal and corporate environments.

That may be some small comfort to Japan, which could be facing its own version of Y2K in April 2019. That's when Emperor Akihito is expected to abdicate the throne to his son, Naruhito, the first such transition since the dawn of the information age. (Akihito has been in power since January 1989, following the death of his father.) The change matters because the Japanese calendar counts years from the start of each emperor's reign and labels them with the name of that emperor's era. Akihito's is known as the Heisei era. Naruhito's has not yet been named, which means things could get tricky as the change in leadership—and the need for a calendar update—draws closer.
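To make the calendar bookkeeping concrete, here is a simplified Python sketch of era-year conversion; the hard-coded era table and the to_era_year function are illustrative assumptions, since real software relies on official era tables and handles mid-year transitions.

```python
# A simplified sketch of Japanese era-year arithmetic, with era start
# years hard-coded for illustration (real systems use official tables
# and account for transitions that happen partway through a year).
ERA_STARTS = {
    "Showa": 1926,   # Hirohito's era
    "Heisei": 1989,  # Akihito's era
    # The era beginning in 2019 had not yet been named at the time,
    # which is exactly the software problem described above.
}

def to_era_year(gregorian_year: int, era: str) -> int:
    """Convert a Gregorian year to a year within the given era.
    Year 1 of an era is the year the emperor's reign begins."""
    return gregorian_year - ERA_STARTS[era] + 1

print(to_era_year(2018, "Heisei"))  # 30 -- 2018 was Heisei 30
```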

It's hard to predict what the extent of the country's problems will be as Akihito steps down. If history is any guide, though, it's likely to mean a lot of software upgrades, and possibly some SPAM.

A Timeless History of the Swatch Watch

Jeff Schear, Getty Images for Swatch

A curious scene played out around retail watch counters in the 1980s and early 1990s. The crowds that gathered as salespeople put new Swatch watches out for purchase resembled something out of the Cabbage Patch Kids craze of just a few years earlier. Shoppers would jostle one another in the hopes of scoring one of the $30 plastic timepieces, which came in a variety of colors and designs. The demand was such that sellers often set a one-watch-per-customer limit.

That’s where the odd behavior came in. Customers would buy a Swatch, leave, then return in a different set of clothes or even a wig, in an effort to get around the limit and buy a second or third Swatch. The watches were the fashion equivalent of Beanie Babies, though even that craze didn’t quite reach the heights of needing a disguise. Limited-edition Swatches were coveted by collectors who, having failed in their pursuit at the retail level, paid thousands for them on the aftermarket. The accessories simultaneously became a fashion statement and an artistic canvas.

More importantly, they also became the savior of the Swiss watch industry, which had been on the verge of collapse.

A person models a Swatch watch on their wrist
Tasos Katopodis, Getty Images for Soho House Chicago

To understand the unique appeal of Swatch, it helps to size up the landscape of the timepiece category in the late 1970s. Swiss watches, long considered the gold standard of timepieces, were being outpaced by quartz-powered digital imports from Japan that were cheap to produce and cheap to sell. Faced with the choice of buying a quality watch for a premium price or opting for a bargain digital model, an increasing number of consumers were choosing the imports. Business was down, factories were closing, and jobs were being lost.

Fortunately, a number of things were happening that would prove to offer salvation for the Swiss. ETA SA, a watchmaker headed by Ernst Thomke, had recently invested in an injection-molding machine at the behest of engineer Elmar Mock. Mock, along with his colleague Jacques Muller, spent 15 months crafting a one-piece plastic prototype watch with a welded, sealed case. The significance of a sealed unit was that it economized the entire process, turning watches from handcrafted pieces into models that could be produced by automation. The watches required just 51 parts instead of the 91 typical of most models at the time. In this way, Thomke, Mock, and Muller had produced a timepiece that was both durable and inexpensive.

The question was why anyone would opt for a Swatch over a digital Japanese model. Thomke knew that the idea of a “Swiss watch” still held wide appeal, in the same way someone might opt for a real Chicago deep-dish pizza over an imitator’s version. Along with Nicolas Hayek, who later became CEO of the Swatch Group, Thomke believed he had cracked the code for a Swiss watch renaissance. The company released the first Swatch in Zurich in March of 1983.

But the manufacturing process that allowed Swatches to come in at a reasonable price was also a problem. Automating the process meant the watches and bands were almost always identical in size and shape. If the watch’s general appearance couldn’t be changed, how could it stand out?

A selection of Swatch watches are seen on display
Anthony Kwan, Getty Images

The answer was in the design. The Swatch name came from a contraction of two words: secondary watch. The idea was that a watch could be analogous to a necktie or other fashion accessory. No one owned just one tie, scarf, or pair of dress shoes. They typically had a rotation. Thomke and Hayek didn't believe a watch should be any different.

At the behest of marketing consultant Franz Sprecher, Swatches were soon flooding stores in an assortment of colors and with different designs on the face of the timepiece itself. They could be coordinated for different outfits or occasions, a practice that became known as “watch wardrobing.” Someone who bought a red Swatch for summer lounging might opt for a black Swatch as part of their professional attire. The watches retailed for $30 to $40 apiece, so buying more than one was financially feasible.

That was the concept, anyway. Some U.S. retail stores received their Swatch inventory and didn’t know what to make of what was—on the surface—a cheap plastic watch. Neither did their customers.

What Swatch needed was a marketing plan. That task largely fell to marketing consultant Max Imgruth, who was named president of the company’s American division. Thanks to an effective advertising campaign and more eclectic color choices, Swatch saw sales rise from $3 million in 1984 to $105 million in 1985, and public perception put the watches firmly in the fashion category.

A selection of Swatch watches designed by artist Keith Haring are seen on display
Anthony Kwan, Getty Images

The approach opened up a new market, one Thomke, Hayek, and their colleagues had not quite anticipated: Collectors were rabid about Swatches.

To keep their biannual collections of 22 to 24 watch releases fresh, Swatch began recruiting collaborators to design distinctive offerings. In 1984, they enlisted artist Kiki Picasso to design a series. The following year, Keith Haring designed his own collection. In a kind of prelude to the sneaker design phenomenon of the 1990s and beyond, these collaborators put their own stamps on the Swatches, which acted as a canvas for their artistic expression.

Between third-party designers and contributions from Swatch’s Milan, Italy, design team, collectors couldn’t get enough. There was the Swatchetables line, which imagined the Swatches in a series of food-related motifs—a red-hot chili pepper Swatch, a cucumber Swatch, and a bacon-strap, egg-faced Swatch. The entire set sold for $300, and only at select food markets, before quickly shooting up to $2,400 on the secondary market. (Like all aftermarket Swatches, they needed to be kept in their plastic retail case to realize their full value.) Some resellers bought up stock in New York, then resold it for three times the price in Italy.

The 1985 “Jellyfish” model was transparent. The 1989 “Dadali” had a face with Roman numerals that appeared to be melting off the dial and onto the strap. Swatches came with cuffs to honor Mozart or adorned with synthetic fur. There were Mother’s Day editions and editions celebrating the 200th anniversary of the French Revolution. Some of the straps were scented.

A selection of Swatch watches are seen on display
Anthony Kwan, Getty Images

The possibilities were endless, and so was the consumer appetite. (Except for yellow straps, which traditionally sold poorly.) Collectors camped out for Swatches at retailers or at the hundreds of Swatch-exclusive stores around the country. Affluent collectors dispatched employees to different retailers in the hopes of finding a limited-edition watch at retail price. If they failed, some had no problem paying thousands of dollars at auction. A Kiki Picasso Swatch, one of just 121 pieces ever made, sold for $28,000 in 1992.

Though no one wears disguises to acquire Swatch watches anymore, the company is still issuing new releases. And while Swatch has seen a decline in sales over the years—the rise of smartwatches like the Apple Watch and Fitbit continues to eat into its market share—affection for the brand is unlikely to disappear entirely anytime soon. In 2015, one of the world’s largest collections of Swatches—5,800 pieces—went up for sale and ultimately fetched $6 million.

The Rise, Fall, and Resurgence of the Fanny Pack

Matt Cowan, Getty Images for Coachella

Back in 1954, Sports Illustrated ran an advertisement for a leather pouch that was touted as an ideal accessory for cross-country skiers who wanted to hold their lunch and ski wax. Hikers, equestrians, and bicyclists could also benefit from this waist-mounted sack, which was a bit like a backpack situated on the hips.

The “fanny pack” sold for $10 ($95 today). For the next several decades, it remained popular among recreational enthusiasts traveling by bike, on foot, or along trails—anywhere hands needed to be kept free and a full piece of luggage was unnecessary. From there, it morphed into a fashion statement, marketed by Gucci and Nike for decorative and utilitarian purposes in the 1980s and '90s, before becoming an ironic hipster joke. Even the name—fanny pack—suggests mirth. But the concept of carrying goods on top of your buttocks was never meant to be a joking matter.

A man sports a ski outfit with a fanny pack in 1969
McKeown/Daily Express/Hulton Archive/Getty Images

Mankind has looked to belt-mounted storage solutions for millennia. Ötzi the Iceman, a 5300-year-old mummy found preserved in a glacier in 1991, had a leather satchel that held a sharpened piece of bone and flint-stone tools. Subsequent civilizations adopted the premise, with Victorian and Edwardian women toting chatelaine purses made of silk or velvet.

The 20th-century obsession with the fanny pack seemingly began on the ski slopes of Europe in the 1960s and '70s. Known in Switzerland as the bauchtasche, or stomach bag, it was worn proudly by skiers traveling away from the base lodge who wanted to keep certain items—food, money, a map, flares, and occasionally alcohol—within arm's reach. Photographers also found them useful when hiking or climbing through obstacles outdoors, as they reduced the risk of an expensive camera or lens being dropped or damaged.

Their migration into fashion and the general public happened in the 1980s, due to what Fashion Fads Through American History author Jennifer Grayer Moore dubbed the rise of “athleisure.” This trend saw apparel and accessories typically relegated to sports or exercise—think leggings, track suits, and gym shorts—entering day-to-day use. With them came the fanny pack, a useful depository for keys, wallets, drinks, and other items. They were especially popular among tourists, who could stash travel accessories like cameras and souvenirs without burdening themselves with luggage.

In the late 1980s, fashion took notice. High-end labels like Chanel manufactured premium fanny packs, often with the more dignified name of belt bag. Sporting one was considered cool, as evidenced by their presence in popular culture. The Fresh Prince, Will Smith, wore one. Members of New Kids on the Block were seen with them. Nothing, it seemed, could stop people from feeling both pragmatic and hip while sporting an oversized pocket on their waist, typically swiveled around to the front.

A model sports a fanny pack, also known as a belt bag, across her shoulder
Hannah Peters, Getty Images

Like most trends, overexposure proved fatal. Fanny packs were everywhere, handed out by the marketing departments of major brands like Miller Beer and given away at sports arenas and stadiums. Plastered with corporate logos, they became too pervasive and too crassly commercial for style purposes. By the end of the 1990s, wearing a fanny pack was no longer cool. It was an act that invited mockery and disdain.

The pack, of course, has retained its appeal among outdoor enthusiasts, and lately has been experiencing a resurgence in style circles, with designer labels like Louis Vuitton and Valentino offering high-end pouches. Many are now modified or worn across the torso like a bandolier, an adaptation prized by skateboarders who want something to hold their goods without hindering movement.

In 2018, fanny packs were credited with a surge in overall accessories sales, posting double-digit gains in merchandise sold. The fanny pack may have had its day as an accessory of mass appeal, but it's not likely to completely disappear anytime soon.
