Candy Crush: The Bizarre History of Those '90s Mentos Commercials

MathUser13 via YouTube

In the fall of 1996, Liam Killeen walked into a convenience store in Erlanger, Kentucky, near the U.S. offices of Van Melle (now known as Perfetti Van Melle), the candymaker behind Mentos. While paying for his purchase, Killeen—Mentos's vice president of marketing—noticed that the cashier was eyeing him with a mixture of suspicion and disgust.

Killeen asked if there was a problem. She pointed to the Mentos logo on the company jacket he was wearing. “Mentos?” she spat. “I hate those commercials. They’re so cornball! So stupid!”

It was not the first time Killeen had heard such a strong, visceral reaction to the ad campaign he had helped devise. Beginning in 1992, the Netherlands-based confectioner had stormed the States with a series of inexplicably odd television spots that featured an earworm of a song (“Fresh Goes Better”), hammy acting, and a general sense that the ads were trying to approximate American culture rather than actually be a part of it—not unlike a robot mimicking the emotions of its human counterparts.

Some people loved the ads; a lot of people didn’t. (In 1994, USA Today named it one of the worst advertising campaigns.) But the ads did what they were supposed to do. In 1991, Van Melle sold $20 million worth of the hard candies. In 1994, that number doubled to $40 million. By 1996, it had tripled to $120 million. By either design or accident, Mentos became a leader in the sweets industry by producing commercials that were almost incomprehensibly stupid.

It was during a train ride to Poland in 1932 that brothers Michael and Pierre van Melle first had the inspiration to develop and market a peppermint-flavored caramel candy. Calling the bite-sized pieces Mentos, Van Melle began exporting them in the 1950s; in 1972, Mentos arrived stateside.

With minimal marketing and little name recognition, Mentos were largely lost in candy aisles, selling modestly for nearly 20 years. Around the time Killeen joined the U.S. sales office in 1991, the decision was made to begin a more aggressive grab for market share. First, Mentos would reduce the number of available flavors from 50 to just two: mint and mixed fruit. Second, they would pursue a global ad campaign marketed directly to consumers instead of the trade ads Van Melle had typically produced for candy distributors and suppliers.

Ad agency Pahnke & Partners out of Hamburg, Germany was enlisted to conceptualize the spots, which had several recurring themes: A young, attractive couple would find themselves in some sort of bind that would usually be remedied by popping a Mentos and subsequently having a spark of inspiration. One of the leads would hold up the Mentos package and give a thumbs up. Throughout, a song would play that was intended to sound catchy no matter where in the world the commercials were airing.

In one spot, a man decides to don a tablecloth and pretend to be a waiter in order to garner better service. In another, traffic keeps two lovers from embracing. At the climax of the 30-second spots, a brand slogan—“The Freshmaker”—would appear onscreen.

Viewers who spotted the ads when they premiered in July 1992 were driven to distraction by one intangible: The ads seemed disconnected from actual human behavior, and the song itself was critiqued for appearing to be an English translation that didn’t get the lyrics quite right. (“It doesn’t matter what comes, fresh goes better in life.”)

By the mid-1990s, both news media and the burgeoning world of the internet had become preoccupied with the unreality of Mentos. Much of the speculation revolved around whether the commercials were shot in the U.S. or elsewhere. (According to the company, three of the commercials were shot in the States, while seven were produced overseas.) An early "Mentos FAQ" was set up by Purdue University student Heath Doerr, who pored over minutiae in a way that would foreshadow the obsessive online fan cultures that followed.

Van Melle recognized a phenomenon when it saw one and rarely responded to media requests for information about the campaign. "It's almost more fun to have consumers off on their own," Mentos brand manager Tricia Gold told The New York Times in 1995. "If we added our input, it would stop the free flow of information."

People could mock and inspect the ads all they wanted. For Van Melle, the curiosity led to brand awareness that couldn’t have been obtained through ad buys alone. By 1996, Mentos had reached $135 million in sales and was being mentioned or parodied in a number of high-profile places. The Foo Fighters released a video for “Big Me” that mocked the cheesiness of the ads; the candy was name-dropped in 1995’s Clueless; the brand got sustained exposure during an entire season of Baywatch.

The novelty began to wear off around 1999, when Mentos sales leveled off despite major growth in what Ad Age dubbed the “strong mint category” of treats. Altoids was eating into market share, and Mentos-sponsored college concerts weren’t making much of a dent. After roughly a decade of near-constant rotation, the Freshmaker campaign began to wind down in 2002. Despite their reduced role in popular culture, Mentos remain a top-ranked mint in the confection business.

Jesse Peretz, who directed the Foo Fighters' parody video, may have summed up Mentos mania best. “The commercials,” he told Entertainment Weekly, “are total lobotomized happiness.”

When Y2K Sent Us Into a Digital Depression

iStock.com/Laspi

It's hard to pinpoint the exact moment when the paranoia first began to creep in. Sometime during the late 1990s, consumers noticed that their credit cards with expiration dates in the year 2000 were being declined by merchants. Shortly thereafter, people began stocking up on shelf-stable food and water, potentially condemning themselves to months of all-SPAM diets. A number of concerned citizens outside of Toronto, Canada, flocked to the Ark Two Survival Community, a nuclear fallout shelter-turned-bunker made up of dozens of decommissioned school buses buried several feet below ground and protected by a layer of reinforced concrete.

In the months leading into New Year's Day 2000, millions of people steeled themselves for a worst-case scenario of computers succumbing to a programming glitch that would render them useless. Banking institutions might collapse; power grids could shut down. Anarchy would take over. The media had the perfect shorthand for the potential catastrophe: Y2K, for Year 2000. The term was used exhaustively in their coverage of a situation some believed had the potential to become one of the worst man-made disasters in history—if not the collapse of modern civilization as we knew it.

In the end, it was neither. But that doesn't mean it didn't have some far-reaching consequences.

John Koskinen of the President's Council on Y2K Conversion makes a public address
Michael Smith, Getty Images

The anticipatory anxiety of Y2K was rooted in the programs that had been written for the hulking mainframe computers of the late 1960s. In an effort to conserve memory and speed up software, programmers truncated the date system to use two digits for the year instead of four. When the calendar was set to roll over to the year 2000, the belief was that "00" would be a proverbial wrench in the system, with computers unable to distinguish 2000 from 1900 and their calculations thrown off as a result. Subtracting a birth year of "98" from a current year of "99" yields a sensible positive number; subtracting it from "00" goes negative. How computers would actually react was based mostly on theories.
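
To make the arithmetic concrete, here is a minimal sketch in Python. The function names are invented for illustration and aren't drawn from any real legacy system; the "windowing" trick at the end is one of the remediation techniques commonly used at the time.

```python
# A sketch of the two-digit year arithmetic behind the Y2K scare.
# Both functions are illustrative, not taken from any real legacy system.

def age_two_digit(birth_yy: int, current_yy: int) -> int:
    """Compute an age the way a two-digit date system would."""
    return current_yy - birth_yy

print(age_two_digit(60, 98))  # 38: in 1998, someone born in 1960 looks fine
print(age_two_digit(60, 0))   # -60: after the rollover, the same math goes negative

# "Windowing," a common remediation, guessed the century from a pivot year
# instead of rewriting every stored date:
def expand_year(yy: int, pivot: int = 50) -> int:
    """Interpret two-digit years below the pivot as 20xx, the rest as 19xx."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_year(98))  # 1998
print(expand_year(0))   # 2000
```

Windowing was cheaper than expanding every date field to four digits, though it only postponed the problem past the chosen pivot.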

That ambiguity was quickly seized upon by two factions: third-party software consultants and doomsday preppers. For the former, rewriting code became a cottage industry, with corporations large and small racing to revise antiquated systems and spending significant amounts of money and manpower in doing so. General Motors estimated the cost of upgrading their systems would be about $626 million. The federal government, which began preparing for possible doom in 1995, ended up with an $8.4 billion bill.

Some of that cost was eaten up by soliciting analyses of the potential problems. The U.S. Department of Energy commissioned a study looking at the potential for problems with the nation's energy supply if computers went haywire. The North American Electric Reliability Council thought the risks were manageable, but cautioned that a single outage could have a domino effect on connected power grids.

As a result, many newspaper stories mixed practical advice with a disclaimer: More than likely nothing will happen … but if something does happen, we're all screwed.

"Figuring out how seriously to take the Y2K problem is a problem in itself," wrote Leslie Nicholson in the January 17, 1999 edition of the Philadelphia Inquirer. "There is simply no precedent."

Pending economic and societal collapse fueled the second pop-up industry: survivalist suppliers. As people stocked up on canned goods, bottled water, flashlights, and generators, miniature societies like Ark Two began to spring up.

While the panic surrounding Y2K was dismissed by some as unwarranted, there was always fuel to add to the fire. The United States and Russia convened to monitor ballistic missile activity in the event a glitch inadvertently launched a devastating weapon. People were warned checks might bounce and banking institutions could freeze. The Federal Reserve printed $70 billion in cash in case people began hoarding currency. Even the Red Cross chimed in, advising Americans to stock up on supplies. Y2K was being treated like a moderate-category storm.

Adding to the concern was the fact that credible sources were sounding alarms. Edward E. Yardeni, then-chief economist at Deutsche Morgan Grenfell/C.J. Lawrence, predicted that there was a 60 percent chance of a major worldwide recession.

As New Year's Eve 1999 approached, it became clear that Y2K had evolved beyond a software hiccup. Outside of war and natural disasters, it represented one of the few times society seemed poised for a dystopian future. People watched their televisions as clocks hovered close to midnight, waiting to see if their lights would flicker or their landline phones would continue to ring.

A software program is represented by a series of ones and zeroes
iStock.com/alengo

Of course, nothing happened. So many resources had been expended on the problem that the majority of software-reliant businesses and infrastructure were prepared. There were no power outages, no looting, and no hazards. The most notable news of January 1, 2000 was the announcement that Boris Yeltsin had resigned the day before, leaving Vladimir Putin as Russia's acting president.

With the benefit of hindsight, pundits would later observe that much of the Y2K concern was an expression of a more deeply rooted fear of technology. Subconsciously, we may have been primed to recoil at the thought of computers dominating our society to the extent that their failure could have catastrophic consequences.

All told, it's estimated that approximately $100 billion was spent making upgrades to offset any potential issues. To put that into context: South Florida spent $15.5 billion rebuilding after the mass destruction caused by Hurricane Andrew in 1992.

Was it all worth it? Experts seem to think so, citing the expedited upgrades of old software and hardware in federal and corporate environments.

That may be some small comfort to Japan, which could be facing its own version of Y2K in April 2019. That's when Emperor Akihito is expected to abdicate the throne to his son, Naruhito, the first such transition since the dawn of the information age. (Akihito has been in power since January 1989, following the death of his father.) That's significant because the Japanese calendar counts up from the coronation of a new emperor and uses the name of each emperor's era. Akihito's is known as the Heisei era. Naruhito's is not yet named, which means that things could get tricky as the change in leadership—and the need for a calendar update—comes closer.
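
To see why a calendar pegged to an emperor's reign can break software, here's a minimal hypothetical sketch in Python; the era table and conversion function are invented for illustration, but the underlying arithmetic (Heisei 1 = 1989, so the Gregorian year is the era's start year plus the era year minus one) is how the Japanese calendar works.

```python
# A sketch of why the era change resembles Y2K, assuming (hypothetically)
# a system that stores dates as (era name, era year) pairs.

ERA_STARTS = {
    "Showa": 1926,   # Showa 1 = 1926
    "Heisei": 1989,  # Heisei 1 = 1989
    # Software shipped before the transition can't list the successor
    # era here, because its name hasn't been announced yet.
}

def to_gregorian(era: str, era_year: int) -> int:
    """Convert a Japanese era year to a Gregorian year."""
    return ERA_STARTS[era] + era_year - 1

print(to_gregorian("Heisei", 30))  # 2018

# Any date in the still-unnamed era raises a KeyError here -- the
# Japanese calendar's own flavor of the rollover problem.
```

Every system with a hard-coded era table like this one needs a patch before the new era's first day, which is exactly the kind of deadline-driven remediation Y2K demanded.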

It's hard to predict what the extent of the country's problems will be as Akihito steps down. If history is any guide, though, it's likely to mean a lot of software upgrades, and possibly some SPAM.

When Mr. Rogers Taught Kids About Mutually Assured Nuclear Destruction

Focus Features

After months of hype, the ABC television network premiered a made-for-TV film titled The Day After on November 20, 1983. Presented with minimal commercial interruption, the two-hour feature illustrated a world in which both the United States and the Soviet Union made the cataclysmic decision to launch nuclear missiles. The blasts wiped a small town off the face of the Earth; the few who survived writhed in pain, their skin hanging off in clumps.

The imagery was graphic and unsettling, and it was supposed to be. Director Nicholas Meyer wanted to portray the fallout in sober detail. The Day After drew a sizable viewership and was hailed as a responsible use of television in order to educate audiences about the reality of the tension between the world’s superpowers.

In the weeks before the film premiered, though, another prominent broadcast was exploring the same themes. It was intended for young audiences and explored—via the use of puppets—the consequences of international aggression. For five episodes across one week, the threat of nuclear annihilation was looming in Mister Rogers’ Neighborhood.

A nuclear explosion creates a mushroom cloud
iStock.com/RomoloTavani

Since its inception on Pittsburgh's WQED in 1968, Mister Rogers’ Neighborhood had informed its young audience about topical issues in subversive and disarming ways. When civil rights were discussed, host Fred Rogers didn’t deliver a lecture about tolerance. Instead, he invited a black friend, Officer Clemmons, to cool his feet alongside him in a wading pool, a subtle nod to desegregation. In 1981, Rogers—the subject of this year's critically acclaimed documentary, Won't You Be My Neighbor?—explored the topic of divorce with puppet Patty Barcadi, whose parents had separated, and comforted Prince Tuesday, who fretted that his own parents might split. Famously, Rogers also explored the subject of individuals with disabilities by introducing Jeff Erlanger, who became a quadriplegic at a young age after undergoing spinal surgery to remove a tumor. (Decades later, the two were reunited when Erlanger made a surprise appearance as Rogers was being inducted into the Television Academy Hall of Fame.)

Despite Rogers's history of tackling tough topics, nuclear war was perhaps the most fraught subject the children’s show could take on. Rogers wanted to address what he felt was a growing concern among schoolchildren who processed Cold War headlines and interpreted tensions between the Soviet Union and the U.S. as potentially disastrous. (In one survey of classrooms across several major cities, students labeled the possibility of nuclear war “likely.”)

Rogers conceived and taped a five-episode storyline on the subject in the summer of 1983, which wound up being prescient: In October 1983, President Ronald Reagan ordered the invasion of Grenada to topple a Marxist regime.

“Little did I know we would be involved in a worldwide conflict now,” Rogers told the Associated Press. “But that’s all the better because our shows give families an opportunity for communication. If children should hear the news of war, at least they have a handle here, to assist in family communications.”

In the five-part series titled “Conflict,” Rogers again turned to the puppets that populated his Neighborhood of Make-Believe. Provincial ruler King Friday (voiced by Rogers) is handed a “computer read-out” that tips him off to some counterintelligence: Cornflake S. Pecially, ruler of the neighboring land of Southwood, is allegedly making bombs. In a panic, King Friday orders his underlings to do the same, mobilizing efforts to make certain they can match Southwood’s fiery super weapons—even if it means not having the financial resources to care for his people in other ways.

Lady Elaine Fairchilde and Lady Aberlin aren’t quite convinced. Rather than succumb to paranoia, they decide to travel to Southwood to see for themselves. They find its citizens building a bridge, not a bomb. A misunderstanding had almost led to unnecessary violence.

Of course, no mushroom clouds envelop the Neighborhood of Make-Believe, and none of the puppets suffer the devastating effects of radiation poisoning. Rogers never claimed the story was about war itself so much as the prevention of it.

“This show gives us a chance to talk about war, and about how it’s essential that people learn to deal with their feelings and to talk about things and resolve conflicts,” he said.

A publicity photo of Fred Rogers for 'Mr Rogers' Neighborhood'
Getty Images

The episodes sparked conversation in classrooms, where some teachers used the footage to broach the subject. At an elementary school in Venetia, Pennsylvania, students in a third-grade social studies class discussed the consequences of war. “No water” was one response. “Injuries” was another.

Unlike The Day After, which one psychiatrist declared inappropriate for children under 12, Rogers proved it was possible to provoke conversation without rattling any nerves.

The five-part “Conflict” storyline was never rerun after its initial broadcast in 1983. The close of the 1980s saw a reduction in concerns over nuclear attacks, and it’s possible the producers of Mister Rogers’ Neighborhood regarded the shows as dated.

They resurfaced briefly on YouTube in 2017 before vanishing. The series was subsequently uploaded to a Dailymotion video account in 2018. Like The Day After, the shows are an interesting time capsule of an era when the fear of devastating conflict was palpable. For a number of kids who experienced that concern, Mr. Rogers helped frame it in a way they could understand.

“I don’t want this to be a frightening thing,” Rogers said. “I want children to know that war is something we can talk about. Whatever is mentionable is manageable.”
