Jug Life: A History of the Kool-Aid Man

Kraft

When Robert Skollar joined the General Foods marketing team at Grey Advertising in 1988, it didn’t take him long to realize that there were certain perks that came with the job. As the executive behind the Kool-Aid ad campaign, Skollar inherited the Kool-Aid Man, the anthropomorphic pitcher of sugar water that had been a staple of the brand for more than a decade.

Two stories stand out: The first, Skollar says, is when he was working late one night and decided to try on the Kool-Aid Man’s fiberglass costume for himself. It was like being inside a Christmas ornament. “It’s hard to hear anything in there,” Skollar tells Mental Floss. “You just hope you don’t fall down.”

The second was when Skollar got caught up in the trend of New York professionals putting on elaborate birthday parties for their kids. Skollar asked Richard Berg, the voice of Kool-Aid Man’s “Oh, Yeah!” catchphrase, to actually wear the costume for a personal appearance at his son’s sixth birthday party. (Normally, Berg just recorded the line.) “It was the voice in the costume, which was a first,” Skollar says. “And half the kids were frightened to death.”

Fortunately, that was hardly the typical reaction. Introduced in 1975, Kool-Aid Man became one of the most beloved characters in advertising history, with a recognition factor that sometimes outpaced that of Ronald McDonald. He got his own video game, his own comic book, and his own museum display in Hastings, Nebraska.

Not bad for someone who started out as a disembodied head.

By the time advertising executive Marvin Potts created a sentient pitcher of Kool-Aid in 1954, the powdered soft drink mix had been on shelves for 27 years. Conceived by Edwin Perkins in Hastings, Nebraska, as an alternative to glass bottle drinks—which were expensive to ship—what was then known as “Kool-Ade” became a cheap, popular way to flavor water.

When Perkins sold the brand to General Foods in 1953, its contracted advertising firm, Foote, Cone & Belding, trialed a few different television spots. Potts’s idea—Pitcher Man, a large, bulbous container of Kool-Aid with an animated mouth and eyes—was the most popular. (Company lore says Potts came up with the idea after watching his kid draw a smiley face on the condensation of a window.)

In the 1960s, Kool-Aid opted for celebrity spokespeople like The Monkees and Bugs Bunny, relegating Pitcher Man to the sidelines. “I think they found out Bugs was overwhelming the whole campaign,” Skollar says. “Kids would remember him but forget the ad was for Kool-Aid.”

That ceased to be a problem in 1975, when Alan Kupchick and Harold Karp at Grey Advertising developed the idea for Kool-Aid Man, an evolution of Pitcher Man. His face stopped moving, but the addition of arms and legs gave the character a more bombastic personality. It also allowed him to commit sensational acts of property destruction.

Skollar recalls that the iconic breaking-through-the-wall sequence wasn’t necessarily planned. “From what I’ve heard, someone on set said that Kool-Aid Man really had to make an entrance, and someone else, maybe a producer, suggested he come through the wall.” Breakaway bricks were set up, and the character's fiberglass shell—“the same material used for a Corvette Stingray,” Skollar says—effectively became a wrecking ball.

Although he was never officially named Kool-Aid Man at the time, the mascot helped propel sales of the drink mix. “It was a phenomenon,” Skollar says. “Here you had this 50-year-old product that’s not really convenient and not particularly healthy, and it’s huge.”

As Kool-Aid Man’s star grew, so did his opportunities to branch out. The property got its own Marvel comic—The Adventures of Kool-Aid Man—as well as an Atari 2600 video game. The latter could be redeemed with 125 points earned from purchasing Kool-Aid, which amounts to about 62.5 gallons of sugar water. (You could also send $10 with 30 points.)

When Skollar was handed control of the campaign in 1988, the advice was pretty clear. “It was basically: Don’t screw it up,” he says, “and make it more contemporary.”

Skollar says he took inspiration from Pee-wee’s Playhouse and the Peter Gabriel music video for "Sledgehammer" to conceive of an entire Kool-Aid Man universe—one bursting with frenetic activity that kids would find exciting and adults would find impenetrable.

“Most kid ads had a storyline at the time,” he says. “This didn’t. It was just surreal.”

This Lynchian Kool-Aid Man was no longer 7 years old, as previous marketing campaigns had implied, but 14 years old—old enough to play guitar and surf. Once naked, he now sported jeans and cool shirts. Skollar believes that the kinetic spots helped usher in a new wave of kid advertising that relied more on visceral, MTV-style cuts.

Not all of Kool-Aid’s efforts were focused on hyperactive kids, however. The drink mix was not without its controversies, having once been associated with the Jonestown massacre in 1978, in which cult leader Jim Jones coerced his followers into drinking Kool-Aid and Flavor Aid laced with cyanide. There was also the matter of Kool-Aid suggesting gobs of sugar be added to the drink for flavor.

“We did a campaign targeted to moms, ‘Having Kids Means Having Kool-Aid,’” Skollar says. “And we told them they could control the amount of sugar they used. We also pushed that Kool-Aid had Vitamin C.”

Under Skollar, Kool-Aid sales shot to third place in the soft drink category—behind only Coke and Pepsi.

Kool-Aid Man makes an appearance at the NASDAQ
Slaven Vlasic/Getty Images

Skollar stayed on the Kool-Aid campaign through 1994, at which point the account was passed to Ogilvy & Mather. Eventually, the fiberglass costume became nylon and computer effects began to enhance his features.

Skollar had already started to experiment with CG, but he eventually discarded it in favor of the analog outfit. “There was something about that rawness, that awkward-looking pitcher breaking through walls,” he says.

One of the original costumes from 1975 sits in the Hastings Museum of Natural and Cultural History in Hastings, Nebraska, a testament to the character’s enduring appeal. Skollar says he once had research data showing that over 90 percent of kids could recognize Kool-Aid Man on sight.

The same wasn’t necessarily true of adults. “I remember one time we were shooting an ad where Kool-Aid Man was walking over a hill at sunset, holding hands with a little girl,” he says. “And a junior brand executive taps me on the shoulder and says, ‘We can’t see his face. How will we know who he is?’”

When Y2K Sent Us Into a Digital Depression

iStock.com/Laspi

It's hard to pinpoint the exact moment when the paranoia first began to creep in. Sometime during the late 1990s, consumers noticed that their credit cards with expiration dates in the year 2000 were being declined by merchants. Shortly thereafter, people began stocking up on shelf-stable food and water, potentially condemning themselves to months of all-SPAM diets. A number of concerned citizens outside of Toronto, Canada, flocked to the Ark Two Survival Community, a nuclear fallout shelter-turned-bunker composed of dozens of decommissioned school buses buried several feet underground and protected by a layer of reinforced concrete.

In the months leading into New Year's Day 2000, millions of people steeled themselves for a worst-case scenario of computers succumbing to a programming glitch that would render them useless. Banking institutions might collapse; power grids could shut down. Anarchy would take over. The media had the perfect shorthand for the potential catastrophe: Y2K, for Year 2000. The term saturated coverage of a situation some believed had the potential to become one of the worst man-made disasters in history—if not the collapse of modern civilization as we knew it.

In the end, it was neither. But that doesn't mean it didn't have some far-reaching consequences.

John Koskinen of the President's Council on Y2K Conversion makes a public address
Michael Smith, Getty Images

The anticipatory anxiety of Y2K was rooted in the programs that had been written for the room-sized computers of the late 1960s. In an effort to conserve memory and speed up software, programmers truncated the date system to use two digits for the year instead of four. When the calendar was set to roll over to the year 2000, the belief was that "00" would be a proverbial wrench in the system, with computers unable to distinguish 2000 from 1900. Their calculations would be thrown: subtract a two-digit "98" from "99" and you correctly get one year, but subtract "98" from "00" and you get negative 98. How computers would actually react was based mostly on theories.
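
Here is a minimal sketch of that failure mode, written in Python rather than the COBOL or assembly of actual legacy systems, with illustrative values rather than code from any real program:

```python
def years_elapsed(start_yy: int, end_yy: int) -> int:
    # Legacy-style arithmetic on two-digit years: the century is
    # implicitly assumed and never stored.
    return end_yy - start_yy

# An account opened in 1998 and checked in 1999: correct.
print(years_elapsed(98, 99))  # 1

# The same account checked in 2000: "00" implicitly reads as 1900,
# so the elapsed time comes out negative.
print(years_elapsed(98, 0))   # -98
```

The standard remediation was either expanding date fields to four digits or "windowing," in which, say, any two-digit year below 30 is interpreted as 20xx; the latter fix merely postponed the problem.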

That ambiguity was quickly seized upon by two factions: third-party software consultants and doomsday preppers. For the former, rewriting code became a cottage industry, with corporations large and small racing to revise antiquated systems and spending significant amounts of money and manpower in doing so. General Motors estimated the cost of upgrading their systems would be about $626 million. The federal government, which began preparing for possible doom in 1995, ended up with an $8.4 billion bill.

Some of that cost was eaten up by soliciting analyses of the potential problems. The U.S. Department of Energy commissioned a study looking at the potential for problems with the nation's energy supply if computers went haywire. The North American Electric Reliability Council thought the risks were manageable, but cautioned that a single outage could have a domino effect on connected power grids.

As a result, many newspaper stories mixed practical advice with a disclaimer: More than likely nothing will happen … but if something does happen, we're all screwed.

"Figuring out how seriously to take the Y2K problem is a problem in itself," wrote Leslie Nicholson in the January 17, 1999 edition of the Philadelphia Inquirer. "There is simply no precedent."

The prospect of economic and societal collapse fueled the second pop-up industry: survivalist suppliers. As people stocked up on canned goods, bottled water, flashlights, and generators, miniature societies like Ark Two began to spring up.

While the panic surrounding Y2K was dismissed by some as unwarranted, there was always fuel to add to the fire. The United States and Russia convened to monitor ballistic missile activity in case a glitch triggered an inadvertent launch. People were warned checks might bounce and banking institutions could freeze. The Federal Reserve printed $70 billion in cash in case people began hoarding currency. Even the Red Cross chimed in, advising Americans to stock up on supplies. Y2K was being treated like a moderate-category storm.

Adding to the concern was the fact that credible sources were sounding alarms. Edward E. Yardeni, then-chief economist at Deutsche Morgan Grenfell/C.J. Lawrence, predicted that there was a 60 percent chance of a major worldwide recession.

As New Year's Eve 1999 approached, it became clear that Y2K had evolved beyond a software hiccup. Outside of war and natural disasters, it represented one of the few times society seemed poised for a dystopian future. People watched their televisions as clocks hovered close to midnight, waiting to see if their lights would flicker or their landline phones would continue to ring.

A software program is represented by a series of ones and zeroes
iStock.com/alengo

Of course, nothing happened. So many resources had been devoted to the problem that the majority of software-reliant businesses and infrastructures were prepared. There were no power outages, no looting, and no hazards. The only notable news of January 1, 2000 was the announcement that Boris Yeltsin had resigned and Vladimir Putin had taken over as Russia's acting president.

With the benefit of hindsight, pundits would later observe that much of the Y2K concern was an expression of a more deeply rooted fear of technology. Subconsciously, we may have been primed to recoil at the thought of computers dominating our society to the extent that their failure could have catastrophic consequences.

All told, it's estimated that approximately $100 billion was spent making upgrades to offset any potential issues. To put that into context: South Florida spent $15.5 billion rebuilding after the mass destruction caused by Hurricane Andrew in 1992.

Was it all worth it? Experts seem to think so, citing the expedited upgrades of old software and hardware in federal and corporate environments.

That may be some small comfort to Japan, which could be facing its own version of Y2K in April 2019. That's when Emperor Akihito is expected to abdicate the throne to his son, Naruhito, the first such transition since the dawn of the information age. (Akihito has been in power since January 1989, following the death of his father.) That's significant because the Japanese calendar counts up from the coronation of a new emperor and uses the name of each emperor's era. Akihito's is known as the Heisei era. Naruhito's is not yet named, which means that things could get tricky as the change in leadership—and the need for a calendar update—comes closer.
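
To make the problem concrete, here is a hedged sketch of the era conversion, assuming, as the article notes, that era years count up from the emperor's accession; the function is illustrative and not drawn from any real Japanese system:

```python
HEISEI_EPOCH = 1989  # Heisei 1 began in January 1989

def to_heisei(gregorian_year: int) -> str:
    # Hypothetical converter hard-coded to the current era, the
    # very pattern that breaks when a new era is proclaimed.
    era_year = gregorian_year - HEISEI_EPOCH + 1
    if era_year < 1:
        raise ValueError("date precedes the Heisei era")
    return f"Heisei {era_year}"

print(to_heisei(2018))  # Heisei 30
# May 2019 falls in the successor era, whose name is still
# unannounced, so this function cannot yet produce correct output.
```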

It's hard to predict what the extent of the country's problems will be as Akihito steps down. If history is any guide, though, it's likely to mean a lot of software upgrades, and possibly some SPAM.

When Mr. Rogers Taught Kids About Mutually Assured Nuclear Destruction

Focus Features

After months of hype, the ABC television network premiered a made-for-TV film titled The Day After on November 20, 1983. Presented with minimal commercial interruption, the two-hour feature illustrated a world in which both the United States and the Soviet Union made the cataclysmic decision to launch nuclear missiles. The blasts wiped a small town off the face of the Earth; the few who did survive writhed in pain, with their skin hanging off in clumps.

The imagery was graphic and unsettling, and it was supposed to be. Director Nicholas Meyer wanted to portray the fallout in sober detail. The Day After drew a sizable viewership and was hailed as a responsible use of television in order to educate audiences about the reality of the tension between the world’s superpowers.

In the weeks before the film premiered, though, another prominent broadcast was exploring the same themes. It was intended for young audiences and explored—via the use of puppets—the consequences of international aggression. For five episodes across one week, the threat of nuclear annihilation was looming in Mister Rogers’ Neighborhood.

A nuclear explosion creates a mushroom cloud
iStock.com/RomoloTavani

Since its inception on Pittsburgh's WQED in 1968, Mister Rogers’ Neighborhood had informed its young audience about topical issues in subversive and disarming ways. When civil rights were discussed, host Fred Rogers didn’t deliver a lecture about tolerance. Instead, he invited a black friend, Officer Clemmons, to cool off in his inflatable pool, a subtle nod to desegregation. In 1981, Rogers—the subject of this year's critically acclaimed documentary, Won't You Be My Neighbor?—explored the topic of divorce with puppet Patty Barcadi, whose parents had separated, and comforted Prince Tuesday, who fretted that his own parents might split. Famously, Rogers also explored the subject of individuals with disabilities with the introduction of Jeff Erlanger, who became a quadriplegic at a young age after undergoing spinal surgery to remove a tumor. (Decades later, the two were reunited when Erlanger made a surprise appearance as Rogers was being inducted into the Television Academy Hall of Fame.)

Despite Rogers's history of addressing tough topics, there was perhaps no more fraught an issue for a children’s show to tackle than nuclear war. Rogers wanted to address what he felt was a growing concern among schoolchildren, who processed Cold War headlines and interpreted tensions between the Soviet Union and the U.S. as potentially disastrous. (In one survey of classrooms across several major cities, students labeled the possibility of nuclear war “likely.”)

Rogers conceived and taped a five-episode storyline on the subject in the summer of 1983, which wound up being prescient. In October 1983, President Ronald Reagan ordered the invasion of Grenada to topple a Marxist regime.

“Little did I know we would be involved in a worldwide conflict now,” Rogers told the Associated Press. “But that’s all the better because our shows give families an opportunity for communication. If children should hear the news of war, at least they have a handle here, to assist in family communications.”

In the five-part series titled “Conflict,” Rogers again turned to the puppets that populated his Neighborhood of Make-Believe. Provincial ruler King Friday (voiced by Rogers) is handed a “computer read-out” that tips him off to some counterintelligence: Cornflake S. Pecially, ruler of the neighboring land of Southwood, is allegedly making bombs. In a panic, King Friday orders his underlings to do the same, mobilizing efforts to make certain they can match Southwood’s fiery super weapons—even if it means not having the financial resources to care for his people in other ways.

Lady Elaine Fairchilde and Lady Aberlin aren’t quite convinced. Rather than succumb to paranoia, they decide to travel to Southwood to see for themselves. They find its citizens building a bridge, not a bomb. A misunderstanding had almost led to unnecessary violence.

Of course, no mushroom clouds envelop the Neighborhood of Make-Believe, and none of the puppets suffer the devastating effects of radiation poisoning. Rogers maintained the story wasn’t really about war at all, but about preventing it.

“This show gives us a chance to talk about war, and about how it’s essential that people learn to deal with their feelings and to talk about things and resolve conflicts,” he said.

A publicity photo of Fred Rogers for 'Mister Rogers' Neighborhood'
Getty Images

The episodes sparked conversation in classrooms, where some teachers used the footage to broach the subject. At an elementary school in Venetia, Pennsylvania, students in a third-grade social studies class discussed the consequences of war. “No water” was one response. “Injuries” was another.

Unlike The Day After, which one psychiatrist declared inappropriate for children under 12, Rogers proved it was possible to provoke conversation without rattling any nerves.

The five-part “Conflict” storyline never aired again after its initial 1983 run. The close of the 1980s saw a reduction in concerns over nuclear attacks, and it’s possible producers of Mister Rogers’ Neighborhood regarded the shows as dated.

They resurfaced briefly on YouTube in 2017 before vanishing. The series was subsequently uploaded to a Dailymotion video account in 2018. Like The Day After, the shows are an interesting time capsule of an era when the fear of devastating conflict was palpable. For a number of kids who experienced that concern, Mr. Rogers helped frame it in a way they could understand.

“I don’t want this to be a frightening thing,” Rogers said. “I want children to know that war is something we can talk about. Whatever is mentionable is manageable.”
