10 Interesting Numbers in American Culture (Plus or Minus a Few)

From the number that defeated the Nazis to the one that put a smile on the faces of drunken sailors, here are 10 digits with real value.

1. Nine-tenths of a cent: The fraction that makes us pump more gas

Every time we fill up our tanks, we wrestle with one of life's thorniest mysteries: Why do gas prices end in 0.9 cents? Unfortunately, the origins of the increment are murky. Some sources attribute the practice to the 1920s and 1930s, when the gasoline tax was nine-tenths of a cent.

Stations would simply slap the extra 0.9 onto the advertised price of a gallon to give Uncle Sam his cut. Others theorize that slashing 0.1 cent off the price undercut competitors back in the days when gas was just a few cents per gallon.

Although most drivers simply ignore the extra 0.9 cents, oil companies certainly don't. In 2009, Americans consumed 378 million gallons of gas per day, and that extra 0.9 cents per gallon was collectively worth about $3.4 million a day. On the flip side, you could also argue that customers collectively saved nearly $380,000 per day, thanks to stations' reluctance to round up to the next penny.
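
That arithmetic is easy to verify. A quick sketch in Python, assuming the cited figure of 378 million gallons per day:

```python
GALLONS_PER_DAY = 378_000_000  # U.S. gasoline consumption cited for 2009

extra_revenue = GALLONS_PER_DAY * 0.009     # the 0.9 cents on every gallon
customer_savings = GALLONS_PER_DAY * 0.001  # the 0.1 cent never rounded up

print(f"Extra revenue: ${extra_revenue:,.0f} per day")        # ~$3.4 million
print(f"Customer savings: ${customer_savings:,.0f} per day")  # ~$378,000
```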

2. 2.3 milligrams of B1: The recommendation that won a war

Food nutrition labels were originally designed to do a lot more than make you feel guilty about eating Cheetos. The dietary recommendations were created in the 1940s to help America accomplish one of the most important missions in its history -- defeating Hitler.

On the brink of entering World War II, U.S. military leaders discovered an unexpected problem. Our soldiers weren't only hungry for victory; they were just plain hungry. After screening some 1 million young men for potential service in the armed forces, the Selective Service discovered that about one in seven candidates suffered from "disabilities directly or indirectly connected with nutrition." The recruits were unfit for duty, and the nation needed a way to turn these malnourished men into Axis-pummeling Captain Americas.

The administration pounced on the problem. President Franklin Roosevelt gathered a committee of nutrition experts to create a practical diet that would keep Americans in shape -- both at home and while fighting abroad. Within months, the committee released its "Recommended Dietary Allowances" for each nutrient. For example, a "very active" man would need 2.3 mg of vitamin B1 per day, while a "very active" woman would need about 1.8 mg.

The system worked, and today, the recommendations have morphed into the nutrition labels now standard on packaged foods. Every few years, the numbers are revised and expanded to reflect new developments in nutrition science, and they've picked up the snazzy name "Dietary Reference Intakes." But don't be fooled by the titling. At their core, they're still the same recommendations that helped a nutrient-starved nation defeat the Nazis.

3. 55 mph: The speed that drove America crazy

During the oil crisis of the 1970s, the U.S. government was desperate to convince Americans to burn less gasoline. Realizing that cars are more fuel-efficient when driven at lower speeds, Congress decided to force people to drive slower. In 1974, it enacted a law that set the national speed limit at 55 mph, along with a threat: Any state that didn't comply with the rule would lose its federal highway funding.

Congress may have set the speed limit, but it was up to individual states to enforce it -- and many states didn't appreciate being bossed around. In fact, some states made a mockery of the law. Nevada, for example, refused to write tickets to speeders unless they were caught traveling more than 70 mph; instead, offenders received laughable $5 "energy wasting" fines.

So, did the lowered speed limit actually accomplish its goal? The answer is still hotly debated. While the law did slash petroleum consumption by 167,000 barrels per day, the savings represented a drop in demand of only one or two percent. Highway fatalities also dropped significantly with the lower speed limit, though some analysts have theorized that this reduction was the result of a general decrease in recreational driving rather than slower speeds.

Nonetheless, both state governments and average citizens whined about the law so much that Congress bumped up the speed limit to 65 mph in 1987, then did away with the law completely in 1995, putting speed limits back in the hands of the states.

4. Five seconds: The rule that can make you sick

At some time or another, with or without witnesses present, we've all used the five-second rule to justify eating a cookie that's touched the floor. After all, everyone knows that if a tasty treat spends less than five seconds on the ground, it doesn't collect germs.

Well, not exactly. In 2003, high school student Jillian Clarke performed the first known scientific tests of the five-second rule. While interning at the food science laboratory at the University of Illinois at Urbana-Champaign, Clarke tested the theory by placing gummy bears and cookies on ceramic tiles contaminated with E. coli. Her results revealed bad news for clumsy snackers: The munchies picked up the bacteria within the five-second window.

Clarke's quirky experiment inspired other food researchers to investigate the matter further. One such scientist, Dr. Paul L. Dawson of Clemson University, showed that food actually follows a "zero-second rule," meaning that bacteria such as salmonella transfer onto food instantly upon contact.

Thankfully, the news isn't as dire as it sounds. In a follow-up set of experiments, Clarke tested the bacteria levels of the university's floors. Her team found very little contamination, even in the most highly trafficked areas of campus. As it turns out, most floors at the University of Illinois are so clean you can eat off of them.

5. $435: The price that humiliated the Pentagon

Back in the 1980s, there was one simple way to win any argument about wasteful government spending -- just bring up the Pentagon's infamous $435 hammer. The absurdly priced tool, which made headlines in 1983 following the publication of a federal spending report, became a popular symbol of government excess.

The truth, however, is more complicated. Sure, there were invoices that showed the Pentagon shelling out $435 apiece for hammers, but the documents were more of a testament to the government's odd accounting practices than its wastefulness. Per Pentagon accounting rules, defense contractors were expected to spread their overhead costs evenly across products to simplify bookkeeping. As a result, massive expenses for things such as research and development and factory maintenance were averaged into the costs of everyday office supplies. That meant that while super-expensive items such as missiles came in cheaper on the register, the prices of small-ticket items such as hammers were distorted in the other direction. And because "Pentagon Gets Real Bargain on Missile!" makes a lousy headline, the media latched on to the $435 hammer story.

Since then, the Pentagon has changed its accounting rules, but it's still trying to live down the urban legend about the costly tools lurking in its overpriced toolbox.

6. 100 proof: The measurement that gets you drunk

Proof labels on alcohol bottles were born from the needs of sailors, who wanted assurances about the quality of their booze at sea. Beginning in 1731, members of the British Royal Navy were given an alcohol ration of half a pint of rum per day. (That practice continued, albeit with reduced quantities, until 1970.)

The men loved their rum, but they often became suspicious that their superiors were watering down the goods. To test the rum's potency, sailors would douse a small pile of gunpowder with the liquor and attempt to set it on fire. If the powder lit instantly, the sailors took it as "proof" that the rum was strong enough. But if the powder fizzled, the booze was deemed unfit to drink. Because spirits need to be at least 57.06 percent alcohol to combust, that threshold became known as "100 degrees proof."

The British system eventually made it across the pond, where Americans simplified the idea by redefining "proof" as twice the percentage of alcohol by volume. Sure, it's not as visually impressive as the sailors' method, but it beats having to take a handful of gunpowder into a bar with you.
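
The American convention reduces to a one-line formula; here is a small sketch in Python (the 57.06 percent gunpowder threshold is taken from the passage above):

```python
def us_proof(abv_percent):
    """U.S. convention: proof is twice the alcohol by volume."""
    return 2 * abv_percent

GUNPOWDER_THRESHOLD = 57.06  # percent ABV needed to ignite soaked gunpowder

def passes_navy_test(abv_percent):
    """The old British 'proof': strong enough to let gunpowder burn."""
    return abv_percent >= GUNPOWDER_THRESHOLD

print(us_proof(50))            # 100 -- a 50% ABV rum is 100 proof in the U.S.
print(passes_navy_test(57.1))  # True -- just over the Royal Navy threshold
```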

7. 1 in 195,249,054: Your odds of living on Easy Street

No matter how lucky you’re feeling, your odds of hitting the jackpot in the multi-state Powerball lottery are a don’t-spend-the-money 1 in 195 million. For perspective, your odds of being struck by lightning twice are much more reasonable, at 1 in 39 million.
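
That 1-in-195,249,054 figure falls straight out of counting combinations. Assuming the Powerball format of the era (five white balls drawn from 59, plus one red "Powerball" from 39), a quick check in Python:

```python
from math import comb

white = comb(59, 5)        # ways to choose 5 white balls from 59
combinations = white * 39  # times 39 choices for the red Powerball

print(f"1 in {combinations:,}")  # 1 in 195,249,054
```
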

Still, there are a few justifications for plunking down your hard-earned cash and crossing your fingers. For one thing, it puts you in terrific historical company. When the London Company had to scrape together funding for the Virginia colony in 1612, King James I authorized lotteries to raise capital. More than 150 years later, founding fathers Benjamin Franklin and George Washington ran lotteries to help finance the Revolutionary War and fund new infrastructure. The odds of winning weren’t great, but they beat taxation without representation.

Modern lottery players can’t brag that they’re backing George Washington, but their tickets still serve a civic duty. While disbursements of lottery funds vary across states, the games generally bolster schools’ coffers. For example, California sends its schools around 35 cents from every dollar of a ticket sold. These 35-cent increments add up; California’s schools have raked in more than $20 billion since the state’s lottery started in 1985.

Of course, as long as there’s been a lottery, there have been scoundrels trying to game the system. For the Powerball, cornering the market on the nearly 200 million potential combinations would be logistically impossible and risky. But that doesn’t mean smaller lotteries aren’t susceptible. In 1992, an accountant named Stefan Klincewicz put together a 28-person syndicate to buy up all 1.94 million potential combinations for the Irish lottery. Although lottery officials sniffed out the scheme and put a halt on ticket sales the day before the drawing, Klincewicz and his associates managed to snap up 80 percent of the available tickets. They walked away with roughly $1.8 million USD in winnings, and even though the crew had to split the loot and deduct expenses, they each turned a modest profit.

8. 6,894,200,000 people? The population we can't pin down

During the past century, we've really kicked our world-populating into high gear. In 1950, there were around 2.5 billion of us. Now that number is closer to 7 billion. How close? That's a question that plagues even the smartest thinkers. In order to know how many of us there will be in the future (and where to allocate program dollars to make sure those future folks are happy and healthy), we need to know how many of us there are right now.

Unfortunately, answering that question isn't as simple as lining up everyone for a head count. World-population estimates at any given moment are drawn from data collected in national censuses, but a country's census numbers might be several years old. Demographers can use that data to estimate current populations, but those calculations require assumptions about things like mortality, fertility, and migration rates. Additionally, a nation's census data isn't absolutely accurate even when it's fresh. The Chinese census, for instance, boasts a margin of error lower than two percent. That sounds great, until you realize that the discrepancy could represent as many as 27 million people -- or roughly one-and-a-half New York City metro areas -- who may or may not be living somewhere in China.
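
The China example is simple arithmetic. A rough check in Python, assuming a population of about 1.34 billion (a round figure for the period):

```python
CHINA_POPULATION = 1_340_000_000   # assumed round figure for the period
margin = CHINA_POPULATION * 0.02   # a 2 percent margin of error

print(f"{margin:,.0f} people")  # 26,800,000 -- close to the 27 million cited
```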

But none of these shortcomings stop groups from making bold proclamations. On October 12, 1999, the UN Population Fund symbolically named Bosnian baby Adnan Nevic the world's 6 billionth person. The U.S. Census Bureau snapped back, stating that Baby No. 6 Billion had probably been born four months earlier. Congrats to little Adnan's parents, though!

Thanks to all the assumptions required, future projections can vary wildly. In the past decade alone, UN demographers have estimated that the population will peak at 12 billion this century, only to later revise the estimate to 9 billion. With fluctuations like that, it's difficult to know what sort of population boom we should be bracing for.

9. The Dow at 14,165: The statistic that measures the health of our economy

Most Americans think of the Dow Jones Industrial Average as the canary of our financial coal mine. But what did it really mean when the Dow hit its record high of 14,165 in October 2007?

To answer that, you have to go back to Charles Dow, legendary newspaper mogul and co-founder of The Wall Street Journal. In 1896, Dow created the first version of the Dow Jones Industrial Average. The idea was to monitor the health of the business sector by tracking the performance of the country's 12 largest firms. The Dow was originally measured in dollars, and calculating it was a breeze; accountants just averaged the 12 stock prices. The first Industrial Average on record was $40.94. When the firms were doing well, that average went up; when they performed poorly, the Dow went down.

The measuring system has become more sophisticated over the years. The modern index includes 30 companies, and the Dow has to account for things like stock splits and spinoffs. Thanks to these adjustments, the Dow is now measured in points rather than dollars. A single dollar increase in any of its current members' share prices causes the Dow to rise by about seven points.
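
Because the Dow is price-weighted, the point impact of a $1 move is just the reciprocal of the divisor. A minimal sketch in Python, assuming a divisor of about 0.132 (roughly its value around 2010; the exact divisor changes with every split and substitution):

```python
DIVISOR = 0.132  # assumed value, roughly the Dow divisor circa 2010

def dow_index(share_prices, divisor=DIVISOR):
    """Price-weighted index: the sum of the share prices over the divisor."""
    return sum(share_prices) / divisor

points_per_dollar = 1 / DIVISOR
print(f"A $1 move in one stock shifts the Dow by about "
      f"{points_per_dollar:.1f} points")
```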

So, how does a company get into the Dow 30? It's a bit like rushing a financial fraternity. A three-person committee (which includes the managing editor of The Wall Street Journal) handpicks the companies, looking for stocks with strong reputations, solid growth, and interest from a broad pool of investors. Of the original 12 companies selected, only General Electric is still in the pool. In fact, the "industrial" in the average's name is a bit of a relic. The current incarnation of the Dow includes non-industrial companies such as American Express and The Home Depot. Still, by telling us how the biggest and most stable American companies are doing, the Dow remains one of the best indicators of the overall health of the U.S. economy.

10. 3.14159265 ...: The number that makes us all a little irrational

As the ratio of a circle's circumference to its diameter, pi is a mathematical constant. As an irrational number whose digits never repeat or terminate, pi is a constant source of amusement for math nerds of all stripes.

Computer programmers have even spent ridiculous amounts of time calculating pi out to its five trillionth decimal place (which is a 2, for the record).
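
You don't need a supercomputer to calculate a few digits yourself. A short sketch in Python using Machin's formula and integer arithmetic (a classic approach, not the far faster series behind the five-trillion-digit record):

```python
def pi_digits(n):
    """Return pi truncated to n decimal places as a string of digits
    (so pi_digits(5) -> '314159'), via Machin's formula:
    pi = 16*arctan(1/5) - 4*arctan(1/239)."""
    one = 10 ** (n + 10)  # fixed-point scale with 10 guard digits

    def arctan_inv(x):
        # arctan(1/x) scaled by `one`, summed via its Taylor series
        power = total = one // x
        x_squared = x * x
        sign, k = -1, 3
        while power:
            power //= x_squared
            total += sign * (power // k)
            sign, k = -sign, k + 2
        return total

    pi_scaled = 16 * arctan_inv(5) - 4 * arctan_inv(239)
    return str(pi_scaled // 10 ** 10)  # drop the guard digits

print(pi_digits(15))  # 3141592653589793
```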

If calculating decimal places isn't your idea of fun, you can always memorize them. The current unofficial world record belongs to Japan's Akira Haraguchi, who rattled off 100,000 decimal places in 2006. People who need help remembering digits often fall back on memorizing a "piem," a poem in which the number of letters in each word corresponds to pi's digits.
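
The piem convention is easy to check mechanically. A small sketch in Python, using the well-known mnemonic "How I wish I could calculate pi" (with the usual convention that a 10-letter word stands for the digit 0):

```python
PI_DIGITS = "31415926535"

def matches_pi(piem):
    """True if each word's letter count (mod 10, so 10-letter words
    stand for 0) matches the corresponding digit of pi."""
    counts = [len(word.strip(".,;:!?'\"")) % 10 for word in piem.split()]
    return all(str(c) == d for c, d in zip(counts, PI_DIGITS))

print(matches_pi("How I wish I could calculate pi"))  # True: 3.141592
```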

American mathematician Mike Keith's 2010 book Not a Wake (that's 3-1-4 letters, if you're counting at home) extends this exercise to 10,000 digits. If you start memorizing now, you'll be ready for next year's Pi Day, on March 14.

This article originally appeared in mental_floss magazine.

40 Fun Facts About Sesame Street

Now in its 47th season, Sesame Street is one of television's most iconic programs—and it's not just for kids. We're big fans of the Street, and to prove it, here are some of our favorite Sesame facts from previous stories and our Amazing Fact Generator.

1. Oscar the Grouch used to be orange. Jim Henson decided to make him green before season two.

2. How did Oscar explain the color change? He said he went on vacation to the very damp Swamp Mushy Muddy and turned green overnight.

3. During a 2004 episode, Cookie Monster said that before he started eating cookies, his name was Sid.

4. In 1980, C-3PO and R2-D2 visited Sesame Street. They played games, sang songs, and R2-D2 fell in love with a fire hydrant.

5. Mr. Snuffleupagus has a first name—Aloysius.

6. Ralph Nader stopped by in 1988 and sang "a consumer advocate is a person in your neighborhood."

7. Caroll Spinney said he based Oscar's voice on a cab driver from the Bronx who brought him to the audition.

8. In 1970, Ernie reached #16 on the Billboard Hot 100 with the timeless hit "Rubber Duckie."

9. One of Count von Count's lady friends is Countess von Backwards, who's also obsessed with counting but likes to do it backwards.

10. Sesame Street made its Afghanistan debut in 2011 with Baghch-e-Simsim (Sesame Garden). Big Bird, Grover and Elmo are involved.

11. According to Muppet Wiki, Oscar the Grouch and Count von Count were minimized on Baghch-e-Simsim "due to cultural taboos against trash and vampirism."

12. Before Giancarlo Esposito was Breaking Bad's super intense Gus Fring, he played Big Bird's camp counselor Mickey in 1982.

13. Thankfully, those episodes are available on YouTube.

14. How big is Big Bird? 8'2". (Pictured with First Lady Pat Nixon.)

15. In 2002, the South African version (Takalani Sesame) added an HIV-positive Muppet named Kami.

16. Six Republicans on the House Commerce Committee wrote a letter to PBS president Pat Mitchell warning that Kami was not appropriate for American children, and reminded Mitchell that their committee controlled PBS' funding.

17. Sesame Street's resident game show host Guy Smiley was using a pseudonym. His real name was Bernie Liederkrantz.

18. Bert and Ernie have been getting questioned about their sexuality for years. Ernie himself, as performed by Steve Whitmire, has weighed in: “All that stuff about me and Bert? It’s not true. We’re both very happy, but we’re not gay.”

19. A few years later, Bert (as performed by Eric Jacobson) answered the same question by saying, “No, no. In fact, sometimes we are not even friends; he can be a pain in the neck.”

20. In the first season, both Superman and Batman appeared in short cartoons produced by Filmation. In one clip, Batman told Bert and Ernie to stop arguing and take turns choosing what’s on TV.

21. In another segment, Superman battled a giant chimp.

22. Telly was originally "Television Monster," a TV-obsessed Muppet whose eyes whirled around as he watched.

23. According to Sesame Workshop, Elmo is the only non-human to testify before Congress.

24. He lobbied for more funding for music education, so that "when Elmo goes to school, there will be the instruments to play."

25. In the early 1990s, soon after Jim Henson’s passing, a rumor circulated that Ernie would be killed off in order to teach children about death, as they'd done with Mr. Hooper.

26. According to Snopes, the rumor may have spread thanks to a New Hampshire college student, Michael Tabor, who convinced his graduating class to wear “Save Ernie” beanies and sign a petition to persuade Sesame Workshop to let Ernie live.

27. By the time Tabor was corrected, the newspapers had already picked up the story.

28. Sesame Street’s Executive Producer Carol-Lynn Parente joined Sesame Workshop as a production assistant and has worked her way to the top.

29. Originally, Count von Count was more sinister. He could hypnotize and stun people.

30. According to Sesame Workshop, all Sesame Street's main Muppets have four fingers except Cookie Monster, who has five.

31. The episode with Mr. Hooper's funeral aired on Thanksgiving Day in 1983. That date was chosen because families were more likely to be together at that time, in case kids had questions or needed emotional support.

32. Mr. Hooper’s first name was Harold.

33. Big Bird sang "Bein' Green" at Jim Henson's memorial service.

34. As Chris Higgins put it, the performance was "devastating."

35. Oscar's Israeli counterpart is Moishe Oofnik, whose last name means “grouch” in Hebrew.

36. Nigeria's version of Cookie Monster eats yams. His catchphrase: "ME WANT YAM!"

37. Sesame's Roosevelt Franklin ran a school, where he spoke in scat and taught about Africa. Some parents hated him, so in 1975 he got the boot, only to inspire Gob Bluth’s racist puppet Franklin on Arrested Development 28 years later.

38. Our good friend and contributor Eddie Deezen was the voice of Donnie Dodo in the 1985 classic Follow That Bird.

39. Cookie Monster evolved from The Wheel-Stealer—a snack-pilfering puppet Jim Henson created to promote Wheels, Crowns and Flutes in the 1960s.

40. This puppet later was seen eating a computer in an IBM training film and on The Ed Sullivan Show.

Thanks to Stacy Conradt, Joe Hennes, Drew Toal, and Chris Higgins for their previous Sesame coverage!

An earlier version of this article appeared in 2012.

How Apple's '1984' Super Bowl Ad Was Almost Canceled

More than 30 years ago, Apple defined the Super Bowl commercial as a cultural phenomenon. Prior to Super Bowl XVIII, nobody watched the game "just for the commercials"—but one epic TV spot, directed by sci-fi legend Ridley Scott, changed all that. Read on for the inside story of the commercial that rocked the world of advertising, even though Apple's Board of Directors didn't want to run it at all.


If you haven't seen it, here's a fuzzy YouTube version:

"WHY 1984 WON'T BE LIKE 1984"

The tagline "Why 1984 Won't Be Like '1984'" references George Orwell's 1949 novel 1984, which envisioned a dystopian future, controlled by a televised "Big Brother." The tagline was written by Brent Thomas and Steve Hayden of the ad firm Chiat\Day in 1982, and the pair tried to sell it to various companies (including Apple, for the Apple II computer) but were turned down repeatedly. When Steve Jobs heard the pitch in 1983, he was sold—he saw the Macintosh as a "revolutionary" product, and wanted advertising to match. Jobs saw IBM as Big Brother, and wanted to position Apple as the world's last chance to escape IBM's domination of the personal computer industry. The Mac was scheduled to launch in late January of 1984, a week after the Super Bowl. IBM already held the nickname "Big Blue," so the parallels, at least to Jobs, were too delicious to miss.

Thomas and Hayden wrote up the story of the ad: we see a world of mind-controlled, shuffling men all in gray, staring at a video screen showing the face of Big Brother droning on about "information purification directives." A lone woman clad in vibrant red shorts and a white tank-top (bearing a Mac logo) runs from riot police, dashing up an aisle towards Big Brother. Just before being snatched by the police, she flings a sledgehammer at Big Brother's screen, smashing him just after he intones "We shall prevail!" Big Brother's destruction frees the minds of the throng, who quite literally see the light, flooding their faces now that the screen is gone. A mere eight seconds before the one-minute ad concludes, a narrator briefly mentions the word "Macintosh," in a restatement of that original tagline: "On January 24th, Apple Computer will introduce Macintosh. And you'll see why 1984 won't be like '1984.'" An Apple logo is shown, and then we're out—back to the game.

In 1983, in a presentation about the Mac, Jobs introduced the ad to a cheering audience of Apple employees:

"... It is now 1984. It appears IBM wants it all. Apple is perceived to be the only hope to offer IBM a run for its money. Dealers, initially welcoming IBM with open arms, now fear an IBM-dominated and -controlled future. They are increasingly turning back to Apple as the only force that can ensure their future freedom. IBM wants it all and is aiming its guns on its last obstacle to industry control: Apple. Will Big Blue dominate the entire computer industry? The entire information age? Was George Orwell right about 1984?"

After seeing the ad for the first time, the Apple audience totally freaked out (jump to about the 5-minute mark to witness the riotous cheering).


Chiat\Day hired Ridley Scott, whose 1982 sci-fi film Blade Runner had the dystopian tone they were looking for (and Alien wasn't so bad either). Scott filmed the ad in London, casting actual skinheads as the mute bald men—they were paid $125 a day to sit and stare at Big Brother; those who still had hair were paid to shave their heads for the shoot. Anya Major, a discus thrower and actress, was cast as the woman with the sledgehammer largely because she was actually capable of wielding the thing.

Mac programmer Andy Hertzfeld wrote an Apple II program "to flash impressive looking numbers and graphs on [Big Brother's] screen," but it's unclear whether his program was used for the final film. The ad cost a shocking $900,000 to film, plus Apple booked two premium slots during the Super Bowl to air it—carrying an airtime cost of more than $1 million.


Although Jobs and his marketing team (plus the assembled throng at his 1983 internal presentation) loved the ad, Apple's Board of Directors hated it. After seeing the ad for the first time, board member Mike Markkula suggested that Chiat\Day be fired, and the remainder of the board were similarly unimpressed. Then-CEO John Sculley recalled the reaction after the ad was screened for the group: "The others just looked at each other, dazed expressions on their faces ... Most of them felt it was the worst commercial they had ever seen. Not a single outside board member liked it." Sculley instructed Chiat\Day to sell off the Super Bowl airtime they had purchased, but Chiat\Day principal Jay Chiat quietly resisted. Chiat had purchased two slots—a 60-second slot in the third quarter to show the full ad, plus a 30-second slot later on to repeat an edited-down version. Chiat sold only the 30-second slot and claimed it was too late to sell the longer one. By disobeying his client's instructions, Chiat cemented Apple's place in advertising history.

When Apple co-founder Steve Wozniak heard that the ad was in trouble, he offered to pony up half the airtime costs himself, saying, "I asked how much it was going to cost, and [Steve Jobs] told me $800,000. I said, 'Well, I'll pay half of it if you will.' I figured it was a problem with the company justifying the expenditure. I thought an ad that was so great a piece of science fiction should have its chance to be seen."

But Woz didn't have to shell out the money; the executive team finally decided to run a 100-day advertising extravaganza for the Mac's launch, starting with the Super Bowl ad—after all, they had already paid to shoot it and were stuck with the airtime.

When the ad aired, controversy erupted—viewers either loved or hated it, and news shows replayed the ad as part of their coverage, generating an estimated $5 million in "free" airtime. All three national networks, plus countless local markets, ran news stories about the ad. "1984" became a cultural event, and served as a blueprint for future Apple product launches. The marketing logic was brilliantly simple: create an ad campaign that sparks controversy (for example, by insinuating that IBM is like Big Brother), and the media will cover your launch for free, amplifying the message.

The full ad famously ran once during Super Bowl XVIII (on January 22, 1984), but it also ran the month prior—on December 31, 1983, TV station operator Tom Frank ran the ad on KMVT at the last possible time slot before midnight, in order to qualify for 1983's advertising awards.* (Any awards the ad won would mean more media coverage.) Apple also paid to screen the ad in movie theaters before movie trailers, further heightening anticipation for the Mac launch. In addition, the 30-second version aired across the country after its Super Bowl debut.

Chiat\Day adman Steve Hayden recalled: "We ran a 30-second version of '1984' in the top 10 U.S. markets, plus, in an admittedly childish move, in an 11th market—Boca Raton, Florida, headquarters for IBM's PC division." Mac team member Andy Hertzfeld ended his remembrance of the ad by saying:

"A week after the Macintosh launch, Apple held its January board meeting. The Macintosh executive staff was invited to attend, not knowing what to expect. When the Mac people entered the room, everyone on the board rose and gave them a standing ovation, acknowledging that they were wrong about the commercial and congratulating the team for pulling off a fantastic launch.

Chiat\Day wanted the commercial to qualify for upcoming advertising awards, so they ran it once at 1 AM at a small television station in Twin Falls, Idaho, KMVT, on December 15, 1983 [incorrect; see below for an update on this -ed]. And sure enough it won just about every possible award, including best commercial of the decade. Twenty years later it's considered one of the most memorable television commercials ever made."


A year later, Apple again employed Chiat\Day to make a blockbuster ad for their Macintosh Office product line, which was basically a file server, networking gear, and a laser printer. Directed by Ridley Scott's brother Tony, the new ad was called "Lemmings," and featured blindfolded businesspeople whistling an out-of-tune version of Snow White's "Heigh-Ho" as they followed each other off a cliff (referencing the myth of lemming suicide).

Jobs and Sculley didn't like the ad, but Chiat\Day convinced them to run it, pointing out that the board hadn't liked the last ad either. But unlike the rousing, empowering message of the "1984" ad, "Lemmings" directly insulted business customers who had already bought IBM computers. It was also weirdly boring—when it was aired at the Super Bowl (with Jobs and Sculley in attendance), nobody really reacted. The ad was a flop, and Apple even proposed running a printed apology in The Wall Street Journal. Jay Chiat shot back, saying that if Apple apologized, Chiat would buy an ad on the next page, apologizing for the apology. It was a mess:


In 2004, the ad was updated for the launch of the iPod. The only change was that the woman with the hammer was now listening to an iPod, which remained clipped to her belt as she ran. You can watch that version too:


Chiat\Day adman Lee Clow gave an interview about the ad, covering some of this material.

Check out Mac team member Andy Hertzfeld's excellent first-person account of the ad. A similar account (but with more from Jobs's point of view) can be found in the Steve Jobs biography, and an even more in-depth account is in The Mac Bathroom Reader. The Mac Bathroom Reader is out of print; you can read an excerpt online, including QuickTime movies of the two versions of the ad, plus a behind-the-scenes video. Finally, you might enjoy this 2004 USA Today article about the ad, pointing out that ads for other computers (including Atari, Radio Shack, and IBM's new PCjr) also ran during that Super Bowl.

* = A Note on the Airing in 1983

Update: Thanks to Tom Frank for writing in to correct my earlier misstatement about the first air date of this commercial. As you can see in his comment below, Hertzfeld's comments above (and the dates cited in other accounts I've seen) are incorrect. Stay tuned for an upcoming interview with Frank, in which we discuss what it was like running both "1984" and "Lemmings" before they were on the Super Bowl!

Update 2: You can read the story behind this post in Chris's book The Blogger Abides.

This post originally appeared in 2012.

