How Do Generations Get Their Names?

We all know what a Millennial is. There are stereotypes about what Millennials do and do not like, how lazy they may or may not be, and how often they check their Twitter feeds, all because we're comfortable using this single term to refer to an entire age demographic. Millennial is a powerful word, not because of the age group it refers to, but because of just how useful it is, just like Gen X or Baby Boomer.

There is no single or even typical way that generations historically get their names, because lumping everyone who's roughly the same age together is a relatively new phenomenon.

According to Peter Francese, a demographic and consumer markets expert, Baby Boomers were the first generation to be named in its own time. (Those that came earlier, like the Greatest Generation that fought in World War II, were named retroactively.) It all started when the Census Bureau referred to the years between 1946 and 1964, during which annual births rocketed from around 3 million to over 4 million, as the "Post War Baby Boom." As the kids born in this boom started to grow into adults (and thus consumers), ad agencies found traction by marketing their products to so-called Baby Boomers. This would be the first (and, so far, only) time a generation's "official" name came from a government organization.

Eventually—as will inevitably happen to all of us, even the most maturity-challenged Millennials—the Baby Boomers got older and thus less appealing to companies with something to sell. The ad agencies wanted another catch-all term for the new members of their target age group and began shopping around different terms.

"They throw stuff at the wall and see what sticks," Francese says. "And in some of the meetings, they don’t stick." That's how Generation Y, a proto-term for Millennials, went in and out of fashion. "Generation Y was too difficult to say, too hard to brand, it didn’t have the cachet, it didn’t have the spark of Millennials," Francese says.

Whether a term sticks is a matter of whether media organizations start using it. And not just any media organization. "I’m talking about the Associated Press or Reuters—people who are syndicated that produce lots and lots of editorial content that they send out to various organizations," Francese says. As for determining the dates for Millennials, it all came down to demographics and the old rule of comparing apples to apples.

"In 2010, which is when they did the census, Baby Boomers were all 45 to 64 years old," Francese explains. "Now, in order to compare Millennials to the Baby Boomers, because they're the next boom, you have to have what? Twenty years. And so in 2010, Millennials are people between 15 and 34. And then they work back from there to figure out when they were born."

If it seems like we're skipping over a generation, that's because we are. And for the most part, ad agencies did too. In 1991, Douglas Coupland wrote his book Generation X: Tales for an Accelerated Culture about the anonymity he and his contemporaries felt growing up in the shadow of the Baby Boomers. They were products of a 10- to 12-year downturn in birthrates sandwiched between the Boomers and the Millennials, and although the term stuck with the general population, the generation was the wrong size to matter much to marketers.

It seems unlikely ad agencies will take such a passive approach again.

"The ad agencies have a mission and an imperative to bring to their clients news of what’s going on in the marketplace," Francese says. " And so, inevitably, they segment the American populations into various groups. The necessity to do that means that they sit around and they come up with names."

The generation currently being born and growing up—the term Generation Z has often been used as a placeholder, though the Pew Research Center recently redefined them as Post-Millennials—is just beginning to acquire consumer value, and will become more powerful in the coming years. When that happens, ad agencies will have a perfectly workshopped label ready to slap on spending reports and style section columns.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.

What's the Difference Between a College and a University?

Going off to college is a milestone in any young adult’s life. The phrase itself conjures up images of newfound independence, exposure to new perspectives, knowledge, and possibly even one or more sips of alcohol.

In America, however, few people use the phrase “going off to university,” or “headed to university,” even if they are indeed about to set off for, say, Harvard University. Why did college become the predominant term for postsecondary education? And is there any difference between the two institutions?

While university appears to be the older of the two terms, dating as far back as the 13th century, schools and students in North America have embraced college to describe most places of higher learning. There is no rigid definition of either word, but there are some general attributes for each. A college is typically a four-year school that offers undergraduate degrees like an associate's or a bachelor's. (Community colleges are often two-year schools.) Colleges don't typically offer master's degrees or doctorates, and they tend to have smaller student bodies than universities.

Universities, on the other hand, tend to offer both undergraduate and graduate programs leading to advanced degrees, and they serve a larger group of students. They can also comprise several schools, referred to as colleges, under one umbrella: a university could offer both a school of arts and sciences and a school of business. The University of Michigan has a College of Engineering, for example.

While many of these traits are common, they're not guaranteed. Some colleges can be bigger than universities, some might offer master's degrees, and so on. To complicate matters further, an institution that fits the criteria of a university might choose to call itself a college. Both Dartmouth College and Boston College qualify as universities but use the college label owing to tradition. Schools may begin as colleges and grow into universities, yet retain the original name.

People tend to think of a university as being more prestigious or harder to get into, but there are too many variables to make that determination at a glance. Some colleges might ask more of applicants than universities. Some universities might be smaller than certain colleges. Either one can be public or private.

Things get a little more convoluted abroad. In the UK, students go off to university (or uni) instead of college. The British version of college is typically a two-year program where students either focus on learning one particular skill set (much like a vocational school) or use the time to prepare for exams so that they can advance to university. Language matters, too; in Spanish, colegio usually refers to high school.

While the terms aren't strictly interchangeable, there is enough of a difference between the two to make the distinction worth observing. Keep in mind that some states, like New Jersey, have rules about how institutions label themselves. There, a university has to have at least three fields of graduate study leading to advanced degrees.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.

Why Do We Eat Candy on Halloween?

On October 31, hordes of children armed with jack-o'-lantern-shaped buckets and pillowcases will take to the streets in search of sugar. Trick-or-treating for candy is synonymous with Halloween, but the tradition went through a centuries-long evolution to arrive at its current form. So how did the holiday become an opportunity for kids to get free sweets? You can blame pagans, Catholics, and candy companies.

Historians agree that a Celtic autumn festival called Samhain was the precursor to modern Halloween. Samhain was a time to celebrate the last harvest of the year and the approach of the winter season. It was also a festival for honoring the dead. One way the Celts may have appeased the spirits they believed still walked the Earth was by leaving treats on their doorsteps.

When Catholics infiltrated Ireland in the 5th century CE, they rebranded many pagan holidays to fit their religion. November 1 became the feast of All Saints (followed by All Souls on November 2), and the day before it was dubbed "All Hallows' Eve." The new holidays looked a lot different from the original Celtic festival, but many traditions stuck around, including the practice of honoring the dead with food. The food of choice for Christians became "soul cakes," small pastries usually baked with expensive ingredients and spices like currants and saffron.

Instead of being left outside for passing ghosts, soul cakes were distributed to beggars who went door-to-door promising to pray for the souls of the deceased in exchange for something to eat. The beggars sometimes wore costumes to honor the saints, something pagans originally did to avoid being harassed by evil spirits. The ritual, known as souling, is believed to have planted the seeds of modern-day trick-or-treating.

Souling didn't survive the holiday's migration from Europe to the United States. In America, the first Halloween celebrations were a way to mark the end-of-year harvest season, and the food that was served mainly consisted of homemade seasonal treats like caramel apples and mixed nuts. There were no soul cakes—or candies, for that matter—to be found.

It wasn't until the 1950s that trick-or-treating gained popularity in the U.S. Following the Great Depression and World War II, the suburbs were booming, and people were looking for excuses to have fun and get to know their neighbors. The old practice of souling was resurrected and made into an excuse for kids to dress up in costumes and roam their neighborhoods. Common trick-or-treat offerings included nuts, coins, and homemade baked goods ("treats" that most kids would turn their noses up at today).

That changed when the candy companies got their hands on the holiday. They had already convinced consumers that they needed candy on Christmas and Easter, and they were looking for an equally lucrative opportunity to market candy in the fall. The new practice of trick-or-treating was almost too good to be true. Manufacturers downsized candies into smaller, bite-sized packages and began marketing them as treats for Halloween. Adults were grateful to have a convenient alternative to baking, kids loved the sweet treats, and the candy companies made billions.

Today, it's hard to imagine Halloween without Skittles, chocolate bars, and the perennial candy corn debates. But when you're digging through a bag or bowl of Halloween candy this October, remember that you could have been eating soul cakes instead.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.
