Why Isn't Fish Considered Meat During Lent?

iStock.com/Nataliia Mysak

For six Fridays each spring, Catholics observing Lent skip sirloin in favor of fish sticks and swap Big Macs for Filet-O-Fish. Why?

Legend has it that, centuries ago, a medieval pope with connections to Europe's fishing business banned red meat on Fridays to give the industry a boost. That story isn't true. Sunday school teachers have a more theological answer: Jesus fasted for 40 days and died on a Friday. Catholics honor both occasions by making a small sacrifice: avoiding animal flesh one day out of the week. That explanation is dandy for a homily, but it doesn't explain why only red meat and poultry are targeted and seafood is fine.

For centuries, the reason evolved with the fast. In the beginning, some worshippers only ate bread. But by the Middle Ages, they were avoiding meat, eggs, and dairy. By the 13th century, the meat-fish divide was firmly established—and Saint Thomas Aquinas gave a lovely answer explaining why: sex, simplicity, and farts.

In Part II of his Summa Theologica, Aquinas wrote:

"Fasting was instituted by the Church in order to bridle the concupiscences of the flesh, which regard pleasures of touch in connection with food and sex. Wherefore the Church forbade those who fast to partake of those foods which both afford most pleasure to the palate, and besides are a very great incentive to lust. Such are the flesh of animals that take their rest on the earth, and of those that breathe the air and their products."

Put differently, Aquinas thought fellow Catholics should abstain from eating land-dwelling animals because they were too darn tasty. Lent was a time for simplicity, and he suggested that everyone tone it down. It makes sense: in the 1200s, meat was a luxury, and eating something as decadent as beef was no way to observe a season centered on modesty. But Aquinas had another reason, too: He believed meat made you horny.

"For, since such like animals are more like man in body, they afford greater pleasure as food, and greater nourishment to the human body, so that from their consumption there results a greater surplus available for seminal matter, which when abundant becomes a great incentive to lust. Hence the Church has bidden those who fast to abstain especially from these foods."

There you have it. You can now blame those impure thoughts on a beef patty. (Aquinas might have had it backwards. According to the American Dietetic Association, red meat doesn't boost "seminal matter." Men trying to increase their sperm count are generally advised to cut back on meat. However, red meat does improve testosterone levels, so it's give-and-take.)

Aquinas gave a third reason to avoid meat—it won't give you gas. "Those who fast," Aquinas wrote, "are forbidden the use of flesh meat rather than of wine or vegetables, which are flatulent foods." Aquinas argued that "flatulent foods" gave your "vital spirit" a quick pick-me-up. Meat, on the other hand, boosts the body's long-lasting, lustful humors—a religious no-no.

But why isn't fish considered meat?

The reason is foggy. Saint Paul's first letter to the Corinthians, for one, has been used to justify fasting rules. Paul wrote, " … There is one kind of flesh of men, another flesh of beasts, another of fish, and another of birds" (15:39). That distinction was possibly borrowed from Judaism's own dietary restrictions, which separate fleishig (which includes land-dwelling mammals and fowl) from pareve (which includes fish). Neither the Torah, the Talmud, nor the New Testament clearly explains the rationale behind the divide.

It's arbitrary, anyway. In the 17th century, the Bishop of Quebec ruled that beavers were fish. In Latin America, it's OK to eat capybara—apparently also a fish—on Lenten Fridays. Churchgoers around Detroit can guiltlessly munch on muskrat every Friday. And in 2010, the Archbishop of New Orleans gave alligator the thumbs up when he declared, "Alligator is considered in the fish family."

Thanks to King Henry VIII and Martin Luther, Protestants don't have to worry about their Friday diet. When Henry ruled, fish was one of England's most popular dishes. But when the Catholic Church refused to grant the King a divorce, he broke with Rome, and consuming fish became a pro-Catholic political statement. Anglicans and the King's sympathizers made a point of eating meat on Fridays. Around that same time, Martin Luther declared that fasting was up to the individual, not the Church. Those attitudes hurt England's fishing industry so badly that, in 1547, Henry's son King Edward VI—who was just 10 at the time—tried to reinstate the fast to shore up the country's fishing economy. Some Anglicans picked the practice back up, but Protestants on the Continent didn't need to take the bait.

This story was updated in 2019.

What's the Difference Between a College and a University?

Chinnapong/iStock via Getty Images

Going off to college is a milestone in any young adult’s life. The phrase itself conjures up images of newfound independence, exposure to new perspectives, knowledge, and possibly even one or more sips of alcohol.

In America, however, few people use the phrase “going off to university,” or “headed to university,” even if they are indeed about to set off for, say, Harvard University. Why did college become the predominant term for postsecondary education? And is there any difference between the two institutions?

While university appears to be the older of the two terms, dating as far back as the 13th century, schools and students in North America have embraced college to describe most places of higher learning. There is no rigid definition of either word, but there are some general attributes for each. A college is typically a four-year school that offers undergraduate degrees like an associate's or a bachelor's. (Community colleges are often two-year schools.) Colleges don't typically offer master's degrees or doctorates, and their student bodies tend to be smaller than those of universities.

Universities, on the other hand, tend to offer both undergraduate and graduate programs leading to advanced degrees, and they serve larger student bodies. They can also comprise several schools—referred to as colleges—under their umbrella. A university could offer both a school of arts and sciences and a school of business; the University of Michigan, for example, has a College of Engineering.

While many of these traits are common, they're not guaranteed. Some colleges can be bigger than universities, some might offer master's degrees, and so on. To complicate matters further, an institution that fits the criteria of a university might choose to call itself a college: both Dartmouth College and Boston College qualify as universities but keep the college label out of tradition. Schools may begin as colleges and grow into universities, yet retain their original names.

People tend to think of a university as being more prestigious or harder to get into, but there are too many variables to make that determination at a glance. Some colleges might ask more of applicants than universities. Some universities might be smaller than certain colleges. Either one can be public or private.

Things get a little more convoluted abroad. In the UK, students go off to university (or uni) instead of college. The British version of college is typically a two-year program where students either focus on learning one particular skill set (much like a vocational school) or use the time to prepare for exams so that they can advance to university. Language matters, too; in Spanish, colegio usually refers to high school.

While the terms aren’t strictly interchangeable, there is enough of a difference between the two to try and make the distinction. Keep in mind that some states, like New Jersey, have rules about how institutions label themselves. There, a university has to have at least three fields of graduate study leading to advanced degrees.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.

Why Do We Eat Candy on Halloween?

Jupiterimages/iStock via Getty Images

On October 31, hordes of children armed with Jack-o'-lantern-shaped buckets and pillow cases will take to the streets in search of sugar. Trick-or-treating for candy is synonymous with Halloween, but the tradition had to go through a centuries-long evolution to arrive at the place it is today. So how did the holiday become an opportunity for kids to get free sweets? You can blame pagans, Catholics, and candy companies.

Historians agree that a Celtic autumn festival called Samhain was the precursor to modern Halloween. Samhain was a time to celebrate the last harvest of the year and the approach of the winter season. It was also a festival for honoring the dead. One way the Celts may have appeased the spirits they believed still walked the Earth was by leaving treats on their doorsteps.

When Christianity reached Ireland in the 5th century CE, the Church began rebranding many pagan holidays to fit its religion. November 1 became the Feast of All Saints (with All Souls' Day following on November 2), and the night before it was dubbed All Hallows' Eve. The new holidays looked a lot different from the original Celtic festival, but many traditions stuck around, including the practice of honoring the dead with food. The food of choice for Christians became "soul cakes," small pastries usually baked with relatively expensive ingredients like currants and saffron.

Instead of being left outside for passing ghosts, soul cakes were handed out to beggars who went door-to-door promising to pray for the souls of the deceased in exchange for something to eat. Sometimes the beggars wore costumes to honor the saints—something pagans originally did to avoid being harassed by evil spirits. The ritual, known as souling, is believed to have planted the seeds of modern-day trick-or-treating.

Souling didn't survive the holiday's migration from Europe to the United States. In America, the first Halloween celebrations were a way to mark the end-of-year harvest season, and the food that was served mainly consisted of homemade seasonal treats like caramel apples and mixed nuts. There were no soul cakes—or candies, for that matter—to be found.

It wasn't until the 1950s that trick-or-treating gained popularity in the U.S. Following the Great Depression and World War II, the suburbs were booming, and people were looking for excuses to have fun and get to know their neighbors. The old practice of souling was resurrected and made into an excuse for kids to dress up in costumes and roam their neighborhoods. Common trick-or-treat offerings included nuts, coins, and homemade baked goods ("treats" that most kids would turn their noses up at today).

That changed when the candy companies got their hands on the holiday. They had already convinced consumers that they needed candy on Christmas and Easter, and they were looking for an equally lucrative opportunity to market candy in the fall. The new practice of trick-or-treating was almost too good to be true. Manufacturers downsized candies into smaller, bite-sized packages and began marketing them as treats for Halloween. Adults were grateful to have a convenient alternative to baking, kids loved the sweet treats, and the candy companies made billions.

Today, it's hard to imagine Halloween without Skittles, chocolate bars, and the perennial candy corn debates. But when you're digging through a bag or bowl of Halloween candy this October, remember that you could have been eating soul cakes instead.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.
