Why Do Coins Have Ridges?


The stylish rims you might have noticed on U.S. dimes, quarters, half dollars, and some dollar coins are called reeded edges. They’ve been on American coinage almost since day one as a way of keeping people honest.

The United States Mint built its first minting facility in Philadelphia in 1792. The following March, it produced its first batch of circulating coins: 11,178 copper cents. The silver coins that soon followed were tied to a silver standard under the Coinage Act of 1792. This meant the “major” coins were at least partly made of the precious metal (the first dollar coin, from 1794, was 89.25% silver and 10.75% copper). Silver dollars contained about a dollar’s worth of silver, give or take, and the smaller denominations (half dollars, quarters, and dimes) had proportionate metallic content and size: a half dollar contained half as much silver as a dollar and weighed half as much, a quarter contained a quarter as much, and so on.
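To make that proportionality concrete, here’s a minimal sketch in Python. The 371.25-grain pure-silver figure for the dollar comes from the Coinage Act of 1792 (which is also where the 89.25% fineness falls out: 371.25 of 416 total grains); the denomination list and formatting are purely illustrative.

    # Silver content by denomination, scaled from the dollar's statutory
    # 371.25 grains of pure silver (Coinage Act of 1792).
    PURE_SILVER_GRAINS_PER_DOLLAR = 371.25
    GRAMS_PER_GRAIN = 0.06479891  # exact modern definition of the grain

    denominations = {"dollar": 1.00, "half dollar": 0.50,
                     "quarter": 0.25, "dime": 0.10}

    for name, face_value in denominations.items():
        grains = PURE_SILVER_GRAINS_PER_DOLLAR * face_value
        grams = grains * GRAMS_PER_GRAIN
        print(f"{name:>11}: {grains:7.3f} grains (~{grams:.2f} g) of pure silver")

Run it and the half dollar comes out to 185.625 grains (about 12 grams) of pure silver, exactly half the dollar’s 371.25.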

Reeded edges served a two-fold security purpose for silver coins. First, they added an intricate element that made the coins more difficult to counterfeit. Second, they prevented fraud.

How do ridges prevent fraud?

For as long as coins have been made from precious metals, a fairly common way to make a quick, ill-gotten buck has been coin clipping. Clippers would shave a tiny amount of metal all the way around the rims of a bunch of coins, collect the shavings, then sell them. Working carefully, a coin clipper could trim enough off the coins to make a nice profit, but not so much as to make them noticeably lighter or smaller. A clipper could then still go out and spend his devalued coins as if they were unaltered. Reeded edges ruined this scheme, since a shaved edge would be immediately obvious, alerting anyone who received the coin that something was wrong.

Why don't nickels and pennies have reeded edges?

Nickels and pennies are mainly composed of inexpensive metals, so the chances that they would be tampered with are low.

Reeded edges were in use in England well before the U.S. Mint adopted them. When the physicist Isaac Newton became Warden of the Royal Mint in 1696, he used reeded edges, among other measures, to combat clippers and counterfeiters. Other European coins from as far back as the early 1500s also feature reeded edges.

Wait, are people still clipping coins?

Due to the abandonment of the silver standard and a worldwide silver shortage in the mid-20th century, the Coinage Act of 1965 authorized a change in the composition of dimes, quarters, and half dollars, gradually shrinking their silver content to the present-day 0%. Coin clipping is no longer a problem, but reeded edges are still around, a centuries-old security measure hanging on in an age when people pay for things with their smartphones instead of digging out pocket change. The tenacity is admirable. But why are they still there?

Coins are made by stamping coin blanks with a metal tool called a die. The die is engraved with the negative of a coin’s design, and the positive image is transferred to the blank when it’s stamped. During the strike, a ring called the collar holds the blank in place and applies the edge. When the silverless coins were first produced, the government didn’t see any need to make or buy expensive new equipment. Keeping the reeding wouldn’t hurt anyone, officials figured, so the new coins were struck with the same collars as before, and reeding continued as a matter of tradition and backward compatibility. Newer coins with updated designs (state quarters, new portraits) also have reeded edges. The design element lived to see another day because reeding is useful for distinguishing coins by feel as well as appearance, making them more user-friendly for the visually impaired.

I can't stand the suspense. How many ridges are on my quarter?

If you gather up a bunch of coins, you'll see that not all reeded edges are created equal. The number and size of reeds on coins are not dictated by law, so individual U.S. Mint facilities were long free to make their reeds to their own in-house specifications, leading to distinct style differences between coins from different mints and eras. Rare dimes from the now-defunct Carson City Mint’s 1871-74 runs, for example, have 89 broad, widely spaced reeds. The dimes made by the Philadelphia Mint in those same years have 113 thin, tightly spaced reeds.

Things are a little more standardized now, and the Mint lists its reeding specifications as follows: dimes, 118; quarters, 119; half dollars, 150; dollar, 198; Susan B. Anthony dollar, 133.

What's the Difference Between a College and a University?


Going off to college is a milestone in any young adult’s life. The phrase itself conjures up images of newfound independence, exposure to new perspectives, knowledge, and possibly even one or more sips of alcohol.

In America, however, few people use the phrase “going off to university,” or “headed to university,” even if they are indeed about to set off for, say, Harvard University. Why did college become the predominant term for postsecondary education? And is there any difference between the two institutions?

While university appears to be the older of the two terms, dating as far back as the 13th century, schools and students in North America have embraced college to describe most places of higher learning. There is no rigid definition of either word, but there are some general attributes of each. A college is typically a four-year school that offers undergraduate degrees, such as associate’s and bachelor’s degrees. (Community colleges are often two-year schools.) Colleges don’t typically offer master’s degrees or doctorates, and they tend to have smaller student bodies than universities.

Universities, on the other hand, tend to offer both undergraduate and graduate programs leading to advanced degrees, and they serve larger student bodies. They can also comprise several schools, referred to as colleges, under one umbrella: a university could offer both a school of arts and sciences and a school of business. The University of Michigan has a College of Engineering, for example.

While many of these traits are common, they’re not guaranteed. Some colleges can be bigger than universities, some might offer master’s degrees, and so on. To complicate matters further, an institution that fits the criteria of a university might choose to call itself a college. Both Dartmouth College and Boston College qualify as universities but use the college label owing to tradition. Schools may begin as colleges and grow into universities but retain the original name.

People tend to think of a university as being more prestigious or harder to get into, but there are too many variables to make that determination at a glance. Some colleges might ask more of applicants than universities. Some universities might be smaller than certain colleges. Either one can be public or private.

Things get a little more convoluted abroad. In the UK, students go off to university (or uni) instead of college. The British version of college is typically a two-year program where students either focus on learning one particular skill set (much like a vocational school) or use the time to prepare for exams so that they can advance to university. Language matters, too; in Spanish, colegio usually refers to high school.

While the terms aren’t strictly interchangeable, there is enough of a difference between the two to make the distinction worth attempting. Keep in mind that some states, like New Jersey, have rules about how institutions label themselves: there, a university has to have at least three fields of graduate study leading to advanced degrees.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.

Why Do We Eat Candy on Halloween?


On October 31, hordes of children armed with jack-o'-lantern-shaped buckets and pillowcases will take to the streets in search of sugar. Trick-or-treating for candy is synonymous with Halloween, but the tradition had to go through a centuries-long evolution to arrive where it is today. So how did the holiday become an opportunity for kids to get free sweets? You can blame pagans, Catholics, and candy companies.

Historians agree that a Celtic autumn festival called Samhain was the precursor to modern Halloween. Samhain was a time to celebrate the last harvest of the year and the approach of the winter season. It was also a festival for honoring the dead. One way the Celts may have appeased the spirits they believed still walked the Earth was by leaving treats on their doorsteps.

When Christianity reached Ireland in the 5th century CE, the Church rebranded many pagan holidays to fit its religion. November 1 became the feast of All Saints (with All Souls following on November 2), and the day before it was dubbed All Hallows' Eve. The new holidays looked a lot different from the original Celtic festival, but many traditions stuck around, including the practice of honoring the dead with food. The food of choice for Christians became "soul cakes," small pastries usually baked with expensive ingredients such as currants and saffron.

Instead of being left outside for passing ghosts, soul cakes were distributed to beggars who went door-to-door promising to pray for the souls of the deceased in exchange for something to eat. The beggars sometimes wore costumes to honor the saints, something pagans originally did to avoid being harassed by evil spirits. The ritual, known as souling, is believed to have planted the seeds for modern-day trick-or-treating.

Souling didn't survive the holiday's migration from Europe to the United States. In America, the first Halloween celebrations were a way to mark the end-of-year harvest season, and the food that was served mainly consisted of homemade seasonal treats like caramel apples and mixed nuts. There were no soul cakes—or candies, for that matter—to be found.

It wasn't until the 1950s that trick-or-treating gained popularity in the U.S. Following the Great Depression and World War II, the suburbs were booming, and people were looking for excuses to have fun and get to know their neighbors. The old practice of souling was resurrected and made into an excuse for kids to dress up in costumes and roam their neighborhoods. Common trick-or-treat offerings included nuts, coins, and homemade baked goods ("treats" that most kids would turn their noses up at today).

That changed when the candy companies got their hands on the holiday. They had already convinced consumers that they needed candy on Christmas and Easter, and they were looking for an equally lucrative opportunity to market candy in the fall. The new practice of trick-or-treating was almost too good to be true. Manufacturers downsized candies into smaller, bite-sized packages and began marketing them as treats for Halloween. Adults were grateful to have a convenient alternative to baking, kids loved the sweet treats, and the candy companies made billions.

Today, it's hard to imagine Halloween without Skittles, chocolate bars, and the perennial candy corn debates. But when you're digging through a bag or bowl of Halloween candy this October, remember that you could have been eating soul cakes instead.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.
