Why Do Orchestras Tune to an A Note?

iStock

When orchestra members tune their instruments before a performance, it almost always sounds the same. That’s because across the world, most orchestras tune to the same A note, using a standard pitch of 440 hertz.

This is the result of international standards efforts dating back to the 19th century, according to WQXR, a classical music radio station in New York City. Today, the standard tuning frequency is set by the International Organization for Standardization (ISO), an international group that makes recommendations on everything from what safety labels should look like to how big the hole in a pen cap should be. A standard called ISO 16, first recommended in 1955 and confirmed in 1975, specifies that the frequency of the note A in the treble stave shall be 440 hertz.

The ISO didn’t pull that frequency out of thin air. During the Industrial Revolution, a rush toward standardization and universality led to multiple international meetings that aimed to bring orchestras all over the world to the same pitch. Standardizing pitch had important ramifications for the international music scene.

Historically, the pitch that orchestras tuned to could differ wildly depending on where the musicians were playing. “In the course of the last 400 years in Europe, the point that has been considered ideal for a reference pitch has fluctuated by some 5 or 6 semitones,” musicologist Bruce Haynes explained in his book, A History of Performing Pitch: The Story of ‘A.’ In the 17th century, a French performer might tune their instrument a whole tone lower than their German colleagues. The standards could even change from one town to the next, affecting how music written in one location might sound when played in another.

As a writer for London's The Spectator observed in 1859, “It is well known that when we are performing Handel's music (for example) from the very notes in which he wrote it, we are really performing it nearly a whole tone higher than he intended;—the sound associated in his ear with the note A, being nearly the same sound which, in our ear, is associated with the note G.”
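
To put numbers to phrases like “a whole tone lower” and “5 or 6 semitones,” here is a minimal Python sketch (an illustration, not something from the original article) using the standard equal-temperament relationship, in which each semitone corresponds to a frequency ratio of 2^(1/12):

```python
# Illustrative only: convert a semitone offset from A440 into a frequency,
# assuming modern equal temperament (each semitone = a ratio of 2 ** (1/12)).
def pitch_from_a440(semitones: float, reference: float = 440.0) -> float:
    """Frequency in hertz of a note `semitones` above (negative = below) the reference A."""
    return reference * 2 ** (semitones / 12)


if __name__ == "__main__":
    # "Nearly a whole tone" lower, i.e. about 2 semitones below A440:
    print(f"2 semitones below A440: {pitch_from_a440(-2):.1f} Hz")  # ~392.0 Hz, a modern G
    # A 6-semitone spread around A440, like the historical range Haynes describes:
    print(f"3 semitones below A440: {pitch_from_a440(-3):.1f} Hz")  # ~370.0 Hz
    print(f"3 semitones above A440: {pitch_from_a440(3):.1f} Hz")   # ~523.3 Hz
```

Two semitones below A440 lands at roughly 392 hertz (today's G), which is why an older, lower A could sound to modern ears like a G.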

In the 19th century, a commission established by the French government tried to analyze pitch across Europe by looking at the frequencies of the tuning forks musicians used as their reference while tuning their instruments. The commission gathered tuning forks from different cities, finding that most were pitched somewhere around 445 hertz. Over the years, due to bigger concert halls and more advanced instruments, pitch was rising across most orchestras, and instruments and voices were being strained as a result. So the commission recommended lowering the standard to what was known as “the compromise pitch.”

In 1859, acting on the commission’s recommendation, the French government legally established diapason normal, the standard pitch for the A above middle C, at 435 hertz. (More than a century later, the music world would still be debating whether pitch had risen too far.) In time, 435 hertz was enshrined as a standard elsewhere, too. In 1885, government representatives from Italy, Austria, Hungary, Prussia, Russia, Saxony, Sweden, and Württemberg met to establish their own international standard, agreeing on 435 hertz. The agreement was eventually written into the Treaty of Versailles in 1919.

But not everyone was on board with 435 hertz. The Royal Philharmonic Society in London believed the French pitch standard was pegged to a specific temperature—59°F—and decided to adjust their pitch upward to compensate for their concert halls being warmer than that, settling on 439 hertz. Meanwhile, in 1917, the American Federation of Musicians declared 440 hertz to be the standard pitch in the U.S.

In 1939, an international standards conference organized by the ISO’s predecessor met in London to agree on a concert pitch to be used across the world. A Dutch study of European pitch that year had found that while pitch varied across orchestras and countries, the average of those varied pitches was around 440 hertz. So it made sense to settle on A 440. Furthermore, radio broadcasters and technicians, such as those at the BBC, preferred A 440 to the English A 439 because 439 is a prime number and thus harder to reproduce in a laboratory: a prime frequency can’t easily be generated by dividing down a standard reference oscillator.

World War II delayed the official adoption of the 1939 agreement, but the ISO issued its A 440 recommendation in 1955 and confirmed it two decades later. A 440 was here to stay. That said, even now, pitch does vary a little depending on the musicians in question. The Vienna Philharmonic Orchestra, for instance, notably tunes to 443 hertz rather than the standard 440 hertz. While A 440 may be the official “concert pitch” across the world, in practice there is still a little wiggle room.
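
To quantify that wiggle room, the gaps between these tuning standards can be expressed in cents, hundredths of an equal-tempered semitone. Here is a small sketch (again, an illustration rather than anything from the article), using the standard formula cents = 1200 × log2(f / 440):

```python
import math


# Illustrative only: express a tuning frequency as a deviation from A440 in cents
# (1 cent = 1/100 of an equal-tempered semitone, so 100 cents = one semitone).
def cents_from_a440(freq_hz: float) -> float:
    return 1200 * math.log2(freq_hz / 440.0)


if __name__ == "__main__":
    for label, freq in [
        ("French diapason normal (1859)", 435.0),
        ("Royal Philharmonic (London)", 439.0),
        ("Vienna Philharmonic", 443.0),
    ]:
        print(f"{label}: {freq:.0f} Hz -> {cents_from_a440(freq):+.1f} cents vs. A440")
    # Roughly -19.8, -3.9, and +11.8 cents respectively, all a small
    # fraction of the 100 cents that make up a full semitone.
```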

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.

Where Did the Term Brownie Points Come From?

bhofack2/iStock via Getty Images

In a Los Angeles Times column published on March 15, 1951, writer Marvin Miles observed a peculiar phrase spreading throughout his circle of friends and the social scene at large. While standing in an elevator, he overheard the man next to him lamenting “lost brownie points.” Later, in a bar, a friend of Miles's who had stayed out too late said he would never “catch up” on his brownie points.

Miles was perplexed. “What esoteric cult was this that immersed men in pixie mathematics?” he wrote. It was, his colleagues explained, a way of keeping “score” with their spouses, of tallying the goodwill they had accrued with the “little woman.”

Over the decades, the phrase brownie points has become synonymous with currying favor, often with authority figures such as teachers or employers. So where exactly did the term come from, and what happens when you “earn” them?

The most pervasive explanation is that the phrase originated with the Brownies, a junior branch of the Girl Scouts whose members were encouraged to perform good deeds in their communities. The Brownies were often too young to be official Girl Scouts and were sometimes the siblings of older members. Originally called Rosebuds in the UK, they were renamed Brownies when the first troops were being organized in 1916. Sir Robert Baden-Powell, who had founded the Boy Scouts and was asked to name this new Girl Scout division, dubbed them Brownies after the magical creatures of Scottish folklore said to materialize and selflessly help with household chores.

But the Brownies are not the only potential source. In the 1930s, kids who signed up to deliver magazines like The Saturday Evening Post and Ladies' Home Journal from Curtis Publishing were eligible for vouchers labeled greenies and brownies that they could redeem for merchandise. They were not explicitly dubbed brownie points, but it’s not hard to imagine kids applying a points system to the brownies they earned.

The term could also have been the result of wartime rationing in the 1940s, when red and brown ration points could be redeemed for meats.

The phrase didn’t really seem to pick up steam until Miles's column was published. In this context, the married men speaking to Miles believed brownie points could be collected by husbands who remembered birthdays and anniversaries, stopped to pick up the dry cleaning, mailed letters, and didn’t spend long nights in pubs speaking to newspaper columnists. The goal, these husbands explained, was never to get ahead; they merely wanted to be considered somewhat respectable in the eyes of their wives.

Later, possibly as a result of its usage in print, grade school students took the phrase to mean an unnecessary devotion to teachers in order to win them over. At a family and faculty meeting at Leon High in Tallahassee, Florida, in 1956, earning brownie points was said to be a serious problem. Also called apple polishing, it prompted other students in class to shame their peers for being friendly to teachers. As a result, some were “reluctant to be civil” for fear they would be harassed for sucking up.

In the decades since that time, the idiom has become attached to any act where goodwill can be expected in return, particularly if it’s from someone in a position to reward the act with good grades or a promotion. As for Miles: the columnist declared his understanding of brownie points came only after a long night of investigation. Arriving home late, he said, rendered him “pointless.”

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.

Grocery Stores vs. Supermarkets: What’s the Difference?

gpointstudio/iStock via Getty Images

These days, people across the country are constantly engaging in regional term debates like soda versus pop and fireflies versus lightning bugs. Since these inconsistencies are so common, you might have thought the only difference between a grocery store and a supermarket was whether the person who mentioned one was from Ohio or Texas. In reality, there are distinctions between the stores themselves.

To start, grocery stores have been around for much longer than supermarkets. Back when every town had a bakery, a butcher shop, a greengrocery, and more, the grocery store offered townspeople an efficient shopping experience with myriad food products in one place. John Stranger, vice president group supervisor of the food-related creative agency EvansHardy+Young, explained to Reader’s Digest that the grocer would usually collect the goods for the patron, too. This process might sound familiar if you’ve watched old films or television shows, in which characters often just hand over their shopping lists to the person behind the counter. While our grocery store runs may not be quite so personal today, the contents of grocery stores remain relatively similar: food, drinks, and some household products.

Supermarkets, on the other hand, have taken the idea of a one-stop shop to another level, carrying a much more expansive array of foodstuffs as well as home goods, clothing, baby products, and even appliances. This is where it gets a little tricky, because supermarkets carry many of the same products as superstores (also sometimes referred to as hypermarkets), the next biggest fish in the food store chain.

According to The Houston Chronicle, supermarkets and superstores both order inventory in bulk and usually belong to large chains, whereas grocery stores order products on an as-needed basis and are often independently owned. Superstores, however, are significantly larger than either grocery stores or supermarkets, and they typically look more like warehouses. It’s not an exact science, and some people might have conflicting opinions about how to categorize specific stores. For example, Walmart has a line of Walmart Neighborhood Markets, which its website describes as “smaller-footprint option[s] for communities in need of a pharmacy, affordable groceries, and merchandise.” They’re not independently owned, but they do sound like grocery stores, especially compared to Walmart’s everything-under-the-sun superstore model.

Knowing the correct store terms might not always matter in casual conversation, but it could affect your credit card rewards earnings. American Express, for example, offers additional rewards on supermarket purchases, and it has a specific list of stores that qualify as supermarkets, including Gristedes, Shoprite, Stop & Shop, and Whole Foods. Target and Walmart, on the other hand, are both considered superstores, so you won’t earn bonuses on those purchases.

And, since grocery shopping at any type of store can sometimes seem like a competitive sport, here’s the ideal time to go.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.
