What's the Difference Between a Broadway and Off-Broadway Show?

iStock/RightFramePhotoVideo

Over the years, there's been a lot of debate about what should and shouldn't count as a Broadway play or musical. Still, it's widely agreed that, in order to qualify, a production needs to run at a Broadway theater.

In general, a Broadway theater is defined as one that's located in Manhattan and seats at least 500 people. (Actually being located on Broadway is not a requirement.) Those on the island with 100 to 499 seats are regarded as "Off-Broadway" venues. Meanwhile, establishments with 99 seats or fewer are deemed "Off-Off-Broadway."

If the facility hosts concerts and dance shows more often than it does plays or musicals, it isn't considered a Broadway theater, regardless of the seating situation. Because of this, Carnegie Hall doesn't make the cut—even though the main auditorium has way more than 500 seats (2,804, to be precise).

How many Broadway theaters are in Manhattan proper? The industry's national trade association, the Broadway League, currently recognizes just 41 legitimate Broadway theaters—with the majority sitting between West 40th and West 53rd Streets in Midtown Manhattan. By comparison, Off-Broadway and Off-Off-Broadway stages are more widely dispersed throughout New York City.

Every year, the Broadway League joins forces with the American Theatre Wing to administer one of the Big Apple's biggest celebrations: The Tony Awards. To be eligible for these prizes, a show must open at a Broadway League-certified Broadway theater at some point in the current season before a designated cut-off date (which for this year was April 25).

Given these rules, the Awards completely ignore Off-Broadway productions. But this doesn't mean that you should. Some of the most popular shows ever conceived started out at Off-Broadway venues. For example, the original production of Little Shop of Horrors opened in 1982 and ran for five years without ever making it to the Great White Way—although a Broadway revival did pop up in 2003.

For many productions, Off-Broadway is a stepping stone. Just a few months after opening at smaller theaters, Hair, A Chorus Line, and, more recently, Hamilton all made the jump to a Broadway stage.

That transition isn't always easy. Often, new sets have to be built and, sometimes, key players have to be re-cast. Furthermore, as producer Gerald Schoenfeld told Playbill in 2008, the Off-Broadway venue where it all began won't want to be left "high and dry" after the show leaves. "[You'll] probably have to make arrangements with the originating theater," he said, "which probably would require a royalty and possible percentage of net profits."

Broadway productions also come with much higher price tags. When you factor in things like talent fees, rehearsals, and marketing, the average Broadway play costs millions of dollars to produce. An estimate from The New York Times says a Broadway show costs "at least $2.5 million to mount," while larger-scale musicals fall in the $10 million to $15 million range. Playbill broke down the costs of staging the 2013 Tony-winning musical Kinky Boots, which took $13.5 million to get off the ground.

Unsurprisingly, it's become quite difficult to turn a profit on the Great White Way. According to the Broadway League, only one in five Broadway shows breaks even. Furthermore, those lucky few that actually make money have to run for an average of two years before doing so.

As they say, there's no business like show business …

This story was updated in 2019.

Why Do Students Get Summers Off?

Iam Anupong/iStock via Getty Images

It’s commonly believed that school kids started taking summers off in the 19th century so that they’d have time to work on the farm. Nice as that story is, it isn’t true. Summer vacation has little to do with tilling fields and more to do with sweaty, rich city kids playing hooky—and their sweaty, rich parents.

Before the Civil War, farm kids never had summers off. They went to school during the hottest and coldest months and stayed home during the spring and fall, when crops needed to be planted and harvested. Meanwhile, city kids hit the books all year long—summers included. In 1842, Detroit’s academic year lasted 260 days.

But as cities got denser, they got hotter. Endless lanes of brick and concrete transformed urban blocks into kilns, thanks to what’s known as the “urban heat island effect.” That’s when America’s swelling middle- and upper-class families started hightailing it to the cooler countryside. And that caused a problem. School attendance wasn’t mandatory back then, and classrooms were being left half-empty each summer. Something had to give.

Legislators, in one of those if-you-can’t-beat-‘em-join-‘em moments, started arguing that kids should get summers off anyway. It helped that, culturally, leisure time was becoming more important. With the dawn of labor unions and the eight-hour workday, working adults were getting more time to themselves than ever before. Advocates for vacation time also argued (incorrectly) that the brain was a muscle and, like any muscle, could suffer injuries if overused; by that logic, sending students to school year-round risked straining their brains. To top it off, air conditioning was decades away, and city schools during summertime were miserable, half-empty ovens.

So by the turn of the century, urban districts had managed to cut about 60 schooldays from the most sweltering part of the year. Rural schools soon adopted the same pattern so they wouldn’t fall behind. Business folks obviously saw an opportunity here: The summer vacation biz soon ballooned into the multibillion-dollar industry it is today.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.

Where Did the Term Brownie Points Come From?

bhofack2/iStock via Getty Images

In a Los Angeles Times column published on March 15, 1951, writer Marvin Miles observed a peculiar phrase spreading throughout his circle of friends and the social scene at large. While standing in an elevator, he overheard the man next to him lamenting “lost brownie points.” Later, in a bar, a friend of Miles's who had stayed out too late said he would never “catch up” on his brownie points.

Miles was perplexed. “What esoteric cult was this that immersed men in pixie mathematics?” he wrote. It was, his colleagues explained, a way of keeping “score” with their spouses, of tallying the goodwill they had accrued with the “little woman.”

Over the decades, the phrase brownie points has become synonymous with currying favor, often with authority figures such as teachers or employers. So where exactly did the term come from, and what happens when you “earn” them?

The most pervasive explanation is that the phrase originated with the Brownies, a junior division of the Girl Scouts whose members were encouraged to perform good deeds in their communities. The Brownies were often too young to be official Girl Scouts and were sometimes the siblings of older members. Originally called Rosebuds in the UK, they were renamed Brownies when the first troops were being organized in 1916. Sir Robert Baden-Powell, who had formed the Boy Scouts and was asked to name this new Girl Scout division, dubbed them Brownies after the magical creatures of Scottish folklore that materialized to selflessly help with household chores.

But the Brownies are not the only potential source. In the 1930s, kids who signed up to deliver magazines like The Saturday Evening Post and Ladies' Home Journal from Curtis Publishing were eligible for vouchers labeled greenies and brownies that they could redeem for merchandise. They were not explicitly dubbed brownie points, but it’s not hard to imagine kids applying a points system to the brownies they earned.

The term could also have been the result of wartime rationing in the 1940s, when red and brown ration points could be redeemed for meats.

The phrase didn’t really seem to pick up steam until Miles's column was published. In this context, the married men speaking to Miles believed brownie points could be collected by husbands who remembered birthdays and anniversaries, stopped to pick up the dry cleaning, mailed letters, and didn’t spend long nights in pubs speaking to newspaper columnists. The goal, these husbands explained, was never to get ahead; they merely wanted to be considered somewhat respectable in the eyes of their wives.

Later, possibly as a result of its usage in print, grade school students took the phrase to mean an unnecessary devotion to teachers in order to win them over. At a family and faculty meeting at Leon High in Tallahassee, Florida, in 1956, earning brownie points was said to be a serious problem. Also called apple polishing, the practice prompted students to shame classmates for being friendly to teachers. As a result, some were “reluctant to be civil” for fear they would be harassed for sucking up.

In the decades since that time, the idiom has become attached to any act where goodwill can be expected in return, particularly if it’s from someone in a position to reward the act with good grades or a promotion. As for Miles: the columnist declared his understanding of brownie points came only after a long night of investigation. Arriving home late, he said, rendered him “pointless.”

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.
