Why Do Baseball Pitchers Stand on a Mound?

iStock

Why do baseball pitchers stand on a mound?

Charles Tips:

1884 was a banner year in professional baseball.

  • It was the first year pitchers could legally pitch overhand.
  • It was the season that set the stage for the World Series.
  • It was the year that baseball gloves made their debut.
  • And Charlie “Old Hoss” Radbourn set the most unassailable record in baseball: 59, 60, or 62 wins as a pitcher (62 counting his post-season victories), depending on how the era’s rules are interpreted. That was in a 112-game season.

Old Hoss’s Providence Grays won the National League with a record of 84 and 28 over the runner-up Boston Beaneaters at 73 and 38. They then swept the New York Metropolitans, champions of the American Association, three games to none at the Polo Grounds in a series billed by tabloids as the first “world championship” of baseball. Old Hoss recorded all three wins.

That’s Old Hoss Radbourn pictured above, I believe from the ’86 season after many of the Grays were acquired by the Beaneaters. As the picture hints, Radbourn held a second historical distinction, the first man photographed, not once but twice, unambiguously “shooting the finger.” He was a legendarily fierce competitor.

This was bare-knuckle, bare-hand baseball. There were no relief pitchers. In fact, the rules forbade substitutions for any player who was not essentially incapacitated. If you started the game, you finished it, even if it went to extra innings.

The changes that began in 1884, especially the legalization of overhand pitching, reverberated through baseball to produce the modern game. They soon led to the pitcher’s mound, and a lot more besides.

Hello, Pitcher’s Box

Yes, Old Hoss pitched underhand, though occasionally overhand in 1884; think of the delivery of a Kent Tekulve, Dennis Eckersley, Dan Quisenberry, Chad Bradford, or Byung-Hyun Kim to get a better idea of what batters faced.

He, like all other pitchers of the day, pitched from a box using a run-up. The box was level with the field, 4 feet wide and 6 feet long. The front of the box was a mere 50 feet from the plate.

Bye-Bye, Upper Strike Zone

One of the curiosities of baseball is the strike zone.

Rule 2.00: The Strike Zone

The STRIKE ZONE is that area over home plate the upper limit of which is a horizontal line at the midpoint between the top of the shoulders and the top of the uniform pants, and the lower level is a line at the hollow beneath the kneecap. The Strike Zone shall be determined from the batter's stance as the batter is prepared to swing at a pitched ball.

The rules clearly state to this day that the upper limit of the strike zone extends to the middle of the chest; yet, as every fan knows, umpires won’t call a strike much above the belt, if that. What gives?

What gives is that in 1884 there were two strike zones—upper and lower. On taking his turn at bat, a batter would inform the umpire (there was only one per game then, which also led to some interesting baseball rules) which zone to call, and the umpire would duly inform the pitcher. As overhand pitchers grew to dominate, the upper strike zone fell out of use.

Hello, Gloves

By 1884, protective masks had been around for a year or two, worn by a few umpires and catchers. However, a hard-hit foul ball straight to the mask would often snap the fencing wire used to fashion these homemade affairs, lacerating the wearer’s face. They were not widely adopted.

1889 photo of the hands of retired bare-handed catcher Doug Allison.

But toward the latter part of the 1884 season, Grays’ second baseman Jack Farrell broke two fingers on his non-throwing hand, leading him to make a cushioned leather glove so that he could continue to play. Because he was a star player on the championship team, young players began imitating him the following season, despite the derision of their teammates. Within a few seasons, mitts, gloves, and proper chest protectors and masks for catchers and umpires were standard equipment.

Bye-Bye, Pitcher’s Box; Hello, Mound

In 1893, the pitcher’s box was replaced by a pitcher’s rubber, an actual slab of rubber a foot wide, set back 60 feet 6 inches from the plate. The rubber could be placed on a mound raised above field level.

Overhand pitching had come to so dominate baseball that it was felt the added distance, together with the loss of the run-up, would re-balance offense and defense. Sure enough, the league batting average shot up 39 points in ’93 and another 29 points in ’94. But some pitchers wanted the mound quite high, and by 1904 the rules were changed to limit mound height to no more than 15 inches.

It was not long before teams were gaming the discretion allowed for mound height. “Downhill” pitchers preferred the mound as tall as possible; submariners, on the other hand, preferred it level. The Yankees kept theirs level at all times, but other teams took to rebuilding their mound daily to favor the home team’s starter, no small undertaking. I believe it was the Cleveland Indians under Bill Veeck that finally provoked MLB in 1950 to implement a uniform 15-inch rule: all mounds raised exactly 15 inches above the playing field, period.

That, however, put a premium on the downhill pitching style of pitchers like Bob Feller and Bob Gibson. Pretty soon, a generation of dominating downhillers had squelched offense again. Before the 1969 season, MLB lowered all mounds to 10 inches, a move that did get offenses going again, which in turn seemed to please the fans, leading four years later to the last big rule change: the Designated Hitter, in the American League only.

This post originally appeared on Quora.

Where Did the Term Brownie Points Come From?

bhofack2/iStock via Getty Images

In a Los Angeles Times column published on March 15, 1951, writer Marvin Miles observed a peculiar phrase spreading throughout his circle of friends and the social scene at large. While standing in an elevator, he overheard the man next to him lamenting “lost brownie points.” Later, in a bar, a friend of Miles's who had stayed out too late said he would never “catch up” on his brownie points.

Miles was perplexed. “What esoteric cult was this that immersed men in pixie mathematics?” he wrote. It was, his colleagues explained, a way of keeping “score” with their spouses, of tallying the goodwill they had accrued with the “little woman.”

Over the decades, the phrase brownie points has become synonymous with currying favor, often with authority figures such as teachers or employers. So where exactly did the term come from, and what happens when you “earn” them?

The most pervasive explanation is that the phrase originated with the Brownies, a junior branch of the Girl Scouts whose members were encouraged to perform good deeds in their communities. The Brownies were often too young to be official Girl Scouts and were sometimes the siblings of older members. Originally called Rosebuds in the UK, they were renamed Brownies when the first troops were being organized in 1916. Sir Robert Baden-Powell, who had founded the Boy Scouts and was asked to name the new Girl Scout division, dubbed them Brownies after the magical creatures of Scottish folklore who materialized to selflessly help with household chores.

But the Brownies are not the only potential source. In the 1930s, kids who signed up to deliver magazines like The Saturday Evening Post and Ladies' Home Journal for Curtis Publishing were eligible for vouchers, labeled greenies and brownies, that they could redeem for merchandise. The vouchers were not explicitly dubbed brownie points, but it’s not hard to imagine kids applying a points system to the brownies they earned.

The term could also have grown out of wartime rationing in the 1940s, when red and brown ration points could be redeemed for meat.

The phrase didn’t really seem to pick up steam until Miles's column was published. In this context, the married men speaking to Miles believed brownie points could be collected by husbands who remembered birthdays and anniversaries, stopped to pick up the dry cleaning, mailed letters, and didn’t spend long nights in pubs speaking to newspaper columnists. The goal, these husbands explained, was never to get ahead; they merely wanted to be considered somewhat respectable in the eyes of their wives.

Later, possibly as a result of its usage in print, grade school students took the phrase to mean an unnecessary devotion to teachers in order to win them over. At a family and faculty meeting at Leon High in Tallahassee, Florida, in 1956, earning brownie points was said to be a serious problem. Also called apple polishing, it prompted other students in class to shame their peers for being friendly to teachers. As a result, some were “reluctant to be civil” for fear they would be harassed for sucking up.

In the decades since that time, the idiom has become attached to any act where goodwill can be expected in return, particularly if it’s from someone in a position to reward the act with good grades or a promotion. As for Miles: the columnist declared his understanding of brownie points came only after a long night of investigation. Arriving home late, he said, rendered him “pointless.”

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.

Grocery Stores vs. Supermarkets: What’s the Difference?

gpointstudio/iStock via Getty Images

These days, people across the country are constantly engaging in regional term debates like soda versus pop and fireflies versus lightning bugs. Since these inconsistencies are so common, you might have thought the only difference between a grocery store and a supermarket was whether the person who mentioned one was from Ohio or Texas. In reality, there are distinctions between the stores themselves.

To start, grocery stores have been around for much longer than supermarkets. Back when every town had a bakery, a butcher shop, a greengrocery, and more, the grocery store offered townspeople an efficient shopping experience with myriad food products in one place. John Stranger, vice president group supervisor of the food-related creative agency EvansHardy+Young, explained to Reader’s Digest that the grocer would usually collect the goods for the patron, too. This process might sound familiar if you’ve watched old films or television shows, in which characters often just hand over their shopping lists to the person behind the counter. While our grocery store runs may not be quite so personal today, the contents of grocery stores remain relatively similar: food, drinks, and some household products.

Supermarkets, on the other hand, have taken the idea of a one-stop shop to another level, carrying a much more expansive array of foodstuffs as well as home goods, clothing, baby products, and even appliances. This is where it gets a little tricky—because supermarkets carry many of the same products as superstores, the next biggest fish in the food store chain, which are also sometimes referred to as hypermarkets.

According to The Houston Chronicle, supermarkets and superstores both order inventory in bulk and usually belong to large chains, whereas grocery stores order products on an as-needed basis and are often independently owned. Superstores, however, are significantly larger than either grocery stores or supermarkets, and they typically look more like warehouses. It’s not an exact science, and some people might have conflicting opinions about how to categorize specific stores. For example, Walmart has a line of Walmart Neighborhood Markets, which its website describes as “smaller-footprint option[s] for communities in need of a pharmacy, affordable groceries, and merchandise.” They’re not independently owned, but they do sound like grocery stores, especially compared to Walmart’s everything-under-the-sun superstore model.

Knowing the correct store terms might not always matter in casual conversation, but it could affect your credit card rewards earnings. American Express, for example, offers additional rewards on supermarket purchases, and it has a specific list of stores that qualify as supermarkets, including Gristedes, Shoprite, Stop & Shop, and Whole Foods. Target and Walmart, on the other hand, are both considered superstores, so you won’t earn bonuses on those purchases.

And, since grocery shopping at any type of store can sometimes seem like a competitive sport, here’s the ideal time to go.

