
Answering Your Burning Grammar Questions


We're joined this week by a special guest blogger. Patricia T. O'Conner, a former editor at The New York Times Book Review, is the author of the national best-seller Woe Is I: The Grammarphobe's Guide to Better English in Plain English, as well as other books about language. She is a regular monthly guest on public radio station WNYC in New York. Learn more at her website, grammarphobia.com. Today she's answering questions from our readers.

Q: "All right "¦ so there's no good reason to not end a sentence in a preposition "¦ but that doesn't mean that I have to like hearing, "˜Where you at.'"—Posted by Fruppi on 5/5

A: The problem with "Where you at?" isn't that it ends in a preposition. The problem is that it shouldn't have a preposition at all. (What it ought to have is a verb!)

Constructions like "Where is my car at?" and "Where are my keys at?" are considered substandard usage because "where" makes the addition of "at" redundant. "Where" essentially means "at (or in) what place," so adding another "at" is overkill. It's roughly equivalent to saying, "In which pocket are they in?"

Q: "Can we look forward to a discussion of the singular they this week?"—Posted by s michael c on 5/5

A: I didn't discuss this on the blog, but I'm glad you brought it up. The singular they or them or their has been considered wrong for a couple of centuries, and it's still a no-no. (Example: "If anybody uses a cell phone, tell them not to.") But it's become so common that only a few of us diehards notice anymore! That doesn't make it right, though. They, them, or their are not legitimate singular pronouns, according to nearly all usage and style guides. And I don't like using "he or she" and "him or her," either.

Here's some historical perspective. Once upon a time, English speakers routinely used they to refer to indefinite pronouns that take singular verbs, like anyone, anybody, nobody, and someone. The Oxford English Dictionary has published references for this usage going back to the 16th century. But in the late 18th and early 19th centuries, grammarians began condemning the use of they as a singular pronoun on the grounds that it was illogical. Numerically speaking, they were right, but this left us with a great big hole in English where a gender-neutral, number-neutral pronoun ought to be.

That's the way things stand now, despite all the history, leaving the careful writer with the problem of finding an acceptable alternative to the singular they.

Here's one solution: In a long piece of writing, you might use "him" in some places and "her" in others when referring to a generic individual. Another solution is to write around the problem—don't use the pronoun at all. Example: "Someone forgot to pay the bills" (instead of "their bills"). Or: "If anyone calls, say I'm out" (instead of "tell them I'm out").

If you do use they, them, or their, then make the subject (or referent noun) plural instead of singular. A sentence like "Every parent dotes on their child" could instead be "All parents dote on their children." Instead of "A person should mind their own business," make it "People should mind their own business." Be creative. Disregarding the plural nature of they isn't the answer.

Q: "Would you please address the misuse/overuse of the word myself? It seems the use of the word has become more popular lately. One example I hear a lot is "˜Myself and my friends"¦.' This sounds so wrong to me, or am I incorrect? Another one is irregardless. Is that a real word?"—Posted by JaneM on 5/6

A: People use myself when they can't decide between "I" and "me." This isn't just a cop-out; it's bad English. The word myself is reserved for two uses: (1) To emphasize: "Let me do it myself." (2) To refer to a subject already mentioned: "I can see myself in the mirror." If you could just as well use "I" or "me," then you shouldn't resort to myself.

As for irregardless, it's definitely out of bounds. It blends "regardless" with "irrespective," and the result is a redundancy that has both a negative prefix and a negative suffix! As one reader (lala) so cleverly commented, it's a one-word double negative! Is it real? Well, lots of people use irregardless and you'll find it in dictionaries, so it's real all right. But not everything in a dictionary is good English. Read the fine print: Both Merriam-Webster's Collegiate Dictionary (11th ed.) and The American Heritage Dictionary of the English Language (4th ed.) call it "nonstandard."

Q: "If President Bush (41) and President Bush (43) were walking down the street together, what would be the correct statement? "˜Here come the Presidents Bush "¦ the Bush Presidents "¦ the President Bushes'? Or, "˜Here comes President Bush and President Bush'? These questions must be answered before the next President is inaugurated."—Posted by Witty Nickname on 5/6

A: Your first suggestion is right: "the Presidents Bush." Similarly, Presidents John Adams and John Quincy Adams are often referred to jointly as "the Presidents Adams" or "both Presidents Adams." When in doubt, think of Dostoyevsky (The Brothers Karamazov).

Q: "What's the best contraction for "˜am not'? For example, how should one best end this sentence: "˜Since contractions are required, I'm forced to use one now, am I not?'"—Posted by John on 5/7

A: This is a very interesting question! The answer (aren't I) takes us back to the history of the most fascinating contraction of them all: ain't.

Today, ain't is considered the poster child of poor English, but it wasn't always so. It was probably first used around 1600, just when most of our English contractions—all perfectly legitimate, I might add—were being formed: don't, can't, isn't, and many more. For centuries, ain't was just one of the crowd. It was first seen in print in the late 1600s, spelled an't, a'n't, and eventually ain't. (Some scholars believe the new spelling may have reflected the way the word was pronounced by certain speakers.)

Ain't was originally a contraction of "am not" and "are not." But by the early 1700s, it was also being used as a contraction for "is not." And by the 1800s it was used for "have not" and "has not" too, replacing an earlier contraction, ha'n't. Naturally, as ain't took on more and more meanings it drifted further and further from its roots, and here's where the grammarians and schoolmarms took notice. Contractions like can't and don't had clearly traceable parentage, but ain't had so many possible parents that it seemed illegitimate. So 19th-century critics turned up their noses and declared ain't a crime against good English.

That created a problem, of course—what to use in place of ain't I as a contraction for "am I not." The obsolete "amn't I" was a tongue-twister (it survives today only in Scots and Irish English). As we all know by now, we ended up with aren't I, which clearly makes no sense. How can we justify it if we don't say "I aren't"? And how did it come about, anyway?

As it happens, aren't I didn't exist until the early 20th century, when British novelists and dramatists started using it to reproduce the way upper-class speakers pronounced ain't I. (In the mouth of an old Etonian, ain't rhymed with "taunt" rather than "taint.") Illogical it may be, but aren't I caught on in both Britain and the United States. It may have come out of left field, but today it's standard English while ain't I definitely isn't.

Too bad. I rather like ain't, though I'm too cowardly to use it. If it hadn't outgrown its old meanings of "am not" and "are not," it might be acceptable today. And we'd have a sensible contraction for "am I not."

Q: "The English/Irish refer to a team as a plural thing ("˜England are playing great football this season'). I realize the English invented English but this drives me nuts! To me it is a non-issue. A team was, is, and always will be ONE team, no matter if there are 2 people or 2,000 people. A couple is always two but it is still just one couple. And certainly not to argue with you but I don't like your example "˜A couple of tenants own geckos.' I think the only reason it sounds acceptable is because the word tenants is plural. But you always have to ignore prepositional phrases. Anyway, just my two cents."—Posted by Rob on 5/8

A: The British have a much broader attitude toward collective nouns than we do. To us, "team" is singular, but to them it's a collective that they treat as a plural. In fact, things like soccer teams ("Manchester are leading"), companies ("Mobil plan to invest"), and government bodies ("the Cabinet have met") are all treated as plural in Britain.

They use punctuation marks and articles (a, an, the) and all sorts of other things differently, too. But do NOT assume that British English is purer or more correct than American English. Many characteristics that we identify with modern-day British English—the different usages, spellings, vocabulary words, some points of grammar, even the British accent with its broad a's and dropped r's—developed after the Revolutionary War. Remember that the Colonists brought with them 17th- and 18th-century British English, much of which has been preserved on our side of the Atlantic (and much of which has been altered on theirs). So what's considered correct in London is not necessarily correct in Philadelphia. A chapter in my next book will be devoted to this issue, which I discussed recently on my blog. Here's a link.

As for the collective noun couple, I don't agree that an attached prepositional phrase should be ignored when you're deciding whether the word is singular or plural. Certainly it's singular here: "The couple next door vacations in Hawaii." But just as certainly it's plural here: "A couple of my friends vacation in Hawaii." And couple is plural here even without a prepositional phrase, because it's assumed: "Where do your friends vacation?" … "A couple [of them] vacation in Hawaii, and a couple more prefer ski resorts."

Q: "I feel like I remember having read in my old college Chicago Manual of Style that there are a select few proper names for which the possessive is ' and not 's. I think one was Jesus (as in "˜He followed Jesus' teachings,' not "˜He followed Jesus's teachings'). I think it was the same for Moses and Sophocles "¦ am I making this up?"—Posted by lala on 5/8

A: You remember correctly! The usual practice in making names possessive is to add an apostrophe plus s. But there's an exception. When a Biblical or classical name ends in s, the custom is to add just the apostrophe: Jesus' disciples, Hercules' strength, Xerxes' writings, Archimedes' principle.

We also drop the s and use only the apostrophe in certain idiomatic expressions with the word "sake" (this avoids a pileup of sibilants). Examples: "for goodness' sake," "for conscience' sake," "for righteousness' sake," "for convenience' sake."
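
To make the two conventions concrete, here's a minimal sketch in Python (my own toy illustration, not an algorithm from any style guide; the name list is just a sample):

    # Toy sketch of the possessive rules above.
    # The name set is illustrative, not exhaustive.
    CLASSICAL_OR_BIBLICAL = {"Jesus", "Moses", "Hercules",
                             "Xerxes", "Archimedes", "Sophocles"}
    SAKE_IDIOMS = {"goodness", "conscience", "righteousness", "convenience"}

    def possessive(noun: str) -> str:
        # Classical or Biblical names ending in s take the bare apostrophe.
        if noun in CLASSICAL_OR_BIBLICAL and noun.endswith("s"):
            return noun + "'"
        # So do the "for ___ sake" idioms, to avoid a pileup of sibilants.
        if noun in SAKE_IDIOMS:
            return noun + "'"
        # Everything else gets an apostrophe plus s.
        return noun + "'s"

    print(possessive("Xerxes"))    # Xerxes'
    print(possessive("goodness"))  # goodness'
    print(possessive("Dickens"))   # Dickens's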

Q: "OK, so this has always really bugged me: is it the 1970s or the 1970's? For example, "˜I was born in the 1970s.' Or, "˜I was born in the 1970's.' I was always under the impression the apostrophe was erroneous, but I guess I might be wrong!"—Posted by Beth on 5/8

A: It's true that you never add an apostrophe to make an ordinary noun plural. But the plurals of numbers are another matter, a style issue that publishers have differed on over the years. In the first two editions of my book Woe Is I, my recommendation was to add an apostrophe plus s to make a number plural: 3's, for example, and 1970's. This was the style then recommended by the New York Times. Since then, both the Times and I have changed our minds.

I now advise using only the s, with no apostrophe: 3s and 1970s. The third edition of my book Woe Is I (due out next year) and the children's edition, Woe Is I Jr. (published in 2007), reflect this change. I still recommend using the apostrophe to pluralize a single letter for the sake of readability. Without it, a sentence like this is gibberish: "My name is full of as, is, and us." Translation: "My name is full of a's, i's, and u's."
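
In fact, the style I now recommend is simple enough to express as a little Python function (again, a toy of my own, not any publisher's official logic): numbers take a bare s, single letters take an apostrophe plus s.

    def pluralize(token: str) -> str:
        # Numbers take a bare s: 1970 -> 1970s, 3 -> 3s.
        if token.isdigit():
            return token + "s"
        # Single letters take 's for readability: a -> a's.
        if len(token) == 1 and token.isalpha():
            return token + "'s"
        raise ValueError("expected a number or a single letter")

    print(pluralize("1970"))  # 1970s
    print(pluralize("3"))     # 3s
    print(pluralize("a"))     # a's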

Yesterday: Five Lessons in Punctuation. Wednesday: Five Lessons in Grammar. Tuesday: Debunking Etymological Myths. Monday: Debunking Grammar Myths.

The Elements
9 Diamond-Like Facts About Carbon

How well do you know the periodic table? Our series The Elements explores the fundamental building blocks of the observable universe—and their relevance to your life—one by one.
 
 
It can be glittering and hard. It can be soft and flaky. It can look like a soccer ball. Carbon is the backbone of every living thing—and yet it just might cause the end of life on Earth as we know it. How can a lump of coal and a shining diamond be composed of the same material? Here are nine things you probably didn't know about carbon.

1. IT'S THE "DUCT TAPE OF LIFE."

It's in every living thing, and in quite a few dead ones. "Water may be the solvent of the universe," writes Natalie Angier in her classic introduction to science, The Canon, "but carbon is the duct tape of life." Not only is carbon duct tape, it's one hell of a duct tape. It binds atoms to one another, forming humans, animals, plants and rocks. If we play around with it, we can coax it into plastics, paints, and all kinds of chemicals.

2. IT'S ONE OF THE MOST ABUNDANT ELEMENTS IN THE UNIVERSE.

It sits near the top of the periodic table, wedged between boron and nitrogen. Atomic number 6, chemical symbol C. Six protons, six neutrons, six electrons. It is the fourth most abundant element in the universe after hydrogen, helium, and oxygen, and the 15th most abundant in the Earth's crust. While its older cousins hydrogen and helium are believed to have been formed during the tumult of the Big Bang, carbon is thought to stem from a buildup of alpha particles in supernova explosions, a process called supernova nucleosynthesis.

3. IT'S NAMED AFTER COAL.

While humans have known carbon as coal and—after burning—soot for thousands of years, it was Antoine Lavoisier who, in 1772, showed that it was in fact a unique chemical entity. Lavoisier used an apparatus called a solar furnace, which focused the Sun's rays with lenses about four feet in diameter, to burn a diamond in a glass jar. By analyzing the residue found in the jar, he was able to show that diamond was composed solely of carbon. Lavoisier first listed it as an element in his textbook Traité Élémentaire de Chimie, published in 1789. The name carbon derives from the French charbon, or coal.

4. IT LOVES TO BOND.

It can form four bonds, which it does with many other elements, creating hundreds of thousands of compounds, some of which we use daily. (Plastics! Drugs! Gasoline!) More importantly, those bonds are both strong and flexible.

5. NEARLY 20 PERCENT OF YOUR BODY IS CARBON.

May Nyman, a professor of inorganic chemistry at Oregon State University in Corvallis, Oregon, tells Mental Floss that carbon has an almost unbelievable range. "It makes up all life forms, and in the number of substances it makes, the fats, the sugars, there is a huge diversity," she says. It forms chains and rings in a process chemists call catenation. Every living thing is built on a backbone of carbon (along with nitrogen, hydrogen, oxygen, and other elements). So animals, plants, every living cell, and of course humans are products of catenation. Our bodies are 18.5 percent carbon by weight.

And yet it can be inorganic as well, Nyman says. It teams up with oxygen and other substances to form large parts of the inanimate world, like rocks and minerals.

6. WE DISCOVERED TWO NEW FORMS OF IT ONLY RECENTLY.

Carbon is found in four major forms: graphite, diamonds, fullerenes, and graphene. "Structure controls carbon's properties," says Nyman. Graphite ("the writing stone") is made up of loosely connected sheets of carbon arranged like chicken wire. Penciling something in is actually just scratching layers of graphite onto paper. Diamonds, in contrast, are linked three-dimensionally, and those bonds can be broken only by a huge amount of energy. Because diamonds contain so many of these exceptionally strong bonds, they are the hardest substance on Earth.

Fullerenes were discovered in 1985 when a group of scientists blasted graphite with a laser and the resulting carbon gas condensed to previously unknown spherical molecules with 60 and 70 atoms. They were named in honor of Buckminster Fuller, the eccentric inventor who famously created geodesic domes with this soccer ball–like composition. Robert Curl, Harold Kroto, and Richard Smalley won the 1996 Nobel Prize in Chemistry for discovering this new form of carbon.

The youngest member of the carbon family is graphene, found by chance in 2004 by Andre Geim and Kostya Novoselov in an impromptu research jam. The scientists used Scotch tape—yes, really—to lift carbon sheets one atom thick from a lump of graphite. The new material is extremely thin and strong. The result: the Nobel Prize in Physics in 2010.

7. DIAMONDS AREN'T CALLED "ICE" BECAUSE OF THEIR APPEARANCE.

Diamonds are called "ice" because their ability to transport heat makes them cool to the touch—not because of their look. This makes them ideal for use as heat sinks in microchips. (Synthethic diamonds are mostly used.) Again, diamonds' three-dimensional lattice structure comes into play. Heat is turned into lattice vibrations, which are responsible for diamonds' very high thermal conductivity.

8. IT HELPS US DETERMINE THE AGE OF ARTIFACTS—AND PROVE SOME OF THEM FAKE.

American scientist Willard F. Libby won the Nobel Prize in Chemistry in 1960 for developing a method for dating relics by analyzing the amount of a radioactive isotope of carbon they contain. Radiocarbon or C14 dating measures the decay of a radioactive form of carbon, C14, that accumulates in living things. It can be used for objects as much as 50,000 years old. Carbon dating helped determine the age of Ötzi the Iceman, a 5300-year-old corpse found frozen in the Alps. It also established that Lancelot's Round Table in Winchester Cathedral was made hundreds of years after the supposed Arthurian Age.
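
The arithmetic behind the method is ordinary exponential decay. Here's a rough sketch in Python (using the conventional 5730-year half-life and ignoring the calibration curves that real radiocarbon labs apply):

    import math

    HALF_LIFE_C14 = 5730.0  # years; the conventional C14 half-life

    def radiocarbon_age(remaining_fraction: float) -> float:
        # N(t) = N0 * 0.5 ** (t / HALF_LIFE_C14); solving for t gives
        # t = -HALF_LIFE_C14 * log2(N / N0).
        if not 0.0 < remaining_fraction <= 1.0:
            raise ValueError("fraction must be in (0, 1]")
        return -HALF_LIFE_C14 * math.log2(remaining_fraction)

    # A sample retaining about 52.6 percent of its original C14 dates
    # to roughly Ötzi's era:
    print(round(radiocarbon_age(0.526)))  # about 5300 years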

9. TOO MUCH OF IT IS CHANGING OUR WORLD.

Carbon dioxide (CO2) is an important part of the gaseous blanket wrapped around our planet, making it warm enough to sustain life. But burning fossil fuels—which are built on a carbon backbone—releases more carbon dioxide, which is directly linked to global warming. A number of ways to remove and store carbon dioxide have been proposed, including bioenergy with carbon capture and storage, which involves planting large stands of trees, harvesting and burning them to create electricity, and capturing the CO2 created in the process and storing it underground. Yet another approach being discussed is to artificially make the oceans more alkaline so they can bind more CO2. Forests are natural carbon sinks, because trees capture CO2 during photosynthesis, but human activity in these forests counteracts and surpasses whatever CO2 capture gains we might get. In short, we don't have a solution yet to the overabundance of CO2 we've created in the atmosphere.

Big Questions
Why Don't We Eat Turkey Tails?

Turkey sandwiches. Turkey soup. Roasted turkey. This year, Americans will consume roughly 245 million birds, with 46 million being prepared and presented on Thanksgiving. What we don’t eat will be repurposed into leftovers.

But there’s one part of the turkey that virtually no family will have on their table: the tail.

Despite our country’s obsession with fattening, dissecting, and searing turkeys, we almost inevitably pass up the fat-infused rear portion. According to Michael Carolan, professor of sociology and associate dean for research at the College for Liberal Arts at Colorado State University, that may have something to do with how Americans have traditionally perceived turkeys. Consumption was rare prior to World War II. When the birds did become readily available, there was no demand for the tail because it had never been offered in the first place.

"Tails did and do not fit into what has become our culinary fascination with white meat," Carolan tells Mental Floss. "But also from a marketing [and] processor standpoint, if the consumer was just going to throw the tail away, or will not miss it if it was omitted, [suppliers] saw an opportunity to make additional money."

Indeed, the fact that Americans didn't have a taste for tail didn't prevent the poultry industry from moving on. Tails were being routed to Pacific Island consumers in the 1950s. Rich in protein and fat—a turkey tail is really a gland that produces oil used for grooming—the unwanted portion found a market after all. And once consumers were exposed to it, they couldn't get enough.

“By 2007,” according to Carolan, “the average Samoan was consuming more than 44 pounds of turkey tails every year.” Perhaps not coincidentally, Samoans also have an alarmingly high obesity rate of 75 percent. In an effort to stave off contributing factors, the import of tails to the islands was banned from 2007 until 2013, when it was argued that the ban violated World Trade Organization rules.

With tradition going hand in hand with commerce, poultry suppliers don’t really have a reason to try to change domestic consumer appetites for the tails. In preparing his research into the missing treat, Carolan says he had to search high and low before finally finding a source of tails at a Whole Foods that was about to discard them. "[You] can't expect the food to be accepted if people can't even find the piece!"

Unless the meat industry mounts a major campaign to shift American tastes, Thanksgiving will once again be filled with turkeys missing one of their juicier body parts.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.
