
15 Words With Origins So Obvious You Never Noticed Them


The red car is fast, the blue one is faster, and green is the fastest. As you may recall from grammar class, these three words make up the positive, comparative, and superlative forms of the word fast. But there is a handful of words in English so common that we’ve forgotten they were formed using these -er and -est suffixes. Like upper, which literally means “more up” (up plus -er). But this word has become so familiar that we no longer think of it as the comparative of up. Here are 15 other such words whose origins are hiding right under your nose.


Etymologically speaking, latter just means “more late.” It comes from the Old English lætra (slower), the comparative form of læt (slow) and source of late. Lætra also carried the sense of our modern later, but the latter word didn’t actually emerge until the 1500s.


And if you’re “most late”? You’re last. Last is the superlative form of læt. Way back, last was latost, and was worn down over the years to last.


Last isn’t always least, as they say, but both words are superlatives. Least (lǽsest in Old English, meaning smallest) is the superlative of lǽs, itself a comparative meaning smaller.


The Old English lǽs gives us less. But if we’re sticklers, lesser is technically a double comparative: “more smaller.” Legendary lexicographer Samuel Johnson couldn’t care less about lesser, calling it “a barbarous corruption of less, formed by the vulgar from the habit of terminating comparatives in -er."


Inner may just make you facepalm: It’s “more in.” The Old English comparative of in (or inne) was innera; the superlative was innemost, now inmost. With use and time, inner became its own positive form, with inmost and innermost now serving as its superlatives.

Now, most comparative forms are followed by than, as in "the green car is faster than the blue one." Curiously, inner stopped doing this in Middle English. For instance, we don’t say "the kitchen is inner than the foyer."


The end is nigh: We usually explain that crusty-sounding nigh as near. But near already means “more nigh”: In Old English, near was the comparative of nēah, now nigh. Over the centuries, near went off on its own and is no longer felt as a comparative, just like inner. And so near is the modern nigh after all.


As for the superlative of nēah? That would be nēahst, “most nigh,” which we now call next. But why the x? In Old English, the h in nēahst would have sounded something like the ch in Scottish loch. Follow that guttural consonant with an s and you eventually get the x sound.


Historically, utter is just outer, or “more out.” Old English had the word út, meaning "out." Its comparative was úterra. By the early 1400s, utter had shifted to its modern meaning of absolute. Also emerging by this time was the verb utter, literally “to put out (goods, money, statements),” in part influenced by the adjective utter.


When utter moved on and lost its association with út/out, it left a gap in the language. Outer, meaning “more out” and formed by analogy with inner, naturally filled it. But like inner, outer has become its own positive form, with outmost and outermost serving as its superlatives. The original superlative of út/out would have been utmost, which moved on to mean extreme.


At this point you may be wondering, is further …“more furth”? Yes, it’s just that we’d recognize furth as forth today. This makes further “more forth” or “more fore.” And all that business about reserving farther for physical distances and further for abstract ones? Cockamamie. Farther began as a variant of further—and both of them ousted the normal comparative of far, which was simply farrer. Oh, could English go any farther, er, further to make things complicated?


If further corresponds to “more fore,” then what about “most fore”? That would be first, the “fore-est,” if we gloss over some vowel changes that happened in English long, long ago. It’s a sensible construction: That which comes before everything else is indeed first. Today, "fore-est" answers to foremost.


Everything else is after what comes first. Is after “more aft,” then? Not quite. It’s more like “more off” in the sense of “farther behind” or “more away.” The af- in after corresponds to off (as well as of), and the -ter to an ancient comparative suffix. But we’d have to travel back thousands of years to find a speaker who would register after as a comparative.


It’s rare now, but English once had the adjective rathe, meaning quick or eager. (We might think of being rathe as the opposite of being loath.) So, if you are “more rathe”? You’re rather. If you’d rather watch paint dry than finish this article, you’d “more readily” do so. Or, rather, you’re the type who finds this arcane trivia edifying. This adverbial rather has the sense of “more properly”; you more truly do something if you carry it out willingly. And if you have your druthers, or preference, you have playfully contracted I would rather.


Literally speaking, your elder is just someone older than you—but you’d better not say that to your grandparents. Elder and older are both comparatives of the Old English (e)ald. Older did not show respect for its elder, supplanting it as the common comparative around the 1500s. The Old English ald, meanwhile, hangs on in the auld of “Auld Lang Syne” (“for old times’ sake”) and in alderman.


Finally, erstwhile is a snazzy word for former, often seen in the expression erstwhile enemies. But what is the erst in erstwhile, anyway? Old English had ǽr (soon, before), which you’ll recognize as ere from your Shakespeare. Its superlative was ǽrest, or “most ere,” hence erst.

Stones, Bones, and Wrecks
A Chinese Museum Is Offering Cash to Whoever Can Decipher These 3000-Year-Old Inscriptions

During the 19th century, farmers in China’s Henan Province began discovering oracle bones—engraved ox scapulae and tortoise shells used by Shang Dynasty leaders for record-keeping and divination purposes—while plowing their fields. More bones were excavated in subsequent years, and their inscriptions were revealed to be the earliest known form of systematic writing in East Asia. But over the decades, scholars still haven’t come close to cracking half of the mysterious script’s roughly 5000 characters—which is why one Chinese museum is asking members of the public for help, in exchange for a generous cash reward.

As Atlas Obscura reports, the National Museum of Chinese Writing in Anyang, Henan Province, has offered to pay citizen researchers about $15,000 for each unknown character translated, and $7500 for providing a disputed character’s definitive meaning. Submissions must be supported with evidence and reviewed by at least two language specialists.

The museum began farming out its oracle bone translation efforts in the fall of 2016. The costly, ongoing project has hit a stalemate, and scholars hope that the public’s collective smarts—combined with new advances in technology, including cloud computing and big data—will yield new information and save them research money.

As of today, more than 200,000 oracle bones have been discovered—around 50,000 of which bear text—so scholars still have a lot to learn about the Shang Dynasty. Many of the ancient script's characters are difficult to verify, as they represent places and people from long ago. However, decoding even just one character could lead to a substantial breakthrough, experts say: "If we interpret a noun or a verb, it can bring many scripts on oracle bones to life, and we can understand ancient history better,” Chinese history professor Zhu Yanmin told the South China Morning Post.

[h/t Atlas Obscura]

6 Eponyms Named After the Wrong Person
Salmonella species growing on agar.

Having something named after you is the ultimate accomplishment for any inventor, mathematician, scientist, or researcher. Unfortunately, the credit for an invention or discovery does not always go to the correct person—senior colleagues sometimes snatch the glory, fakers pull the wool over people's eyes, or the fickle general public just latches onto the wrong name.


In 1885, while investigating common livestock diseases at the Bureau of Animal Industry in Washington, D.C., pathologist Theobald Smith first isolated the salmonella bacteria in pigs suffering from hog cholera. Smith’s research finally identified the bacteria responsible for one of the most common causes of food poisoning in humans. Unfortunately, Smith’s limelight-grabbing supervisor, Daniel E. Salmon, insisted on taking sole credit for the discovery. As a result, the bacteria was named after him. Don’t feel too sorry for Theobald Smith, though: He soon emerged from Salmon’s shadow, going on to make the important discovery that ticks could be a vector in the spread of disease, among other achievements.


An etching of Amerigo Vespucci
Henry Guttmann/Getty Images

Florentine explorer Amerigo Vespucci (1451–1512) claimed to have made numerous voyages to the New World, the first in 1497, before Columbus. Textual evidence suggests Vespucci did take part in a number of expeditions across the Atlantic, but generally does not support the idea that he set eyes on the New World before Columbus. Nevertheless, Vespucci’s accounts of his voyages—which today read as far-fetched—were hugely popular and translated into many languages. As a result, when German cartographer Martin Waldseemüller was drawing his map of the Novus Mundi (or New World) in 1507, he marked it with the name "America" in Vespucci’s honor. He later regretted the choice, omitting the name from future maps, but it was too late, and the name stuck.


A black and white image of young women wearing bloomers
Hulton Archive/Getty Images

Dress reform became a big issue in mid-19th century America, when women were restricted by long, heavy skirts that dragged in the mud and made any sort of physical activity difficult. Women’s rights activist Elizabeth Smith Miller was inspired by traditional Turkish dress to begin wearing loose trousers gathered at the ankle underneath a shorter skirt. Miller’s new outfit immediately caused a splash, with some decrying it as scandalous and others inspired to adopt the garb.

Amelia Jenks Bloomer was editor of the women’s temperance journal The Lily, and she took to copying Miller’s style of dress. She was so impressed with the new freedom it gave her that she began promoting the “reform dress” in her magazine, printing patterns so others might make their own. Bloomer sported the dress when she spoke at events and soon the press began to associate the outfit with her, dubbing it “Bloomer’s costume.” The name stuck.


Execution machines had been known prior to the French Revolution, but they were refined after Paris physician and politician Dr. Joseph-Ignace Guillotin suggested they might be a more humane form of execution than the usual methods (hanging, burning alive, etc.). The first guillotine was actually designed by Dr. Antoine Louis, Secretary of the Academy of Surgery, and was known as a louisette. The quick and efficient machine was soon adopted as the main method of execution in revolutionary France, and as the bodies piled up the public began to refer to it as la guillotine, after the man who first suggested its use. Guillotin was deeply distressed by the association, and when he died in 1814 his family asked the French government to change the name of the hated machine. The government refused, and so the family changed their name instead to escape the dreadful association.


Alison Bechdel
Steve Jennings/Getty Images

The Bechdel Test is a tool to highlight gender inequality in film, television, and fiction. The idea is that in order to pass the test, the movie, show, or book in question must include at least one scene in which two women have a conversation that isn’t about a man. The test was popularized by the cartoonist Alison Bechdel in 1985 in her comic strip “Dykes to Watch Out For,” and has since become known by her name. However, Bechdel asserts that the idea originated with her friend Lisa Wallace (and was also inspired by the writer Virginia Woolf), and she would prefer for it to be known as the Bechdel-Wallace test.


Influential sociologist Robert K. Merton suggested the idea of the “Matthew Effect” in a 1968 paper noting that senior colleagues who are already famous tend to get the credit for their junior colleagues’ discoveries. (Merton named his phenomenon [PDF] after the parable of the talents in the Gospel of Matthew, in which wise servants invest money their master has given them.)

Merton was a well-respected academic, and when he was due to retire in 1979, a book of essays celebrating his work was proposed. One person who contributed an essay was University of Chicago professor of statistics Stephen Stigler, who had corresponded with Merton about his ideas. Stigler decided to pen an essay that celebrated and proved Merton’s theory. As a result, he took Merton’s idea and created Stigler’s Law of Eponymy, which states that “No scientific discovery is named after its original discoverer”—the joke being that Stigler himself was taking Merton’s own theory and naming it after himself. To further prove the rule, the “new” law has been adopted by the academic community, and a number of papers and articles have since been written on "Stigler’s Law."

