
What Killed The Dinner Party?

Getty Images

By Peter Weber

Oh, dinner parties, says Guy Trebay in The New York Times, with more than a hint of wistfulness. "Remember those?" A great dinner party — to celebrate the holidays, or just because — is a pleasant and personable way to network, a great occasion for different ages and social strata to mix, a fount of great conversation, and "the epitome of civilized living." But sadly, "the world is so changed, hardly anyone does them anymore," says Louise Grunwald, the widow of diplomat and TIME editor Henry Anatole Grunwald. Grunwald's "doomful pronouncement" may sound far-fetched, but she's probably right, Trebay laments. "You may want the dinner party to come back, harkening back to another era," Grunwald says. "But it will never happen." So, just what is it that killed the dinner party? A few theories:

1. A breakdown in society — and "society"

Throwing a great dinner party is an art quickly becoming lost as "social lions and lionesses" — spirited socialite Nan Kempner, cabaret standout Bobby Short, director Nora Ephron, and philanthropists Brooke Astor and Judith Peabody, for example — exit this earthly stage. "When I think of all those great hosts and hostesses who were around when I moved to New York" in 1980, says cookbook author Alex Hitz, "many are now gone with the wind." A good host was "trained from birth or on the job" to command their tables like a military tactician, says Trebay. "Naturally they shared other likenesses: Social prominence, deep pockets, commodious apartments, household staffs, and no allergy to drink." But it's not just that "society's elite are throwing fewer parties," says Bethany Seawright at Apartment Therapy. "As a society in general, we are allowing this type of evening to disappear from our personal experience," and that's sad for the "socially-impoverished among us all."

2. The rise of restaurants

As our time seems ever more precious, our tastes grow intimidatingly sophisticated, and we fall out of the habit of cooking for ourselves, celebrity-chef and foodie-oriented restaurants are taking the place of the dinner party table. Let's face it, says Trebay: For better or worse, "it's so much easier and more convenient to meet friends in restaurants." Of course, this is nothing new. Trish Hall, also writing in The New York Times, noted — in 1988 — that when would-be hostesses and guests want to socialize, "they go to restaurants or have a small party catered" instead, because "the thought of preparing and serving a meal — an impressive meal that will satisfy increasingly sophisticated palates — is overwhelming." There is a modern twist, though, says Kat Stoeffel at New York. Today, we also have "too many restaurant Groupons to use before they expire/Groupon goes bankrupt."

3. Social media

Websites like Facebook and LinkedIn are replacing face-to-face networking for many people, and smartphones and other handheld devices have been disastrous for the social contract, says etiquette columnist Judith Martin, better known as Miss Manners. "People don't even respond to dinner invitations anymore," she tells The Times. "They consider it too difficult a commitment to say, 'I'll come to dinner a week from Saturday,'" and they think nothing of canceling at the last minute — by text message! And those guests who do show up, says New York's Stoeffel, "will Instagram pictures of our not-good cooking, and everyone will know." And when they post those photos to Facebook or Twitter, "the friends we didn't invite will feel left out."

4. Ignorance

Along with the lost-art aspect, people just don't know the mechanics of dinner parties anymore. That's given rise to a small (probably very small) cottage industry of event planners like David E. Monn who will teach socialites which forks to use and how to mix the perfect cocktail. "People want to be civilized, so it all doesn't turn into Caligula," Monn tells The Times. "So they come to me saying: 'I don't know what to do if I'm having friends over for cocktails. What tray do you use? What do you put on the tray? Do you put out a piece of cheese?'" So if you want to know "whether the curious tongs inherited from Aunt Mabel are meant for serving asparagus, or else flipping a hamburger on the grill," says Trebay, there's help out there.

5. Dietary restrictions

And then there's what Miss Manners calls "food fussing," or the growing list of things people can't (or won't) eat. In the 1970s, vegetarians were considered difficult guests; now, even vegans are relatively easy to accommodate. Nut allergies, gluten intolerance, no-sugar diets, paleo (or cave-man) diets — "it's too hard to plan a menu with everyone's fake allergies and dietary restrictions," says New York's Stoeffel.

6. We don't converse, we pontificate

Dinner parties were never really about the food. After all, "the idea of cooking for others is not something that is going to die," Miss Manners tells The Times. But "conversation is in trouble," and without that main course, a dinner party isn't a dinner party. The problem? "People have been brought up to express themselves rather than to exchange ideas." There were always boors, but back in the dinner party era, says Trebay, a master hostess "orchestrated every element of the evening, arrival to departure, most crucially directing the conversation, which they either allowed to follow a traditional serve-and-volley pattern (20 minutes right, 20 minutes left), or else commandeered for so-called 'general discussion' as provocateur hosts like the television journalist Barbara Walters still do."

...Actually, the dinner party isn't dead at all

Naturally, since Trebay's nostalgic look at a bygone era appeared in the rather highfalutin New York Times Style section, lots of people disagree with the very premise. Dinner parties aren't dead, they've just been appropriated by "hipsters," and more specifically "that hipster hybrid, foodie-hipsters (fipsters? foopsters? hoopsters?)," says Jen Doll at The Atlantic Wire. How did The Times get it so wrong? "Perhaps unsurprisingly for a newspaper that has only just discovered Brooklyn," says Kristin Iversen at The L Magazine, Trebay "interviewed people like Louise Grunwald and Judith Peabody who, while lovely people, I'm sure, are not perhaps the trend-setters that they used to be."

Stones, Bones, and Wrecks
A Chinese Museum Is Offering Cash to Whoever Can Decipher These 3,000-Year-Old Inscriptions

During the 19th century, farmers in China’s Henan Province began discovering oracle bones—engraved ox scapulae and tortoise shells used by Shang Dynasty leaders for record-keeping and divination purposes—while plowing their fields. More bones were excavated in subsequent years, and their inscriptions were revealed to be the earliest known form of systematic writing in East Asia. But over the decades, scholars still haven’t come close to cracking half of the mysterious script’s roughly 5,000 characters—which is why one Chinese museum is asking members of the public for help, in exchange for a generous cash reward.

As Atlas Obscura reports, the National Museum of Chinese Writing in Anyang, Henan Province, has offered to pay citizen researchers about $15,000 for each unknown character translated, and $7,500 for providing a disputed character’s definitive meaning. Submissions must be supported with evidence and reviewed by at least two language specialists.

The museum began farming out its oracle bone translation efforts in fall 2016. The costly, ongoing project has stalled, and scholars hope that the public’s collective smarts—combined with new advances in technology, including cloud computing and big data—will yield new information and save them research money.

As of today, more than 200,000 oracle bones have been discovered—around 50,000 of which bear text—so scholars still have a lot to learn about the Shang Dynasty. Many of the ancient script's characters are difficult to verify, as they represent places and people from long ago. However, decoding even just one character could lead to a substantial breakthrough, experts say: "If we interpret a noun or a verb, it can bring many scripts on oracle bones to life, and we can understand ancient history better,” Chinese history professor Zhu Yanmin told the South China Morning Post.

[h/t Atlas Obscura]

6 Eponyms Named After the Wrong Person
Salmonella species growing on agar.

Having something named after you is the ultimate accomplishment for any inventor, mathematician, scientist, or researcher. Unfortunately, the credit for an invention or discovery does not always go to the correct person—senior colleagues sometimes snatch the glory, fakers pull the wool over people's eyes, or the fickle general public just latches onto the wrong name.


In 1885, while investigating common livestock diseases at the Bureau of Animal Industry in Washington, D.C., pathologist Theobald Smith first isolated Salmonella bacteria in pigs suffering from hog cholera. Smith’s research finally identified the bacteria responsible for one of the most common causes of food poisoning in humans. Unfortunately, Smith’s limelight-grabbing supervisor, Daniel E. Salmon, insisted on taking sole credit for the discovery. As a result, the bacteria were named after him. Don’t feel too sorry for Theobald Smith, though: He soon emerged from Salmon’s shadow, going on to make the important discovery that ticks could be a vector in the spread of disease, among other achievements.


An etching of Amerigo Vespucci
Henry Guttmann/Getty Images

Florentine explorer Amerigo Vespucci (1451–1512) claimed to have made numerous voyages to the New World, the first in 1497, before Columbus. Textual evidence suggests Vespucci did take part in a number of expeditions across the Atlantic, but it generally does not support the idea that he set eyes on the New World before Columbus. Nevertheless, Vespucci’s accounts of his voyages—which today read as far-fetched—were hugely popular and translated into many languages. As a result, when German cartographer Martin Waldseemüller was drawing his map of the Novus Mundus (or New World) in 1507, he marked it with the name "America" in Vespucci’s honor. He later regretted the choice, omitting the name from future maps, but it was too late, and the name stuck.


A black and white image of young women wearing bloomers
Hulton Archive/Getty Images

Dress reform became a big issue in mid-19th century America, when women were restricted by long, heavy skirts that dragged in the mud and made any sort of physical activity difficult. Women’s rights activist Elizabeth Smith Miller was inspired by traditional Turkish dress to begin wearing loose trousers gathered at the ankle underneath a shorter skirt. Miller’s new outfit immediately caused a splash, with some decrying it as scandalous and others inspired to adopt the garb.

Amelia Jenks Bloomer, editor of the women’s temperance journal The Lily, took to copying Miller’s style of dress. She was so impressed with the new freedom it gave her that she began promoting the “reform dress” in her magazine, printing patterns so others might make their own. Bloomer sported the dress when she spoke at events, and soon the press began to associate the outfit with her, dubbing it “Bloomer’s costume.” The name stuck.


Execution machines were known before the French Revolution, but they were refined after Paris physician and politician Dr. Joseph-Ignace Guillotin suggested they might be a more humane form of execution than the usual methods (hanging, burning alive, etc.). The first guillotine was actually designed by Dr. Antoine Louis, Secretary of the Academy of Surgery, and was known as a louisette. The quick and efficient machine was soon adopted as the main method of execution in revolutionary France, and as the bodies piled up, the public began to refer to it as la guillotine, after the man who first suggested its use. Guillotin was deeply distressed by the association, and when he died in 1814, his family asked the French government to change the name of the hated machine. The government refused, so the family changed their name instead to escape the dreadful association.


Alison Bechdel
Steve Jennings/Getty Images

The Bechdel Test is a tool to highlight gender inequality in film, television, and fiction. The idea is that in order to pass the test, the movie, show, or book in question must include at least one scene in which two women have a conversation that isn’t about a man. The test was popularized by the cartoonist Alison Bechdel in 1985 in her comic strip “Dykes to Watch Out For,” and has since become known by her name. However, Bechdel asserts that the idea originated with her friend Lisa Wallace (and was also inspired by the writer Virginia Woolf), and she would prefer for it to be known as the Bechdel-Wallace test.


Influential sociologist Robert K. Merton suggested the idea of the “Matthew Effect” in a 1968 paper noting that senior colleagues who are already famous tend to get the credit for their junior colleagues’ discoveries. (Merton named his phenomenon [PDF] after the parable of the talents in the Gospel of Matthew, in which wise servants invest money their master has given them.)

Merton was a well-respected academic, and when he was due to retire in 1979, a book of essays celebrating his work was proposed. One person who contributed an essay was University of Chicago professor of statistics Stephen Stigler, who had corresponded with Merton about his ideas. Stigler decided to pen an essay that celebrated and proved Merton’s theory. As a result, he took Merton’s idea and created Stigler’s Law of Eponymy, which states that “No scientific discovery is named after its original discoverer”—the joke being that Stigler himself was taking Merton’s own theory and naming it after himself. To further prove the rule, the “new” law has been adopted by the academic community, and a number of papers and articles have since been written on "Stigler’s Law."

