
The Peace Palace Opens

Library of Congress

The First World War was an unprecedented catastrophe that killed millions and set the continent of Europe on the path to further calamity two decades later. But it didn’t come out of nowhere. With the centennial of the outbreak of hostilities coming up in 2014, Erik Sass will be looking back at the lead-up to the war, when seemingly minor moments of friction accumulated until the situation was ready to explode. He'll be covering those events 100 years after they occurred. This is the 83rd installment in the series. 

August 28, 1913: Peace Palace Opens

The story of the Great War is filled with ironies: the fact that an intricate alliance system meant to keep the peace instead plunged the world into chaos; that decades of military planning left all Europe’s Great Powers completely unprepared for the conflict; that empires which fought to stem the tide of change hurried it instead, bringing about their own collapse. But perhaps the greatest irony of the Great War is that it occurred at a time when the civilized world seemed to have banished war forever.

The first years of the 20th century were a time of great optimism, fueled by the undeniable progress of European civilization and belief in science and technology. Disease and malnutrition were in retreat, travel and communication were easier than ever, and Europeans directed the affairs of most of the planet with a patronizing sense of “duty” to the “lesser races.” Amid all these triumphs of “Reason” (frequently capitalized) it wasn’t unreasonable to believe humanity might also be freed from the terrible, irrational suffering and waste of war. 

This was more than just a hope: It was “proved,” with typical confidence, by social scientists and pundits like Norman Angell, a British economist and member of the Labour Party, who in his book The Great Illusion cited the complex connections between industrial states in areas like trade and finance to argue that a major war would simply be too disruptive to the modern, interdependent global economy. A European war would cut Germany off from British finance, and Britain off from continental markets, leading to total economic collapse; therefore neither country (nor their allies) could afford to start a fight. 

Kurt Riezler, a German philosopher and diplomat who wielded a great deal of influence as foreign policy advisor to Chancellor Bethmann-Hollweg, argued something similar in his book The Fundamental Features of Contemporary Geopolitics, published in 1914, just prior to the war. Riezler observed that “the world has become a [single] politically unified area,” as nations were drawn together by interlocking economic interests. At the same time, the destructive capacities of modern weaponry meant war would result in “political and financial ruin.” Therefore armed struggle was an “outdated form of conflict”; future wars would instead be “calculated” around a negotiating table, rather than fought out on battlefields, thus sparing everyone the misery of actual bloodshed.

Negotiation and compromise were central to Angell and Riezler’s visions of a world without war—and the world seemed to be taking steps in that direction with the creation of new, international institutions dedicated to the peaceful resolution of conflicts. August 28, 1913, saw the opening of the Peace Palace in The Hague, Netherlands, to house some of these promising new institutions.

The Peace Palace was built with generous support from Andrew Carnegie, the Scottish-American industrialist, philanthropist, and peace activist, as a home for the Permanent Court of Arbitration—an international tribunal established by a treaty signed at the First Hague Peace Conference in 1899 (convened at the behest of Tsar Nicholas II with the goal of reducing armaments and preventing war through mediation).

Participation in the tribunal was strictly voluntary, so its value was more symbolic than anything else—but in an idealistic age, this still mattered. A bit weirdly, the Palace was originally supposed to be the central feature of a “city of world peace,” a sort of proto-world capital, sketched out for the beach near The Hague by the Dutch spiritualist and pacifist Paul Horrix; the somewhat impractical design, produced for Horrix by the architect K.P.C. de Bazel but never built, called for a circular city with streets radiating out from the Peace Palace at the center.

At Carnegie’s insistence, the Peace Palace was also home to an extensive library of international law. Meanwhile several more international courts were proposed at the Second Peace Conference in 1907 but never agreed upon; the war intervened before the Third Peace Conference, scheduled for 1915, could take place. In subsequent years the Peace Palace also became home to the League of Nations’ Permanent Court of International Justice, added in 1922; the Hague Academy of International Law, added in 1923; and the International Court of Justice, formed by the United Nations to replace the Permanent Court of International Justice in 1946.

But as demonstrated by the rocky history of these institutions, the vision of a world ruled by Reason, with peace maintained by international institutions, remains more a dream than anything else. Despite a lukewarm suggestion from Tsar Nicholas II, the Peace Palace sat unused during the July Crisis of 1914; after the First World War the League of Nations was most notable for its failure to prevent the Second; and the United Nations has for the most part proved sadly impotent in the face of wars, civil wars, and genocide. The international rules of war, agreed to at The Hague Peace Conference in 1899, have also been routinely flouted. 


James Cameron is Making a Documentary to Reassess the Accuracy of Titanic
20th Century Fox

While making the 1997 blockbuster Titanic, James Cameron was a stickler for the details. The writer-director wanted his homage to the tragic ocean liner to be as historically accurate as possible, so he organized dives to the site, solicited experts to analyze his script, and modeled the set on photographs and plans from the Titanic’s builders. He even recreated the ocean liner’s original furnishings, right down to the light fixtures. Now, 20 years after the film’s release, E! News reports that Cameron will scrutinize the film’s authenticity in an upcoming National Geographic documentary.

Titanic: 20th Anniversary is slated to air in December 2017. It will feature Cameron and a team of experts who, together, will evaluate the film's accuracy using new historical and scientific insights about the ship's fateful sinking on April 15, 1912.

“When I wrote the film, and when I set out to direct it, I wanted every detail to be as accurate as I could make it, and every harrowing moment of the ship’s final hours accounted for,” Cameron said in a statement. “I was creating a living history; I had to get it right out of respect for the many who died and for their legacy. But did I really get it right? Now, with National Geographic and with the latest research, science, and technology, I’m going to reassess.”

It's not the first time Cameron has revisited his Oscar-winning epic; in 2012, the director made some tweaks to the film for its 3-D re-release after receiving some criticism from renowned astrophysicist Neil deGrasse Tyson.

“Neil deGrasse Tyson sent me quite a snarky email saying that, at that time of year, in that position in the Atlantic in 1912, when Rose is lying on the piece of driftwood and staring up at the stars, that is not the star field she would have seen,” Cameron explained. “And with my reputation as a perfectionist, I should have known that and I should have put the right star field in.” So he changed it.

In the case of Titanic: 20th Anniversary, Cameron and his team will give viewers an updated interpretation of the Titanic’s sinking, and reexamine the wreck using new underwater footage, computer-generated simulation, and research. They’ll also scrutinize some of the film’s most famous scenes, and provide biographical context about the filming process.

We’re sure fans, historians, and, of course, Kate and Leo, will approve.

[h/t Mashable]

6 Eponyms Named After the Wrong Person
Salmonella species growing on agar.

Having something named after you is the ultimate accomplishment for any inventor, mathematician, scientist, or researcher. Unfortunately, the credit for an invention or discovery does not always go to the correct person—senior colleagues sometimes snatch the glory, fakers pull the wool over people's eyes, or the fickle general public just latches onto the wrong name.


In 1885, while investigating common livestock diseases at the Bureau of Animal Industry in Washington, D.C., pathologist Theobald Smith first isolated the Salmonella bacteria in pigs suffering from hog cholera. Smith’s research finally identified the bacteria responsible for one of the most common causes of food poisoning in humans. Unfortunately, Smith’s limelight-grabbing supervisor, Daniel E. Salmon, insisted on taking sole credit for the discovery. As a result, the bacteria was named after him. Don’t feel too sorry for Theobald Smith, though: He soon emerged from Salmon’s shadow, going on to make the important discovery that ticks could be a vector in the spread of disease, among other achievements.


An etching of Amerigo Vespucci
Henry Guttmann/Getty Images

Florentine explorer Amerigo Vespucci (1451–1512) claimed to have made numerous voyages to the New World, the first in 1497, before Columbus. Textual evidence suggests Vespucci did take part in a number of expeditions across the Atlantic, but generally does not support the idea that he set eyes on the New World before Columbus. Nevertheless, Vespucci’s accounts of his voyages—which today read as far-fetched—were hugely popular and translated into many languages. As a result, when German cartographer Martin Waldseemüller was drawing his map of the Novus Mundi (or New World) in 1507 he marked it with the name "America" in Vespucci’s honor. He later regretted the choice, omitting the name from future maps, but it was too late, and the name stuck.


A black and white image of young women wearing bloomers
Hulton Archive/Getty Images

Dress reform became a big issue in mid-19th century America, when women were restricted by long, heavy skirts that dragged in the mud and made any sort of physical activity difficult. Women’s rights activist Elizabeth Smith Miller was inspired by traditional Turkish dress to begin wearing loose trousers gathered at the ankle underneath a shorter skirt. Miller’s new outfit immediately caused a splash, with some decrying it as scandalous and others inspired to adopt the garb.

Amelia Jenks Bloomer was editor of the women’s temperance journal The Lily, and she took to copying Miller’s style of dress. She was so impressed with the new freedom it gave her that she began promoting the “reform dress” in her magazine, printing patterns so others might make their own. Bloomer sported the dress when she spoke at events and soon the press began to associate the outfit with her, dubbing it “Bloomer’s costume.” The name stuck.


Execution machines had been known prior to the French Revolution, but they were refined after Paris physician and politician Dr. Joseph-Ignace Guillotin suggested they might be a more humane form of execution than the usual methods (hanging, burning alive, etc.). The first guillotine was actually designed by Dr. Antoine Louis, Secretary of the Academy of Surgery, and was known as a louisette. The quick and efficient machine was quickly adopted as the main method of execution in revolutionary France, and as the bodies piled up the public began to refer to it as la guillotine, for the man who first suggested its use. Guillotin was very distressed at the association, and when he died in 1814 his family asked the French government to change the name of the hated machine. The government refused and so the family changed their name instead to escape the dreadful association.


Alison Bechdel
Steve Jennings/Getty Images

The Bechdel Test is a tool to highlight gender inequality in film, television, and fiction. The idea is that in order to pass the test, the movie, show, or book in question must include at least one scene in which two women have a conversation that isn’t about a man. The test was popularized by the cartoonist Alison Bechdel in 1985 in her comic strip “Dykes to Watch Out For,” and has since become known by her name. However, Bechdel asserts that the idea originated with her friend Lisa Wallace (and was also inspired by the writer Virginia Woolf), and she would prefer for it to be known as the Bechdel-Wallace test.


Influential sociologist Robert K. Merton suggested the idea of the “Matthew Effect” in a 1968 paper noting that senior colleagues who are already famous tend to get the credit for their junior colleagues’ discoveries. (Merton named his phenomenon after the parable of the talents in the Gospel of Matthew, in which wise servants invest money their master has given them.)

Merton was a well-respected academic, and when he was due to retire in 1979, a book of essays celebrating his work was proposed. One person who contributed an essay was University of Chicago professor of statistics Stephen Stigler, who had corresponded with Merton about his ideas. Stigler decided to pen an essay that celebrated and proved Merton’s theory. As a result, he took Merton’s idea and created Stigler’s Law of Eponymy, which states that “No scientific discovery is named after its original discoverer”—the joke being that Stigler himself was taking Merton’s own theory and naming it after himself. To further prove the rule, the “new” law has been adopted by the academic community, and a number of papers and articles have since been written on "Stigler’s Law."

