Ada Lovelace: The First Computer Programmer

Portrait of Ada Lovelace by Alfred Edward Chalon

Ada Lovelace has been called the world's first computer programmer: she wrote the world's first machine algorithm for an early computing machine that existed only on paper. Of course, someone had to be the first, but Lovelace was a woman, and this was in the 1840s. She was a brilliant mathematician, thanks in part to educational opportunities that most women of her time were denied.

Ada Byron was a teenager when she met Cambridge mathematics professor Charles Babbage, who had invented the Difference Engine, a mechanical computer designed to produce mathematical tables automatically and without error. Babbage never built the full machine, owing to personal setbacks and financing difficulties. By 1834 he had moved on to designing his Analytical Engine, the first general-purpose computer, which used punch cards for input and output. This machine also lacked financing and was never built. (Babbage's Difference Engine was finally constructed between 1985 and 2002, and it worked.)

An original model of part of the Analytical Engine. Photograph by Bruno Barral (ByB)

Babbage was impressed with the brilliant young woman, and they corresponded for years, discussing math and computing as he developed the Analytical Engine. In 1840, Babbage gave a lecture on the engine in Turin. Luigi Menabrea, a mathematician (and future Italian prime minister), wrote up the lecture in French and published it in 1842. Ada, by then in her late 20s and known as the Countess of Lovelace, was commissioned to translate the paper into English. Lovelace added her own notes, which ended up being three times as long as the original paper. The translation was published in 1843.

Lovelace's notes made it clear that she understood the Analytical Engine as well as Babbage himself, and furthermore, she understood how to make it do the things computers do. She suggested the data input that would program the machine to calculate Bernoulli numbers, which is now considered the first computer program. But more than that, Lovelace was a visionary: she understood that numbers could be used to represent more than just quantities, and a machine that could manipulate numbers could be made to manipulate any data represented by numbers. She predicted that machines like the Analytical Engine could be used to compose music, produce graphics, and be useful to science. Of course, all that came true—in another 100 years. 
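Her program took the form of a table of operations and variable cards for the Engine rather than code in any modern sense, but the arithmetic it performed is easy to illustrate today. Below is a minimal Python sketch, a modern stand-in rather than a reconstruction of Lovelace's actual card sequence, that generates Bernoulli numbers from the standard binomial-coefficient recurrence.

from fractions import Fraction
from math import comb  # binomial coefficients (Python 3.8+)

def bernoulli_numbers(n):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions.

    Uses the standard recurrence
        sum_{k=0}^{m} C(m+1, k) * B_k = 0   for m >= 1, with B_0 = 1,
    a modern restatement of the calculation, not Lovelace's exact procedure.
    """
    b = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, k) * b[k] for k in range(m))
        b.append(-acc / (m + 1))
    return b

if __name__ == "__main__":
    for i, value in enumerate(bernoulli_numbers(8)):
        print(f"B_{i} = {value}")

Run as a script, this prints B_0 through B_8 as exact fractions (1, -1/2, 1/6, 0, -1/30, ...), the same sequence Lovelace's Note G set out to compute step by step on the Engine.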

Babbage was so impressed with Lovelace's contributions that he dubbed her "The Enchantress of Numbers."

How did a young woman get the opportunity to show the world her talents in the 19th century? Mathematical intelligence was not the only thing Ada Lovelace had going for her. Her gifts may well have been inherited: she was the daughter of the poet Lord Byron and his wife Anne Isabella Noel Byron. Both were privileged members of the aristocracy, and both were gifted and well educated. The marriage broke up shortly after Ada was born.

Lady Byron, who studied literature, science, philosophy, and, most unusually for a woman of the era, mathematics, was determined that Ada not follow in her father's footsteps. Instead of art and literature, Ada was tutored in mathematics and science. She excelled in all her studies, and her interests were wide-ranging. Ada became a baroness in 1835 when she married William King, 8th Baron King; the two had three children. In 1838, she became Countess of Lovelace when her husband was elevated to Earl of Lovelace. Her pedigree and peerage alone would have landed Lovelace in the history books, but her accomplishments in mathematics made her a pioneer not only of computing but of women in science.

Lovelace died of cancer in 1852, when she was only 36. More than 150 years later, we remember her contributions to science and engineering with Ada Lovelace Day, observed on the second Tuesday of October. First celebrated in 2009 (in March), it is a day set aside to learn about women in science, technology, engineering, and mathematics.

James Cameron is Making a Documentary to Reassess the Accuracy of Titanic

While making the 1997 blockbuster Titanic, James Cameron was a stickler for detail. The writer-director wanted his homage to the tragic ocean liner to be as historically accurate as possible, so he organized dives to the wreck site, solicited experts to analyze his script, and modeled the set on photographs and plans from the Titanic's builders. He even recreated the ocean liner's original furnishings, right down to the light fixtures. Now, 20 years after the film's release, E! News reports that Cameron will scrutinize the film's authenticity in an upcoming National Geographic documentary.

Titanic: 20th Anniversary is slated to air in December 2017. It will feature Cameron and a team of experts who, together, will evaluate the film's accuracy using new historical and scientific insights about the ship's fateful sinking on April 15, 1912.

"When I wrote the film, and when I set out to direct it, I wanted every detail to be as accurate as I could make it, and every harrowing moment of the ship's final hours accounted for," Cameron said in a statement. "I was creating a living history; I had to get it right out of respect for the many who died and for their legacy. But did I really get it right? Now, with National Geographic and with the latest research, science, and technology, I'm going to reassess."

It's not the first time Cameron has revisited his Oscar-winning epic; in 2012, the director made some tweaks to the film for its 3-D re-release after receiving some criticism from renowned astrophysicist Neil deGrasse Tyson.

“Neil deGrasse Tyson sent me quite a snarky email saying that, at that time of year, in that position in the Atlantic in 1912, when Rose is lying on the piece of driftwood and staring up at the stars, that is not the star field she would have seen," Cameron explained. “And with my reputation as a perfectionist, I should have known that and I should have put the right star field in." So he changed it.

In Titanic: 20th Anniversary, Cameron and his team will give viewers an updated interpretation of the Titanic's sinking and reexamine the wreck using new underwater footage, computer-generated simulations, and research. They'll also scrutinize some of the film's most famous scenes and provide background on the filming process.

We’re sure fans, historians, and, of course, Kate and Leo, will approve.

[h/t Mashable]

6 Eponyms Named After the Wrong Person
Salmonella species growing on agar.

Having something named after you is the ultimate accomplishment for any inventor, mathematician, scientist, or researcher. Unfortunately, the credit for an invention or discovery does not always go to the correct person—senior colleagues sometimes snatch the glory, fakers pull the wool over people's eyes, or the fickle general public just latches onto the wrong name.

1. SALMONELLA (OR SMITHELLA?)

In 1885, while investigating common livestock diseases at the Bureau of Animal Industry in Washington, D.C., pathologist Theobald Smith first isolated Salmonella bacteria in pigs suffering from hog cholera. Smith's research thus identified the organism behind one of the most common causes of food poisoning in humans. Unfortunately, Smith's limelight-grabbing supervisor, Daniel E. Salmon, insisted on taking sole credit for the discovery. As a result, the bacteria were named after him. Don't feel too sorry for Theobald Smith, though: He soon emerged from Salmon's shadow, going on to make the important discovery that ticks can be a vector in the spread of disease, among other achievements.

2. AMERICA (OR COLUMBIANA?)

An etching of Amerigo Vespucci
Henry Guttmann/Getty Images

Florentine explorer Amerigo Vespucci (1451–1512) claimed to have made numerous voyages to the New World, the first in 1497, which would have put him on the American mainland before Columbus. Textual evidence suggests Vespucci did take part in a number of expeditions across the Atlantic, but it generally does not support the idea that he set eyes on the New World before Columbus. Nevertheless, Vespucci's accounts of his voyages, which today read as far-fetched, were hugely popular and translated into many languages. As a result, when German cartographer Martin Waldseemüller was drawing his map of the Mundus Novus (or New World) in 1507, he marked it with the name "America" in Vespucci's honor. He later regretted the choice, omitting the name from future maps, but it was too late, and the name stuck.

3. BLOOMERS (OR MILLERS?)

A black and white image of young women wearing bloomers
Hulton Archive/Getty Images

Dress reform became a big issue in mid-19th century America, when women were restricted by long, heavy skirts that dragged in the mud and made any sort of physical activity difficult. Women’s rights activist Elizabeth Smith Miller was inspired by traditional Turkish dress to begin wearing loose trousers gathered at the ankle underneath a shorter skirt. Miller’s new outfit immediately caused a splash, with some decrying it as scandalous and others inspired to adopt the garb.

Amelia Jenks Bloomer was editor of the women’s temperance journal The Lily, and she took to copying Miller’s style of dress. She was so impressed with the new freedom it gave her that she began promoting the “reform dress” in her magazine, printing patterns so others might make their own. Bloomer sported the dress when she spoke at events and soon the press began to associate the outfit with her, dubbing it “Bloomer’s costume.” The name stuck.

4. GUILLOTINE (OR LOUISETTE?)

Execution machines had been known prior to the French Revolution, but they were refined after Paris physician and politician Dr. Joseph-Ignace Guillotin suggested they might be a more humane form of execution than the usual methods (hanging, burning alive, etc.). The first guillotine was actually designed by Dr. Antoine Louis, Secretary of the Academy of Surgery, and was known as a louisette. The quick and efficient machine was soon adopted as the main method of execution in revolutionary France, and as the bodies piled up the public began to refer to it as la guillotine, after the man who had first suggested its use. Guillotin was deeply distressed by the association, and when he died in 1814 his family asked the French government to change the name of the hated machine. The government refused, so the family changed their own name instead.

5. BECHDEL TEST (OR WALLACE TEST?)

Alison Bechdel
Steve Jennings/Getty Images

The Bechdel Test is a tool to highlight gender inequality in film, television, and fiction. The idea is that in order to pass the test, the movie, show, or book in question must include at least one scene in which two women have a conversation about something other than a man. The test was popularized by the cartoonist Alison Bechdel in 1985 in her comic strip "Dykes to Watch Out For," and has since become known by her name. However, Bechdel asserts that the idea originated with her friend Liz Wallace (and was also inspired by the writer Virginia Woolf), and she would prefer it to be known as the Bechdel-Wallace test.

6. STIGLER’S LAW OF EPONYMY (OR MERTON’S LAW?)

Influential sociologist Robert K. Merton suggested the idea of the "Matthew Effect" in a 1968 paper noting that senior colleagues who are already famous tend to get the credit for their junior colleagues' discoveries. (Merton named the phenomenon after the parable of the talents in the Gospel of Matthew, in which wise servants invest the money their master has given them.)

Merton was a well-respected academic, and when he was due to retire in 1979, a book of essays celebrating his work was proposed. One person who contributed an essay was University of Chicago professor of statistics Stephen Stigler, who had corresponded with Merton about his ideas. Stigler decided to pen an essay that celebrated and proved Merton’s theory. As a result, he took Merton’s idea and created Stigler’s Law of Eponymy, which states that “No scientific discovery is named after its original discoverer”—the joke being that Stigler himself was taking Merton’s own theory and naming it after himself. To further prove the rule, the “new” law has been adopted by the academic community, and a number of papers and articles have since been written on "Stigler’s Law."
