“Mirrors With Memories”: Why Did Victorians Take Pictures of Dead People?

Emil, Mary, and Anna Keller, 1894 murder-suicide, via the Thanatos Archive

“Secure the shadow, ere the substance fades.” That very early photographers’ slogan—introduced not long after Louis Daguerre announced his daguerreotype process in 1839—may seem ominous, but it reflects the reality of Victorian life. In an age before antibiotics, when infant mortality soared and the Civil War raged, death was a constant presence in the United States. And one prominent part of the process of memorializing the dead was taking a postmortem photo.

Postmortem photography evolved out of posthumous portraiture, a mode of painting in which wealthy Europeans (and eventually Americans) memorialized dead family members by depicting them alongside a slew of symbols, colors, and gestures associated with death. While the people—usually children—in these images might look reasonably healthy, the presence of a dead bird, a cut cord, drooping flowers, or a three-fingered grip (a reference to the holy trinity) often signaled that the subject was deceased. These types of images, popular in the 18th and early 19th centuries, served as cherished reminders of loved ones long gone.

By the 1840s, however, the production of memorial images started moving from the artist’s studio to the photography studio—and democratized in the process. No longer were the wealthy the only ones who could afford images of loved ones, in life or death. Photography studios spread throughout the country in the 1850s, and postmortem photography reached its height a few decades later. And whereas paintings might have cost large sums, and daguerreotypes were often luxuries, the ambrotypes and tintypes that followed sometimes went for just a few cents.

For the Victorians, the postmortem photo was just one aspect of an elaborate mourning ritual that often involved covering the house and body in as much black crepe as one could afford, as well as more intimate acts like washing the corpse, watching over it, and accompanying it to the gravesite. Early photos were sometimes referred to as “mirrors with memories,” and the Victorians saw photographing the dead as one way of preserving the memory of a family member. Photos of the dead were kept as keepsakes, displayed in homes, sent to friends and relatives, worn inside lockets, or even carried as pocket mirrors.

Photographing the dead, however, was a tricky business, and required careful manipulation of the body, props, and equipment, either at the photographer’s studio or at the home of the deceased. Though the majority of postmortem images depict the dead laid out in a bed or coffin, dead children were not infrequently placed in a mother’s lap to keep them upright (echoing the Victorian fashion for “hidden mother” portraits, in which a parent or assistant was draped in fabric as a backdrop with varying degrees of success). Adults were also most frequently shown in coffins, but occasionally photographed in chairs, sometimes holding a book or other props. After the photo session, photographers manipulated the negative, too—to make the dead person’s stare look less blank, or sometimes to paint pupils over closed eyelids.

Some sense of the difficulties of postmortem photography can be gleaned from remarks by leading daguerreotype photographer Albert Southworth printed in an 1873 edition of the Philadelphia Photographer: “If a person has died, and the friends are afraid that there will be a liquid ejected from the mouth, you can carefully turn them over just as though they were under the operation of an emetic. You can do that in less than a single minute, and every single thing will pass out, and you can wipe out the mouth and wash off the face, and handle them just as well as if they were well persons.”

Today, a lot of myths about postmortem photos circulate on the internet and among the general public. One of the biggest falsehoods, says Mike Zohn, co-owner of New York’s Obscura Antiques and Oddities and a longtime postmortem photography collector and dealer, is that the world’s photo albums are filled with lively-looking photos of dead people.

The Victorians “had no issue showing dead people as being dead,” Zohn tells mental_floss. “They did not try to make them look alive, that is a modern myth.” He cautions that Pinterest and other websites are full of images of living people who have been labeled as dead, sometimes with elaborate (but incorrect) explanations of the types of tools that have been used to keep them propped up. “The Victorians also did not use strings, wires, armatures, or anything else to pose the dead,” Zohn adds. “They weren’t meat puppets that were strung up and treated like meat. They were respectful and treated the dead with dignity.”

Part of the problem, writes noted postmortem photography collector and scholar Stanley Burns in Sleeping Beauty II: Grief, Bereavement and the Family in Memorial Photography, American & European Traditions, is that the dead of the 19th century often looked better than the dead of today. We tend to prolong life with measures that weren’t available to the Victorians, but the epidemics of the 19th century killed quickly. “Except for children who died from dehydration or from viruses that left conspicuous skin rashes, or adults who succumbed to cancer or extreme old age,” Burns writes, “the dead would often appear to be quite healthy.”

Zohn particularly cautions against the idea that Victorians used posing stands to create upright postmortems. “The posing stand is similar in design and strength to a modern day microphone stand,” he says. “There is no way it could possibly hold up the weight of a dead body. If you see a photo with a person and a stand behind them, it’s a guarantee that the person is alive.”

Jack Mord, who runs the postmortem-focused Thanatos Archive, agrees about the posing stands. “People see the base of these stands in photos and assume it’s there to stand a dead person up … but that was never, ever the case,” Mord says. “Basically, if you see the base of a posing stand in a photo, that’s an immediate sign that the person in the photo was alive, not dead.”

Both Zohn and Mord also point out that many people have a misperception about how expensive photography was during the 19th century. Zohn says, “You could easily get a tintype taken for less than five cents—in some cases as low as one or two cents. It was well within the reach of almost all but the very poor, yet some falsely believe it was so expensive that they could only afford to have one image taken and it would have been a post mortem.” While that might have been true when photography was first introduced—and it’s true that postmortems might have been the only photo ever made of an infant—it wasn’t a general rule.

Some books on postmortem photography mention checking the hands for signs the subject is dead, noting that swelling or discoloration can be a sign of death. But Zohn says it’s easy to misread this clue: “I’ve seen many images of clearly dead people with light-colored hands as well as clearly live people with dark hands. It’s usually caused by lighting and exposure, but could also be something such as suntanned hands that will appear darker.” A better clue, Zohn says, is the symbolism—flowers, folded hands, closed eyes. An adult lying stretched out on a bed with his or her shoes off can be a sign of a postmortem, since shoes can be hard to put on a corpse. And of course, if someone’s lying in a coffin, there’s a good chance they’re dead.

Postmortem photography more or less ended as a common practice by the 1930s in the United States, as social mores shifted away from prolonged public mourning, death became medicalized, and infant mortality rates improved. But “postmortems never truly ever ended,” Zohn says. Today, several companies specialize in taking photos of stillborn infants or newborns, and the practice of postmortem photography continues as a regular event in other parts of the world.

Today, most Americans have decided that our final image is the one we least want remembered. It’s easy for us to shut death out of our minds, and we don’t necessarily want reminders in our homes. But for the Victorians, death wasn’t weird—it was ordinary and ever-present. Burns writes that postmortems “were taken with the same lack of self-consciousness with which today’s photographer might document a party or a prom.”

Haral & Ferol Tromley, who died at home in Fremont Township, Michigan, of acute nephritis and edema of the lungs, October 1900.

Cabinet photo, circa 1905.

Philadelphia, Pennsylvania, circa 1848. Sabin W. Colton, photographer.

Silver print, ca. 1920s. On the back is written "Mrs. Conant after death."

Sixth-plate daguerreotype, circa 1845.

Sixth-plate daguerreotype, circa 1848.

"May Snyder, mother of Estell Snyder", circa 1898. Notice the photographer's reflection in the mirror.
Cabinet card; location unknown.

All photos via the Thanatos Archive, used with permission. Identifying information provided where known.

11 Facts About Johann Sebastian Bach

Illustration by Mental Floss. Image: Rischgitz, Getty Images

Johann Sebastian Bach is everywhere. Weddings? Bach. Haunted houses? Bach. Church? Bach. Shredding electric guitar solos? Look, it’s Bach! The Baroque composer produced more than 1100 works, from liturgical organ pieces to secular cantatas for orchestra, and his ideas about musical form and harmony continue to influence generations of music-makers. Here are 11 things you might not know about the man behind the music.

1. There's some disagreement about when he was actually born.

Some people celebrate Bach’s birthday on March 21. Other people light the candles on March 31. The correct date depends on whom you ask. Bach was born in Thuringia in 1685, when the German state was still observing the Julian calendar. Today, we use the Gregorian calendar, which in 1685 ran 10 days ahead of the Julian (hence March 21 becoming March 31). And while most biographies opt for the March 31 date, Bach scholar Christoph Wolff firmly roots for Team 21. “True, his life was actually 11 days longer because Protestant Germany adopted the Gregorian calendar in 1700,” he told Classical MPR, “but with the legal stipulation that all dates prior to Dec. 31, 1699, remain valid.”
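For anyone who wants to check that math, here is a minimal Python sketch (my own illustration, not something from the Bach literature) that applies the standard century rule for the Julian-to-Gregorian offset to Bach's recorded birthdate:

```python
from datetime import date, timedelta

def julian_gregorian_gap(year: int) -> int:
    # Days the Gregorian calendar runs ahead of the Julian calendar
    # for dates from March 1 of the given year onward.
    return year // 100 - year // 400 - 2

# Bach's birthdate as recorded under the Julian calendar.
julian_birthday = date(1685, 3, 21)

# Shortcut: treat the Julian calendar date as a plain date and add the gap
# to get the Gregorian calendar label for the same physical day.
gap = julian_gregorian_gap(1685)
gregorian_birthday = julian_birthday + timedelta(days=gap)

print(gap)                 # 10
print(gregorian_birthday)  # 1685-03-31
```

For 1685 the gap works out to 10 days, which is why Julian March 21 lines up with Gregorian March 31; by the time Protestant Germany switched calendars in 1700, the gap had grown to the 11 days Wolff mentions.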

2. He was at the center of a musical dynasty.

Bach’s great-grandfather was a piper. His grandfather was a court musician. His father was a violinist, organist, court trumpeter, and kettledrum player. At least two of his uncles were composers. He had five brothers—all named Johann—and the three who lived to adulthood became musicians. J.S. Bach also had 20 children, and, of those who lived past childhood, at least five became professional composers. According to the Nekrolog, an obituary written by Bach’s son Carl Philipp Emanuel Bach, "[S]tarting with Veit Bach, the founding father of this family, all his descendants, down to the seventh generation, have dedicated themselves to the profession of music, with only a few exceptions."

3. He took a musical pilgrimage that puts every road trip to Woodstock to shame.

In 1705, 20-year-old Bach walked 280 miles—that's right, walked—from the city of Arnstadt to Lübeck in northern Germany to hear a concert by the influential organist and composer Dieterich Buxtehude. He stuck around for four months to study with the musician [PDF]. Bach hoped to succeed Buxtehude as the organist of Lübeck's St. Mary's Church, but marriage to one of Buxtehude's daughters was a prerequisite to taking over the job. Bach declined, and walked back home.

4. He brawled with his students.

One of Bach’s first jobs was as a church organist in Arnstadt. When he signed up for the role, nobody told him he also had to teach a student choir and orchestra, a responsibility Bach hated. Not one to mince words, Bach one day lost patience with an error-prone bassoonist, Johann Geyersbach, and called him a zippelfagottist—that is, a “nanny-goat bassoonist.” Those were fighting words. Days later, Geyersbach attacked Bach with a walking stick. Bach pulled a dagger. The rumble escalated into a full-blown scrum that required the two to be pulled apart.

5. He spent 30 days in jail for quitting his job.

When Bach took a job in 1708 as a chamber musician in the court of the Duke of Saxe-Weimar, he once again assumed a slew of responsibilities that he never signed up for. This time, he took it in stride, believing his hard work would lead to his promotion to kapellmeister (music director). But after five years, the top job was handed to the former kapellmeister’s son. Furious, Bach resigned and joined a rival court. As retribution, the duke jailed him for four weeks. Bach spent his time in the slammer writing preludes for organ.

6. The Brandenburg Concertos were a failed job application.

Around 1721, Bach was the head of court music for Prince Leopold of Anhalt-Köthen. Unfortunately, the composer reportedly didn’t get along with the prince’s new wife, and he started looking for a new gig. (Notice a pattern?) Bach polished some manuscripts that had been sitting around and mailed them to a potential employer, Christian Ludwig, the Margrave of Brandenburg. That package, which included the Brandenburg Concertos—now considered some of the most important orchestral compositions of the Baroque era—failed to get Bach the job [PDF].

7. He wrote an amazing coffee jingle.

Bach apparently loved coffee enough to write a song about it: "Schweigt stille, plaudert nicht" ("Be still, stop chattering"). Performed in 1735 at Zimmermann’s coffee house in Leipzig, the song is about a coffee-obsessed woman whose father wants her to stop drinking the caffeinated stuff. She rebels and sings this stanza:

Ah! How sweet coffee tastes
More delicious than a thousand kisses
Milder than muscatel wine.
Coffee, I have to have coffee,
And, if someone wants to pamper me,
Ah, then bring me coffee as a gift!

8. If Bach challenged you to a keyboard duel, you were guaranteed to be embarrassed.

In 1717, Louis Marchand, a harpsichordist from France, was invited to play for Augustus, Elector of Saxony, and performed so well that he was offered a position playing for the court. This annoyed the court’s concertmaster, who found Marchand arrogant and insufferable. To scare the French harpsichordist away, the concertmaster hatched a plan with his friend, J.S. Bach: a keyboard duel. Bach and Marchand would improvise over a number of different styles, and the winner would take home 500 talers. But when Marchand learned just how talented Bach was, he hightailed it out of town.

9. Some of his music may have been composed to help with insomnia.

Some people are ashamed to admit that classical music, especially the Baroque style, makes them sleepy. Be ashamed no more! According to Bach’s earliest biographer, the Goldberg Variations were composed to help Count Hermann Karl von Keyserling overcome insomnia. (This story, to be fair, is disputed.) Whatever the truth, it hasn’t stopped the Andersson Dance troupe from presenting a fantastic Goldberg-based tour of performances called “Ternary Patterns for Insomnia.” Sleep researchers have also suggested studying the tunes’ effects on sleeplessness [PDF].

10. A botched eye surgery blinded him.

When Bach was 65, he had eye surgery. The “couching” procedure, which was performed by a traveling surgeon named John Taylor, involved shoving the cataract deep into the eye with a blunt instrument. Post-op, Taylor gave the composer eye drops that contained pigeon blood, mercury, and pulverized sugar. It didn’t work. Bach went blind and died shortly after. Meanwhile, Taylor moved on to botch more musical surgeries. He would perform the same procedure on the composer George Frideric Handel, who also went blind.

11. Nobody is 100 percent confident that Bach is buried in his grave.

In 1894, the pastor of St. John’s Church in Leipzig wanted to move the composer’s body out of the church graveyard to a more dignified setting. There was one small problem: Bach had been buried in an unmarked grave, as was common for regular folks at the time. According to craniologist Wilhelm His, a dig crew tried its best to find the composer but instead found “heaps of bones, some in many layers lying on top of each other, some mixed in with the remains of coffins, others already smashed by the hacking of the diggers.” The team later claimed to find Bach’s box, but there’s doubt they found the right (de)composer. Today, Bach supposedly resides in Leipzig’s St. Thomas Church.

15 Positively Reinforcing Facts About B.F. Skinner

Silly rabbit via Wikimedia Commons // CC BY 3.0

Burrhus Frederic Skinner was one of the preeminent American psychologists of the 20th century. He founded “radical behaviorism”—a twist on traditional behaviorism, a field of psychology that focused exclusively on observable human behavior. Thoughts, feelings, and perceptions were cast aside as unobservable.

B.F. Skinner dubbed his own method of observing behavior “operant conditioning,” which posited that behavior is determined solely by its consequences—either reinforcements or punishments. He also coined the term "positive reinforcement." 

To Skinner’s critics, the idea that these “principles of reinforcement,” as he called them, lead to easy “behavior modification” suggested that we do not have free will and are little more than automatons acting in response to stimuli. But his fans considered him visionary. Controversial to the end, B.F. Skinner was well known for his unconventional methods, unusual inventions, and utopian—some say dystopian—ideas about human society.

1. B.F. Skinner invented the "operant conditioning" or "Skinner" box.

Skinner believed that the best way to understand behavior is to look at the causes of an action and its consequences. He called this approach “operant conditioning.” Skinner began by studying rats interacting with an environment inside a box, where they were rewarded with a pellet of food for responding to a stimulus like light or sound with desired behavior. This simple experiment design would over the years take on dark metaphorical meaning: Any environment that had mechanisms in place to manipulate or control behavior could be called a "Skinner box." Recently, some have argued that social media is a sort of digital Skinner box: Likes, clicks, and shares are the pellet-like rewards we get for responding to our environment with certain behavior. Yes, we are the rats.

2. B.F. Skinner believed that all behavior was affected by one of three "operants."

Skinner proposed there were only three “operants” that affected human behavior. Neutral operants were responses from the environment that had a benign effect on a behavior. Reinforcers were responses that increased the likelihood of a behavior’s repetition. And punishers decreased the likelihood of a behavior’s repetition. While he was correct that behavior can be modified via this system, it’s only one of many methods for doing so, and it failed to take into account how emotions, thoughts, and—as we eventually learned—the brain itself account for changes in behavior.
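To make the mechanics concrete, here is a toy simulation in Python (my own sketch, not anything Skinner wrote or built) in which a reinforcer nudges the probability of repeating a behavior upward and a punisher nudges it downward; a neutral operant would simply leave the probability alone:

```python
import random

class Subject:
    """Toy operant-conditioning model: the odds of repeating a behavior
    drift up after reinforcement and down after punishment."""

    def __init__(self, p_behavior=0.5, step=0.1):
        self.p_behavior = p_behavior  # probability of performing the behavior
        self.step = step              # how strongly each consequence shifts it

    def acts(self) -> bool:
        """One trial: does the subject perform the behavior?"""
        return random.random() < self.p_behavior

    def reinforce(self):
        """Reinforcer: makes the behavior more likely on future trials."""
        self.p_behavior = min(1.0, self.p_behavior + self.step)

    def punish(self):
        """Punisher: makes the behavior less likely on future trials."""
        self.p_behavior = max(0.0, self.p_behavior - self.step)

# Reward the behavior every time it occurs over 20 trials:
# the probability of repeating it climbs toward 1.
rat = Subject()
for _ in range(20):
    if rat.acts():
        rat.reinforce()  # e.g., a food pellet
print(round(rat.p_behavior, 2))
```

Rewarding the behavior on every trial drives its probability toward 1, while punishing it on every trial would drive it toward 0; that is the toy version of reinforcement strengthening a behavior.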

3. He's responsible for the term "positive reinforcement."

B.F. Skinner eventually moved on to studying pigeons in his Skinner box. The pigeons would peck at a disc to gain access to food at various intervals, and for completing certain tasks. From this Skinner concluded that some form of reinforcement was crucial in learning new behaviors. To his mind, positive reinforcement strengthens a behavior by providing a consequence an individual finds rewarding. He concluded that reinforced behavior tends to be repeated and strengthened.

4. Some critics felt "positive reinforcement" amounted to bribery.

Critics doubted that Skinner's brand of behavior modification, positive reinforcement of desired behavior, could actually produce lasting change, arguing that it amounted to little more than a temporary reward, like bribery, for a short-term shift in behavior.

5. B.F. Skinner's idea of "negative reinforcement" isn't what you think.

Skinner believed negative reinforcement also helped to strengthen behavior; this doesn't mean exposing an animal or person to a negative stimulus, but rather removing an “unpleasant reinforcer.” The idea was that removing the negative stimulus would feel like a “reward” to the animal or person.

6. B.F. Skinner taught pigeons to play ping-pong.

As part of his research into positive reinforcement, he taught pigeons to play ping-pong as a first step in seeing how trainable they were. He ultimately wanted to teach them to guide bombs and missiles and even convinced the military to fund his research to that effect. He liked working with pigeons because they responded well to reinforcements and punishments, thus validating his theories. We know now that pigeons can be trained in a whole host of tasks, including distinguishing written words from nonsense and spotting cancer.

7. B.F. Skinner's first book, The Behavior of Organisms, broke new ground.

Published in 1938, Skinner’s debut book made the case that simple observation of cause and effect, reward and punishment, was as significant to understanding behavior as other “conceptual or neural processes.”

Skinner believed behavior was everything. Thoughts and feelings were just unreliable byproducts of behaviors, he argued—and therefore dismissed them. Many of his fellow psychologists disagreed. Regardless, Skinner’s theories contributed to a greater understanding of the relationship between stimuli and resulting behavior and may have even laid the groundwork for understanding the brain’s reward circuitry, which centers on dopamine-signaling structures such as the ventral tegmental area and the nucleus accumbens.

8. B.F. Skinner created the "baby tender."

Skinner was fond of inventions, and having children gave him a new outlet for his tendencies. He designed a special crib for his infant daughter called “the baby tender.” The clear box, with air holes, was heated so that the baby didn't need blankets. Unlike typical cribs, there were no slats in the sides, which he said prevented possible injury. Unsurprisingly, it did not catch on with the public.

9. B.F. Skinner also developed his own "teaching machine."



You may have Skinner to thank for modern school workbooks and test-taking procedures. In 1954 Skinner visited his daughter’s classroom and found himself frustrated with the “inefficiencies” of the teaching procedures. His first "teaching machine"—a very basic program to improve teaching methods for spelling, math, and other school subjects—was little more than a fill-in-the-blank method, whether in a workbook or on a computer. It’s now considered a precursor to computer-assisted learning programs.

10. Skinner imagined an ideal society based on his theories of human behavior.

Skinner admired Henry David Thoreau’s famous book Walden, in which Thoreau writes about his retreat to the woods to get in greater contact with his inner nature. Skinner's "Ten Commandments" for a utopian world include: “(1) No way of life is inevitable. Examine your own closely. (2) If you do not like it, change it. (3) But do not try to change it through political action. Even if you succeed in gaining power, you will not likely be able to use it any more wisely than your predecessors. (4) Ask only to be left alone to solve your problems in your own way. (5) Simplify your needs. Learn how to be happy with fewer possessions.”

11. B.F. Skinner wrote a utopian novel, Walden Two.

Though inspired by Walden, Skinner also felt the book was too self-indulgent, so he wrote his own fictional follow-up with the 1948 novel Walden Two. The book proposed a type of utopian—some say dystopian—society that employed a system of behavior modification based on operant conditioning. This system of rewards and punishments would, Skinner proposed, make people into good citizens:

“We can achieve a sort of control under which the controlled, though they are following a code much more scrupulously than was ever the case under the old system, nevertheless feel free. They are doing what they want to do, not what they are forced to do. That's the source of the tremendous power of positive reinforcement—there's no restraint and no revolt. By careful cultural design, we control not the final behavior, but the inclination to behave—the motives, desires, the wishes.”

12. Some felt Skinner's ideas were reductionist ...

Critics, of which there were many, felt he reduced human behavior to a series of actions and reactions: that an individual human “mind” only existed in a social context, and that humans could be easily manipulated by external cues. He did not put much stock in his critics. Even at age 83, just three years before he died, he told Daniel Goleman in a 1987 New York Times article, “I think cognitive psychology is a great hoax and a fraud, and that goes for brain science, too. They are nowhere near answering the important questions about behavior.”

13. ... and others were horrified by Walden Two.

Astronomer and colleague JK Jessup wrote, “Skinner's utopian vision could change the nature of Western civilization more disastrously than the nuclear physicists and biochemists combined.”

14. B.F. Skinner implied that humans had no free will or individual consciousness.

In the late 1960s and early '70s, Skinner wrote several works applying his behavioral theories to society, including Beyond Freedom and Dignity (1971). He drew fire for implying that humans had no free will or individual consciousness but could simply be controlled by reward and punishment. His critics shouldn't have been surprised: this was the very essence of his behaviorism. He, however, was unconcerned with criticism. His daughter Julie S. Vargas has written that “Skinner felt that by answering critics (a) you showed that their criticism affected you; and (b) you gave them attention, thus raising their reputation. So he left replies to others.”

15. He died convinced that the fate of humanity lay in applying his methods of behavioral science to society.

In 1990, he died of leukemia at age 86 after receiving a Lifetime Achievement Award from the American Psychological Association. Proud of his work, he was nonetheless concerned about the fate of humanity and worried “about daily life in Western culture, international conflict and peace, and why people were not acting to save the world.”
