Why Did People Wear Powdered Wigs?

For nearly two centuries, powdered wigs—called perukes—were all the rage. The chic hairpiece would never have become popular, however, if it hadn't been for a venereal disease, a pair of self-conscious kings, and poor hair hygiene.

The peruke’s story begins like many others—with syphilis. By 1580, the STD had become the worst epidemic to strike Europe since the Black Death. According to the surgeon William Clowes, an “infinite multitude” of syphilis patients clogged London’s hospitals, and more filtered in each day. Without antibiotics, victims faced the full brunt of the disease: open sores, nasty rashes, blindness, dementia, and patchy hair loss. Baldness swept the land.

At the time, hair loss was a one-way ticket to public embarrassment. Long hair was a trendy status symbol, and a bald dome could stain any reputation. When Samuel Pepys’s brother acquired syphilis, the diarist wrote, “If [my brother] lives, he will not be able to show his head—which will be a very great shame to me.” Hair was that big of a deal.

Cover-Up

And so, the syphilis outbreak sparked a surge in wigmaking. Victims hid their baldness, as well as the bloody sores that scoured their faces, with wigs made of horse, goat, or human hair. Perukes were also coated with powder—scented with lavender or orange—to hide any funky aromas. Although common, wigs were not exactly stylish. They were just a shameful necessity. That changed in 1655, when the King of France started losing his hair.

Louis XIV was only 17 when his mop started thinning. Worried that baldness would hurt his reputation, Louis hired 48 wigmakers to save his image. Five years later, the King of England—Louis’s cousin, Charles II—did the same thing when his hair started to gray (both men likely had syphilis). Courtiers and other aristocrats immediately copied the two kings. They sported wigs, and the style trickled down to the upper-middle class. Europe’s newest fad was born.

The cost of wigs increased, and perukes became a scheme for flaunting wealth. An everyday wig cost about 25 shillings—a week’s pay for a common Londoner. The bill for large, elaborate perukes ballooned to as high as 800 shillings. The word “bigwig” was coined to describe snobs who could afford big, poufy perukes.

When Louis and Charles died, wigs stayed around. Perukes remained popular because they were so practical. At the time, head lice were everywhere, and nitpicking was painful and time-consuming. Wigs, however, curbed the problem. Lice stopped infesting people’s hair—which had to be shaved for the peruke to fit—and camped out on wigs instead. Delousing a wig was much easier than delousing a head of hair: you’d send the dirty headpiece to a wigmaker, who would boil the wig and remove the nits.

Wig Out

By the late 18th century, the trend was dying out. French citizens ousted the peruke during the Revolution, and Brits stopped wearing wigs after William Pitt levied a tax on hair powder in 1795. Short, natural hair became the new craze, and it would stay that way for another two centuries or so.

Is There An International Standard Governing Scientific Naming Conventions?


Jelle Zijlstra:

There are lots of different systems of scientific names with different conventions or rules governing them: chemicals, genes, stars, archeological cultures, and so on. But the one I'm familiar with is the naming system for animals.

The modern naming system for animals derives from the works of the 18th-century Swedish naturalist Carl von Linné (Latinized to Carolus Linnaeus). Linnaeus introduced the system of binominal nomenclature, in which animals have names composed of two parts, like Homo sapiens. Linnaeus wrote in Latin and most of his names were of Latin origin, although a few were derived from Greek, like Rhinoceros for rhinos, or from other languages, like Sus babyrussa for the babirusa (from Malay).

Other people also started using Linnaeus's system, and a system of rules was developed and eventually codified into what is now called the International Code of Zoological Nomenclature (ICZN). In this case, therefore, there is indeed an international standard governing naming conventions. However, it does not put very strict requirements on the derivation of names: they are merely required to be in the Latin alphabet.

In practice a lot of well-known scientific names are derived from Greek. This is especially true for genus names: Tyrannosaurus, Macropus (kangaroos), Drosophila (fruit flies), Caenorhabditis (nematode worms), Peromyscus (deermice), and so on. Species names are more likely to be derived from Latin (e.g., T. rex, C. elegans, P. maniculatus, but Drosophila melanogaster is Greek again).

One interesting pattern I've noticed in mammals is that even when Linnaeus named the first genus in a group by a Latin name, usually most later names for related genera use Greek roots instead. For example, Linnaeus gave the name Mus to mice, and that is still the genus name for the house mouse, but most related genera use compounds of the Greek-derived root -mys (from μῦς), which also means "mouse." Similarly, bats for Linnaeus were Vespertilio, but there are many more compounds of the Greek root -nycteris (νυκτερίς); pigs are Sus, but compounds usually use Greek -choerus (χοῖρος) or -hys/-hyus (ὗς); weasels are Mustela but compounds usually use -gale or -galea (γαλέη); horses are Equus but compounds use -hippus (ἵππος).

This post originally appeared on Quora.

Can Soap Get Dirty?


When you see lovely little bars of lemon-thyme or lavender hand soaps on the rim of a sink, you know they are there to make you feel as fresh as a gardenia-scented daisy. We all know washing our hands is important, but, like washcloths and towels, can the bars of hand soap we use to clean ourselves become dirty as well?

Soaps are simply mixtures of sodium or potassium salts of fatty acids, produced by reacting fats with an alkali solution in a process called saponification. Each soap molecule is made of a long, non-polar, hydrophobic (repelled by water) hydrocarbon chain (the "tail") capped by a polar, hydrophilic (water-soluble) "salt" head. Because soap molecules have both polar and non-polar properties, they're great emulsifiers, which means they can disperse one liquid into another.
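As a rough sketch of that reaction (using a generic triglyceride and sodium hydroxide—real fats are mixtures, so this is a schematic rather than an exact formula), saponification looks like:

\[
\text{fat (triglyceride)} + 3\,\mathrm{NaOH} \;\longrightarrow\; \text{glycerol} + 3\,\mathrm{R\text{–}COO^{-}Na^{+}}\ \text{(soap)}
\]

Here R stands for the long hydrocarbon chain that becomes the soap molecule's water-repelling tail, while the charged carboxylate head (COO⁻Na⁺) is the water-soluble end.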

When you wash your dirty hands with soap and water, the tails of the soap molecules are repelled by water and attracted to oils, which attract dirt. The tails cluster together and form structures called micelles, trapping the dirt and oils. The micelles are negatively charged and soluble in water, so they repel each other and remain dispersed in water—and can easily be washed away.

So, yes, soap does indeed get dirty. That's sort of how it gets your hands clean: by latching onto grease, dirt, and oil more strongly than your skin does. Of course, when you're using soap, you're washing all those loose, dirt-trapping, dirty soap molecules away, but a bar of soap sitting on the bathroom counter or liquid soap in a bottle can also be contaminated with microorganisms.

This doesn't seem to be much of a problem, though. In the few studies that have been done on the matter, test subjects were given bars of soap laden with E. coli and other bacteria and instructed to wash up. None of the studies found any evidence of bacteria transfer from the soap to the subjects' hands. (It should be noted that two of these studies were conducted by Procter & Gamble and the Dial Corp., though no contradictory evidence has been found.)

Dirty soap can't clean itself, though. A contaminated bar of soap gets cleaned via the same mechanical action that helps clean you up when you wash your hands: good ol' fashioned scrubbing. The friction from rubbing your hands against the soap, as well as the flushing action of running water, removes any harmful microorganisms from both your hands and the soap and sends them down the drain.

This story was updated in 2019.
