Irish Teeth Reveal the Chemical Signature of the Great Famine

The Custom House Famine Memorial in Dublin. Image credit: William Murphy via Flickr // CC BY-SA 2.0

 
The Great Famine in Ireland, one of the worst famines in human history, lasted from 1845 to 1852. Sometimes called the “Irish Potato Famine” because a blight ravaged the crop on which many Irish diets depended, this period saw the population of Ireland decrease by about one-quarter. Around 1 million people died from starvation and disease, while another million or so left Ireland for new lives elsewhere in Europe and in the U.S. While the famine is well known historically, research into its physical effects is a comparatively new topic in archaeology.

A novel study by Julia Beaumont of the University of Bradford and Janet Montgomery of Durham University, published recently in PLOS ONE, tackles the question of how to identify famine and other chronic stress in individual skeletons. They focus their analysis on human remains from the Kilkenny Union Workhouse in southeast Ireland, just one of the many workhouses that sprang up after 1838, when a law was passed to help “remedy” poverty by institutionalizing the poor and making them work long hours. Individuals and entire families would enter the workhouse, which was segregated by age and sex, overcrowded, and full of sick people.

At least 970 people were buried in mass graves at Kilkenny, in unconsecrated ground. The researchers focused on the teeth of 20 of them, representing a cross-section of age and sex. Six had died before age 9.

The failure of the potato crop shortly after the emergence of Irish workhouses meant reduced food for the poor and, as a consequence, a significant amount of sickness and death among this vulnerable population. Although the government was slow to respond to the food crisis, it eventually began to import corn from America to feed the poor. This introduction of corn is particularly helpful archaeologically, because corn's chemical composition is very different from that of potatoes and Old World grains: corn is a C4 plant, while potatoes and wheat are C3 plants, so a corn-heavy diet leaves tissue measurably richer in the heavier carbon-13 isotope. Archaeologists who analyze human bones and teeth can therefore see the dramatic difference between corn-based and wheat-based diets by measuring the ratio of the two carbon isotopes in the skeleton.
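For the curious, the arithmetic behind such measurements is straightforward. Isotope ratios are conventionally reported in “delta” notation, as per-mil (‰) deviations from an international reference standard (VPDB, in the case of carbon). Here is a minimal Python sketch of that conversion; the sample ratios below are illustrative textbook-style values, not data from the study:

```python
# Delta notation: the per-mil deviation of a sample's isotope ratio
# from an international reference standard (VPDB for carbon).
VPDB_13C_12C = 0.011180  # approximate 13C/12C ratio of the VPDB standard

def delta_13c(sample_ratio: float) -> float:
    """Convert a raw 13C/12C ratio to delta-13C in per mil (per mille)."""
    return (sample_ratio / VPDB_13C_12C - 1) * 1000

# Illustrative values (not from the paper): tissue built from C3 plants
# such as wheat and potatoes sits around -20 per mil, while a maize-heavy
# (C4) diet pushes the signature up toward -7 per mil, a gap large
# enough to detect in teeth.
print(delta_13c(0.010957))  # approx. -20, a C3-dominated diet
print(delta_13c(0.011102))  # approx. -7, a maize-dominated diet
```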

The first important finding from Beaumont and Montgomery’s study is that, for many of the 20 people they analyzed, carbon isotope values rose after the start of the Great Famine. By micro-sampling the dentine of teeth at various stages of formation, they show an increase in corn consumption over time that correlates well with historical information about diet.

But their second finding is even more intriguing: Even as carbon isotope values increased, nitrogen isotope values decreased. Archaeologists use nitrogen isotopes to gauge the amount of protein in a diet: a carnivore eating high on the food chain carries a higher nitrogen isotope signature than a vegetarian. Yet the drop in nitrogen values that the researchers found in teeth formed after the introduction of corn does not track with historical records; there is no known change in the protein that the poor were eating at this time.

Beaumont and Montgomery argue that the change in isotopes reflects a cycle of starvation. The high nitrogen values prior to the introduction of corn don't suggest these people had a lot of meat protein to eat. Instead, these isotopes most likely indicate that their bodies, starving, were in a sense eating themselves, by recycling their own protein and fat. When the Kilkenny workers started eating corn, their nitrogen values dropped as their bodies were able to use corn for survival.

The researchers say the “famine pattern” in this historic Irish population is therefore one of average carbon values paired with high nitrogen values, followed by higher carbon and lower nitrogen values when corn is introduced to stave off starvation.
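As a rough illustration of how such a signature could be flagged in a series of sequential dentine micro-samples, here is a hedged Python sketch; the function, thresholds, and data are hypothetical constructions for this article, not the authors’ method or measurements:

```python
def shows_famine_pattern(samples, c_rise=1.5, n_drop=1.0):
    """Flag the carbon-up / nitrogen-down signature in one tooth.

    `samples` is a list of (delta13C, delta15N) pairs ordered from the
    earliest-forming dentine to the latest. The per-mil thresholds are
    illustrative guesses, not values from Beaumont and Montgomery.
    """
    (c_first, n_first), (c_last, n_last) = samples[0], samples[-1]
    return (c_last - c_first) >= c_rise and (n_first - n_last) >= n_drop

# Hypothetical micro-sample series for one individual: delta-13C climbs
# as corn enters the diet, while delta-15N falls as the starving body
# stops recycling its own tissue protein.
tooth = [(-20.1, 12.8), (-19.0, 12.1), (-17.2, 10.9), (-15.8, 10.2)]
print(shows_famine_pattern(tooth))  # True
```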

Beaumont and Montgomery see this pattern in the teeth of children who died in the workhouse during the famine, but also in the teeth of some of the adults. Since teeth form during childhood, this finding suggests that the adults suffered from—and overcame—one or more periods of chronic stress prior to the Great Famine. These stresses might have been caused by famine, but prolonged disease can leave similar isotopic traces, so they can't say for sure the adults experienced multiple periods of starvation.

This research comes at a time when micro-sampling of teeth is becoming a popular technique in archaeology. A recent study by researchers at McMaster University, for example, micro-sampled tooth dentine to look for cases of rickets, a bone disease caused by vitamin D deficiency.

Beaumont has plans to expand this research and to correlate this new methodology with other techniques useful for finding evidence of famine. “I have some teeth from other populations with nutritional deficiencies which I am micro-sampling to try to achieve a resolution that matches the physical signs, such as enamel hypoplasias,” Beaumont tells mental_floss. (An enamel hypoplasia is a defect in tooth enamel.) “I want to work with others in the field to investigate the histology.”

Studies of ancient diets are not useful only to archaeologists; sadly, starvation and famine are not things of the past. Findings like these can also be used by forensic anthropologists investigating recent deaths, especially, as the researchers write, “of populations and individuals for whom nutritional stress may have contributed to their death.” This work may prove critically important for solving forensic cases of fatally malnourished children.

As for the skeletal remains of the 20 people studied—they were all re-interred at the Famine Memorial Garden in Kilkenny.

New Clear Coating for Everyday Objects Repels Practically All Liquids

A new clear coating that is said to repel just about everything—peanut butter included—aims to halt the advance of sticky fingers. Developed by researchers at the University of Michigan, the substance can be applied to a variety of surfaces to keep them smudge- and crud-free, including smartphone and laptop screens, windows, walls, and countertops.

Researchers used algorithms to predict which substances would yield an efficient omniphobic coating: one capable of repelling oils, alcohols, and other liquids while remaining durable and smooth. Made from a mix of fluorinated polyurethane and a fluid-repellent molecule called F-POSS, the coating can be “sprayed, brushed, dipped, or spin-coated onto a wide variety of surfaces, where it binds tightly,” according to the University of Michigan’s website.

The team’s findings were published in the March issue of the journal ACS Applied Materials &amp; Interfaces. Associate professor Anish Tuteja, who headed the University of Michigan research team, says the coating could be a godsend for parents of young tots.

"I have a 2-year-old at home, so for me, this particular project was about more than just the science," Tuteja said in a statement. "We're excited about what this could do to make homes and daycares cleaner places, and we're looking at a variety of possible applications in industry as well."

The team is currently conducting follow-up tests to ensure the coating is nontoxic, but if all checks out, it could find its way into kindergarten classes and daycare centers within the next two years.

Child-proofing everyday objects for the sake of cleanliness isn’t its only potential application, though. The university notes that it could be beneficial to “all industries that depend on the condensation of liquids,” such as refrigeration, power generation, and oil refining.

In recent years, other researchers have set out to create omniphobic coatings, some with success. The undertaking is typically challenging, however, and involves complex synthetic chemistry, according to Chemistry World.

Why You Never See Fresh Olives at the Grocery Store

If given a choice, most grocery shoppers prefer fresh produce over something that's been pumped full of preservatives. Yet shoppers are almost never given that choice when it comes to olives. The small, meaty fruits can be found floating in brines, packed in cans, and stuffed with pimentos, but they're hardly ever shipped to the store straight off the tree. As the video series Reactions explains, there's a good reason for that.

Fresh olives are practically inedible in their natural state because they contain high concentrations of a bitter-tasting compound called oleuropein. To make them palatable, olive producers have to get rid of this compound, either by soaking the olives in water, fermenting them in salt brine, or treating them with sodium hydroxide.

Food manufacturers prefer the sodium hydroxide method because of its speed. Commonly known as lye, sodium hydroxide accelerates the chemical breakdown of oleuropein into compounds with a milder taste. While the other processes can take several weeks to work, the sodium hydroxide treatment takes only one week.

Afterward, the olives are washed to remove the caustic lye, then packed with water and salt to extend their shelf life, giving them their distinct briny flavor.

For more on the chemistry of olives, check out the full video from Reactions below.

[h/t Reactions]
