
A Short History of the Apple


Photograph by Flickr user Jeen Na.

In September, thoughts turn to different seasonal foods. As tomato prices start to climb and the garden peters out, we look forward to a winter of turkey, pumpkin, and sweets. But in between, apples are abundant, ripe, and delicious. The apple (Malus domestica) is a member of the rose family. Believe it or not, there are thousands of apple cultivars. The United States is the second-biggest producer of apples, behind China. Apples originated in central Asia, probably in Kazakhstan, Kyrgyzstan, or western China. Silk Road traders carried them west to Greece and Rome, and they spread to the rest of Europe with the Romans.

Apples have been documented as food for thousands of years. They are often associated with the Garden of Eden, although the fruit from "the tree of the knowledge of good and evil" is never named as any particular fruit we would know. The apple became tied to the story because the written form of the Latin word malum means both "apple" and "evil." That word appeared in a fifth-century Latin translation of the Bible, and the apple has been associated with the Garden ever since. Modern scientists point to increased nutrition as the reason human brains developed to the point of self-awareness and the "knowledge of good and evil," but the prevailing theory is that meat, especially cooked meat, was the key food in human brain development.

Apples are present in mythology and culture from ancient times. Golden apples feature prominently in Greek myths, like the story of Atalanta, who would outrace any suitor until the wily Hippomenes slowed her down with the temptation of golden apples. Aphrodite, Hera, and Athena argued over who deserved a golden apple marked "for the fairest," and the quarrel set off the chain of events that led to the Trojan War. Hera owned the Garden of the Hesperides, where golden apples grew that conferred immortality on those who ate them.

European settlers brought apples, and apple seeds, with them to America. Colonial apple trees were cultivated to produce cider more than for eating the fruit, because apple cider was tastier than water, safer than whiskey, and cheaper than beer. The sour apples of the time were better suited for cider, anyway. The focus on eating apples instead of drinking them is traced to Prohibition, when apple producers were afraid of losing their market and began pushing apples as a delicious and nutritious food.

Johnny Appleseed is a legendary figure in American folklore: the man who walked barefoot through the American frontier, planting apples wherever he went, because he believed in their value and wanted everyone to eat apples. There's truth in the legend, although John Chapman's life was a bit more complicated. Chapman was born in 1774 in Massachusetts. He became an orchardist and nurseryman as an apprentice to a farmer who grew apples. Stricken with a lifelong case of wanderlust, Chapman moved ever westward through the American frontier, preaching the Gospel as a New Church missionary. Meanwhile, he made his living selling young apple trees. He would move deep into the frontier, plant a field of apple seeds, and make his rounds, returning to tend his nurseries every year. When settlers arrived in those areas a few years later, he would sell them apple trees. Chapman did not believe in riding horses, hunting, or eating meat. He lived simply, and made friends with settlers and Indians alike, becoming very popular in his time. Although he never had a permanent home, he was welcome in many homes. Still, he would have had a difficult time selling apple seedlings today. The trees he grew from seed were fairly sour compared to modern eating apples, but it didn't matter because they were mostly made into apple cider. His trees took root and provided quite a variety of apple genes to West Virginia, Pennsylvania, Ohio, Indiana, and Illinois.

Photograph by Flickr user Mary Beth Griffo Rigby.

It has been estimated that during the 19th century, the average American drank 32 gallons of apple cider a year. In the early 20th century, German immigrants made beer popular, taking away some of cider's market. Then in 1919 the Volstead Act outlawed all alcoholic beverages, and many apple orchards went out of business. But some apples were better for eating than for making cider. The Delicious apple was born in 1870 in Jesse Hiatt's orchard in Peru, Iowa, from a chance seedling that refused to die. Hiatt nursed the tree to maturity, and in 1893 he sent samples of its fruit to a fair in Louisiana, Missouri, where Clarence M. Stark, president of Stark Nurseries, dubbed the apple "delicious"; that's how it got its name. Stark bought the propagation rights. The Delicious apple was no good for cider, and too soft and bland for cooking, but it was good to eat raw. With the popularity of the Delicious and other sweet apples, the industry regained its market after Prohibition. Other cultivars were offered for making pies, apple butter, and applesauce.

Photograph by Flickr user Bill Barber.

The apples you see in grocery stores today are clones. Apple trees reproduce readily in the wild, but there is no simple way of controlling that reproduction, and the offspring of any two apple trees may bear fruit that resembles neither parent. So to get a particular kind of fruit, growers graft limbs from an existing tree onto a younger, sturdier trunk, called the rootstock. The fruit will be genetically identical to that of the tree the branch came from. Such grafting allows large orchards to deliver a consistent product, but it also limits the variety of apples available in grocery stores. Fortunately, there are people devoted to discovering trees that produce a wider variety, with the aim of resurrecting and preserving those apples by grafting branches onto younger rootstock. The future of apples may be a return to the heirloom varieties our ancestors knew, plus varieties never eaten before.

Man Buys Two Metric Tons of LEGO Bricks; Sorts Them Via Machine Learning

iStock // Ekaterina Minaeva

Jacques Mattheij made a small, but awesome, mistake. He went on eBay one evening and bid on a bunch of bulk LEGO brick auctions, then went to sleep. Upon waking, he discovered that he was the high bidder on many, and was now the proud owner of two tons of LEGO bricks. (This is about 4400 pounds.) He wrote, "[L]esson 1: if you win almost all bids you are bidding too high."

Mattheij had noticed that bulk, unsorted bricks sell for something like €10/kilogram, whereas sets are roughly €40/kg and rare parts go for up to €100/kg. Much of the value of the bricks is in their sorting. If he could reduce the entropy of these bins of unsorted bricks, he could make a tidy profit. While many people do this work by hand, the problem is enormous—just the kind of challenge for a computer. Mattheij writes:
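The economics above can be sketched in a few lines of arithmetic. The haul size comes from the story; the assumption that a fully sorted haul fetches set-like prices is illustrative, not Mattheij's actual business plan:

```python
# Back-of-the-envelope value of sorting, using the per-kilogram
# figures quoted above. Assumes (hypothetically) that sorted bulk
# resells at roughly the set price of ~40 EUR/kg.
haul_kg = 2000            # "two metric tons" of bricks
unsorted_eur_per_kg = 10  # rough price of unsorted bulk
sorted_eur_per_kg = 40    # rough price of sorted sets

cost = haul_kg * unsorted_eur_per_kg
resale = haul_kg * sorted_eur_per_kg
margin = resale - cost
print(f"cost {cost} EUR, resale {resale} EUR, margin {margin} EUR")
# margin comes out to 60,000 EUR, before labor, shipping, and fees
```

Even at a fraction of that resale price, the spread explains why sorting is worth automating.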

There are 38000+ shapes and there are 100+ possible shades of color (you can roughly tell how old someone is by asking them what lego colors they remember from their youth).

In the following months, Mattheij built a proof-of-concept sorting system using, of course, LEGO. He broke the problem down into a series of sub-problems (including "feeding LEGO reliably from a hopper is surprisingly hard," one of those facts of nature that will stymie even the best system design). After tinkering with the prototype at length, he expanded it into a surprisingly complex contraption of conveyer belts (powered by a home treadmill), various pieces of cabinetry, and "copious quantities of crazy glue."

Here's a video showing the current system running at low speed:

The key part of the system was running the bricks past a camera paired with a computer running a neural net-based image classifier. That allows the computer (when sufficiently trained on brick images) to recognize bricks and thus categorize them by color, shape, or other parameters. Remember that as bricks pass by, they can be in any orientation, can be dirty, can even be stuck to other pieces. So having a flexible software system is key to recognizing—in a fraction of a second—what a given brick is, in order to sort it out. When a match is found, a jet of compressed air pops the piece off the conveyer belt and into a waiting bin.
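To make the classification step concrete, here is a toy stand-in: a one-layer softmax classifier that maps a mean-RGB feature of each frame to a color bin. Mattheij's real system runs a full neural network on raw webcam images; the class names, color centers, and training data below are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical color bins and their mean-RGB centers.
CLASSES = ["red", "yellow", "blue"]
CENTERS = np.array([[0.9, 0.1, 0.1],
                    [0.9, 0.9, 0.1],
                    [0.1, 0.2, 0.9]])

def make_batch(n=300):
    """Synthesize noisy mean-RGB features with known labels."""
    y = rng.integers(0, len(CLASSES), n)
    x = CENTERS[y] + rng.normal(0, 0.05, (n, 3))
    return x, y

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Train the weights with plain gradient descent on cross-entropy.
W = np.zeros((3, len(CLASSES)))
b = np.zeros(len(CLASSES))
x, y = make_batch()
onehot = np.eye(len(CLASSES))[y]
for _ in range(200):
    p = softmax(x @ W + b)
    grad = p - onehot                     # dL/dlogits for cross-entropy
    W -= 0.5 * (x.T @ grad) / len(x)
    b -= 0.5 * grad.mean(axis=0)

def classify(feature):
    """Return the predicted color bin for one mean-RGB feature."""
    return CLASSES[int(np.argmax(feature @ W + b))]

print(classify(np.array([0.85, 0.15, 0.1])))  # likely "red"
```

The real problem is far harder (arbitrary orientation, dirt, stuck pieces), which is exactly why a deep network on full images, rather than a hand-picked feature like mean color, is needed.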

After much experimentation, Mattheij rewrote the software (several times in fact) to accomplish a variety of basic tasks. At its core, the system takes images from a webcam and feeds them to a neural network to do the classification. Of course, the neural net needs to be "trained" by showing it lots of images, and telling it what those images represent. Mattheij's breakthrough was allowing the machine to effectively train itself, with guidance: Running pieces through allows the system to take its own photos, make a guess, and build on that guess. As long as Mattheij corrects the incorrect guesses, he ends up with a decent (and self-reinforcing) corpus of training data. As the machine continues running, it can rack up more training, allowing it to recognize a broad variety of pieces on the fly.
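The self-reinforcing loop described above can be sketched as follows. The classifier and the stream of pieces are stand-ins (in the real system the guess comes from the neural net and the correction from Mattheij himself), but the structure is the point: every pass grows the labeled corpus, and the human only touches the wrong guesses:

```python
def training_loop(pieces, guess, true_label):
    """Accumulate a labeled corpus from machine guesses plus corrections."""
    corpus = []
    corrections = 0
    for piece in pieces:
        label = guess(piece)
        if label != true_label(piece):   # a human spots a bad guess...
            label = true_label(piece)    # ...and corrects it
            corrections += 1
        corpus.append((piece, label))    # every piece becomes training data
    return corpus, corrections

# Toy run: a naive "classifier" that calls everything small,
# against a ground truth that anything over 4 studs long is "long".
pieces = ["2x2", "2x4", "1x6", "1x8"]
naive_guess = lambda p: "small"
truth = lambda p: "small" if int(p.split("x")[1]) <= 4 else "long"
corpus, fixes = training_loop(pieces, naive_guess, truth)
print(corpus, fixes)
```

Retraining on `corpus` and repeating shrinks `fixes` over time, which is the self-reinforcing effect the article describes.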

Here's another video, focusing on how the pieces move on conveyer belts (running at slow speed so puny humans can follow). You can also see the air jets in action:

In an email interview, Mattheij told Mental Floss that the system currently sorts LEGO bricks into more than 50 categories. It can also be run in a color-sorting mode to bin the parts across 12 color groups. (Thus at present you'd likely do a two-pass sort on the bricks: once for shape, then a separate pass for color.) He continues to refine the system, with a focus on making its recognition abilities faster. At some point down the line, he plans to make the software portion open source. You're on your own as far as building conveyer belts, bins, and so forth.
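The two-pass sort can be sketched as a pair of binning passes: one keyed on shape, then each shape bin re-run in color mode. The part records and category names here are invented for illustration:

```python
from collections import defaultdict

def sort_pass(parts, key):
    """One pass of the sorter: bin parts by a single attribute."""
    bins = defaultdict(list)
    for part in parts:
        bins[key(part)].append(part)
    return bins

parts = [
    {"shape": "brick_2x4", "color": "red"},
    {"shape": "brick_2x4", "color": "blue"},
    {"shape": "plate_1x2", "color": "red"},
]

# Pass 1: shape mode. Pass 2: color mode, run on each shape bin.
by_shape = sort_pass(parts, key=lambda p: p["shape"])
fully_sorted = {shape: sort_pass(batch, key=lambda p: p["color"])
                for shape, batch in by_shape.items()}
print(fully_sorted["brick_2x4"]["red"])
```

In the physical machine each "pass" means physically re-feeding a bin through the hopper, which is why speed of recognition matters so much.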

Check out Mattheij's writeup in two parts for more information. It starts with an overview of the story, followed up with a deep dive on the software. He's also tweeting about the project (among other things). And if you look around a bit, you'll find bulk LEGO brick auctions online—it's definitely a thing!

One Bite From This Tick Can Make You Allergic to Meat

We like to believe that there’s no such thing as a bad organism, that every creature must have its place in the world. But ticks are really making that difficult. As if Lyme disease wasn't bad enough, scientists say some ticks carry a pathogen that causes a sudden and dangerous allergy to meat. Yes, meat.

The Lone Star tick (Amblyomma americanum) mostly looks like your average tick, with a tiny head and a big fat behind, except the adult female has a Texas-shaped spot on its back—thus the name.

Unlike other American ticks, the Lone Star feeds on humans at every stage of its life cycle. Even the larvae want our blood. You can’t get Lyme disease from the Lone Star tick, but you can get something even more mysterious: the inability to safely consume a bacon cheeseburger.

"The weird thing about [this reaction] is it can occur within three to 10 or 12 hours, so patients have no idea what prompted their allergic reactions," allergist Ronald Saff, of the Florida State University College of Medicine, told Business Insider.

What prompted them was STARI, or southern tick-associated rash illness. People with STARI may develop a circular rash like the one commonly seen in Lyme disease. They may feel achy, fatigued, and fevered. And their next meal could make them very, very sick.

Saff now sees at least one patient per week with STARI and a sensitivity to galactose-alpha-1,3-galactose—more commonly known as alpha-gal—a sugar molecule found in mammal tissue like pork, beef, and lamb. Several hours after eating, patients’ immune systems overreact to alpha-gal, with symptoms ranging from an itchy rash to throat swelling.

Even worse, the more times a person is bitten, the more likely it becomes that they will develop this dangerous allergy.

The tick’s range currently covers the southern, eastern, and south-central U.S., but even that is changing. "We expect with warming temperatures, the tick is going to slowly make its way northward and westward and cause more problems than they're already causing," Saff said. We've already seen that occur with the deer ticks that cause Lyme disease, and 2017 is projected to be an especially bad year.

There’s so much we don’t understand about alpha-gal sensitivity. Scientists don’t know why it happens, how to treat it, or if it's permanent. All they can do is advise us to be vigilant and follow basic tick-avoidance practices.

[h/t Business Insider]