
9 Reasons This Sign Language Version of “'Twas the Night Before Christmas” is Great


Sheena McFeely is a deaf mom with a YouTube channel* where she and her husband Manny Johnson teach signs with the help of their two adorable daughters, one deaf, one hearing, both native, fluent users of American Sign Language. She recently posted this wonderful video of Shaylee, who is deaf, signing a version of “’Twas the Night Before Christmas.”

You don’t have to know anything about sign language to be blown away by the sheer force of personality coming through in Shaylee’s performance. But with a little knowledge of how ASL works, you can also be amazed by the complexity of her linguistic and storytelling skills. Here are nine great moments from Shaylee’s video.

1. At 0:30, she signs a complex sentence with a topic-comment structure. She introduces a long noun phrase (“a mouse that was running about”) and says something about it (“is now still”). The topic noun phrase is indicated by her eyebrow raise. She lowers her eyebrows appropriately for the comment part. A big sentence for a little girl.

2. Here, she uses a discourse strategy called role shift to great effect. She introduces the stockings as straight narration, with her eye gaze straight ahead, but then, describing how the stockings looked, she shifts her gaze toward the point in space where she has established their location, allowing her to use her facial expression to express a reaction to their beauty. And what an expression!

3. Again, she uses role shift, this time to provide coherence for a series of clauses. She introduces the children in straight narration, and then, with her face, adopts the role of the sleeping children, while maintaining the narration with her signs. The role adoption lasts as long as she produces clauses which have “the children” as their subject. Then she effortlessly shifts back out of the role. Anyone who has tried to learn ASL as a second language can tell you this is not easy to do.

4. This role shift, where her slightly worried expression represents dad’s reaction, also provides coherence. She adopts the dad role with her face as he springs from bed, then shifts to neutral narration to explain that it was because of a noise, then shifts back to dad as the action continues.

5. She makes the sign for “old” in an exaggerated, extra long way. It’s like she’s saying “old” with a slow, creaky, old person’s voice. Great, engaged storytelling.

6. In this performance, she’s not only representing a neutral narrator and a bunch of roles within the story, she’s also herself with her own opinions. Here, for a moment, her own feelings about Santa shine through, without breaking the rhythm of the story.

7. She continues to shift perspectives smoothly from dad to narrator to Santa and back without missing a beat.

8. This is a great illustration of how what she is doing with her role shifting is very different from simple playacting or pantomime. Her head turns to show Santa’s head turning, and she winks to show Santa winking, but at the same time she produces the correct ASL signs for head movement (the flat “base” hand, the fist, the orientation change from palm in to palm out) and the ASL sign for wink (an actual wink is not an ASL sign). She is acting and performing and expressing emotions and moods, but all within a linguistic context—just as you would be doing with your voice and face if you were telling this story (assuming you were any good at it, that is).

9. “Merry Christmas to all and to all a good night.” As she slows down to deliver the last line, she holds your attention in the palms of her capable little hands. Can you hear, and see, Santa’s voice echoing over the quiet, snowy landscape? Was there ever a sweeter end to this poem?

*To turn on the English captions for the videos, click the CC button at the bottom of the YouTube screen.

iStock // Ekaterina Minaeva
Man Buys Two Metric Tons of LEGO Bricks; Sorts Them Via Machine Learning
May 21, 2017

Jacques Mattheij made a small, but awesome, mistake. He went on eBay one evening and bid on a bunch of bulk LEGO brick auctions, then went to sleep. Upon waking, he discovered that he was the high bidder on many, and was now the proud owner of two tons of LEGO bricks. (This is about 4400 pounds.) He wrote, "[L]esson 1: if you win almost all bids you are bidding too high."

Mattheij had noticed that bulk, unsorted bricks sell for something like €10/kilogram, whereas sets are roughly €40/kg and rare parts go for up to €100/kg. Much of the value of the bricks is in their sorting. If he could reduce the entropy of these bins of unsorted bricks, he could make a tidy profit. While many people do this work by hand, the problem is enormous—just the kind of challenge for a computer. Mattheij writes:

There are 38000+ shapes and there are 100+ possible shades of color (you can roughly tell how old someone is by asking them what lego colors they remember from their youth).
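The per-kilogram figures above imply a healthy margin. Here is a rough back-of-envelope, assuming (this is an assumption, not a figure from Mattheij) that he paid about the bulk rate for the lot:

```python
# Back-of-envelope on the sorting profit, using the rough per-kilogram
# figures quoted in the article. The purchase price is assumed to be
# the bulk rate; rare parts (up to EUR 100/kg) would only improve this.

BULK_RATE = 10    # EUR/kg, unsorted bricks
SORTED_RATE = 40  # EUR/kg, roughly what sorted sets fetch
TOTAL_KG = 2000   # two metric tons

cost = TOTAL_KG * BULK_RATE
resale = TOTAL_KG * SORTED_RATE
print(cost, resale, resale - cost)  # 20000 80000 60000
```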

In the following months, Mattheij built a proof-of-concept sorting system using, of course, LEGO. He broke the problem down into a series of sub-problems (including "feeding LEGO reliably from a hopper is surprisingly hard," one of those facts of nature that will stymie even the best system design). After tinkering with the prototype at length, he expanded it into a surprisingly complex assembly of conveyor belts (powered by a home treadmill), various pieces of cabinetry, and "copious quantities of crazy glue."

Here's a video showing the current system running at low speed:

The key part of the system was running the bricks past a camera paired with a computer running a neural net-based image classifier. That allows the computer (when sufficiently trained on brick images) to recognize bricks and thus categorize them by color, shape, or other parameters. Remember that as bricks pass by, they can be in any orientation, can be dirty, can even be stuck to other pieces. So having a flexible software system is key to recognizing—in a fraction of a second—what a given brick is, in order to sort it out. When a match is found, a jet of compressed air pops the piece off the conveyor belt and into a waiting bin.
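The classify-then-eject decision can be sketched roughly as follows. Everything here is hypothetical (the labels, the confidence threshold, the bin mapping); the real system feeds camera frames to a neural net, which this sketch stands in for with a pre-computed prediction:

```python
# A minimal sketch of the classify-and-eject step. The classifier's
# output (label + confidence) is taken as given; in the real system it
# comes from a neural net looking at webcam frames.

EJECT_THRESHOLD = 0.90  # assumed: only fire the air jet on confident matches

def decide_ejection(predicted_label, confidence, target_bins):
    """Return the bin number to eject into, or None to let the brick pass."""
    if confidence < EJECT_THRESHOLD:
        return None  # uncertain pieces stay on the belt for another pass
    return target_bins.get(predicted_label)  # None if no bin wants this shape

bins = {"2x4 brick": 0, "1x2 plate": 1}
print(decide_ejection("2x4 brick", 0.97, bins))  # confident match -> 0
print(decide_ejection("2x4 brick", 0.55, bins))  # low confidence -> None
```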

After much experimentation, Mattheij rewrote the software (several times in fact) to accomplish a variety of basic tasks. At its core, the system takes images from a webcam and feeds them to a neural network to do the classification. Of course, the neural net needs to be "trained" by showing it lots of images, and telling it what those images represent. Mattheij's breakthrough was allowing the machine to effectively train itself, with guidance: Running pieces through allows the system to take its own photos, make a guess, and build on that guess. As long as Mattheij corrects the incorrect guesses, he ends up with a decent (and self-reinforcing) corpus of training data. As the machine continues running, it can rack up more training, allowing it to recognize a broad variety of pieces on the fly.
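The guided self-training loop described above can be illustrated with a toy sketch. The "classifier" here is a deliberate stand-in (a majority vote over labels seen per feature), not the actual neural net, and all names are invented; the point is the human-in-the-loop flow of guess, correct, and store:

```python
# Toy sketch of guided self-training: the machine guesses a label for
# each piece; a human corrects wrong guesses; every (corrected) label is
# appended to the training corpus, which future guesses draw on.

from collections import Counter, defaultdict

class SelfTrainingSorter:
    def __init__(self):
        # corpus: feature -> Counter of labels seen for that feature
        self.corpus = defaultdict(Counter)

    def guess(self, feature):
        """Guess the most common label seen so far for this feature, if any."""
        labels = self.corpus[feature]
        return labels.most_common(1)[0][0] if labels else None

    def observe(self, feature, human_correction=None):
        """Run one piece through: guess, apply any correction, store the result."""
        guessed = self.guess(feature)
        label = human_correction if human_correction is not None else guessed
        if label is not None:
            self.corpus[feature][label] += 1  # self-reinforcing training data
        return label

sorter = SelfTrainingSorter()
sorter.observe("studs:8", human_correction="2x4 brick")  # human teaches first
print(sorter.observe("studs:8"))  # machine now labels it on its own
```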

Here's another video, focusing on how the pieces move on conveyor belts (running at slow speed so puny humans can follow). You can also see the air jets in action:

In an email interview, Mattheij told Mental Floss that the system currently sorts LEGO bricks into more than 50 categories. It can also be run in a color-sorting mode to bin the parts across 12 color groups. (Thus at present you'd likely do a two-pass sort on the bricks: once for shape, then a separate pass for color.) He continues to refine the system, with a focus on making its recognition abilities faster. At some point down the line, he plans to make the software portion open source. You're on your own as far as building conveyor belts, bins, and so forth.
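The two-pass idea is just a grouping applied twice. A minimal sketch, with made-up shape and color labels standing in for the classifier's output:

```python
# Sketch of a two-pass sort: the first belt pass bins by shape, then
# each shape bin is run through again and split by color.

from collections import defaultdict

def sort_pass(pieces, key):
    """One belt pass: group pieces into bins by the given attribute."""
    bins = defaultdict(list)
    for piece in pieces:
        bins[piece[key]].append(piece)
    return dict(bins)

pieces = [
    {"shape": "2x4 brick", "color": "red"},
    {"shape": "2x4 brick", "color": "blue"},
    {"shape": "1x2 plate", "color": "red"},
]

by_shape = sort_pass(pieces, "shape")             # first pass: shape
fully_sorted = {shape: sort_pass(batch, "color")  # second pass: color
                for shape, batch in by_shape.items()}
print(sorted(fully_sorted["2x4 brick"]))  # ['blue', 'red']
```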

Check out Mattheij's writeup in two parts for more information. It starts with an overview of the story, followed up with a deep dive on the software. He's also tweeting about the project (among other things). And if you look around a bit, you'll find bulk LEGO brick auctions online—it's definitely a thing!

Opening Ceremony
These $425 Jeans Can Turn Into Jorts
May 19, 2017

Modular clothing used to consist of something simple, like a reversible jacket. Today, it’s a $425 pair of detachable jeans.

Apparel retailer Opening Ceremony recently debuted a pair of “2 in 1 Y/Project” trousers that look fairly peculiar. The legs are held to the crotch by a pair of loops, creating a disjointed C-3PO effect. Undo the loops and you can now remove the legs entirely, leaving a pair of jean shorts in their wake. The result goes from this:

[Image: Opening Ceremony]

To this:

[Image: Opening Ceremony]

The company also offers a slightly different cut with button tabs in black for $460. If these aren’t audacious enough for you, the Y/Project line includes jumpsuits with removable legs and garter-equipped jeans.

[h/t Mashable]
