
The High-Tech Exosuit That Takes Divers to 1000 Feet

Michael Lombardi in the Exosuit. Photo by Jim Clark/AMNH.

It looks like something you'd wear to visit the Moon or Mars, but the Exosuit—on display at the American Museum of Natural History's Milstein Hall of Ocean Life through March 5—is actually built to explore another place that's largely alien to humans: the ocean. The atmospheric diving system (ADS) can take a diver down to 1000 feet while keeping him at surface pressure. A hybrid of wet diving and submersibles, "it allows the human form to be embedded in an environment," says Michael Lombardi, AMNH's Dive Safety Officer and project coordinator of the Stephen J. Barlow Bluewater Expedition, which will take the suit out this July on its first mission to explore an area 100 miles off the coast of New England known as The Canyons. "People have dived to these depths just to say that they've done it," Lombardi says. "That's very different than doing it for work, which is what we're doing."


At 6.5 feet tall, the hard-metal suit is owned by the J.F. White Contracting Company and was designed and built by Nuytco Research Ltd.; it's currently the only Exosuit in existence. The suit—which can be modified to fit divers from 5'6" to 6'4" tall—is driven by four 1.6-horsepower foot-controlled thrusters and has 18 rotary joints in the arms and legs, which allow for a wide range of movement and give the diver the ability to use special accessories. Though it weighs between 500 and 600 pounds on land, it's nearly neutrally buoyant in the ocean.

On its July expedition to The Canyons (where the continental shelf drops off to depths of more than 10,000 feet), the suit will allow a team of scientists—including ichthyologists, neurologists, and marine biologists—to conduct studies in the mesopelagic (or mid-water) zone, where they can find a number of animals that have only been studied using remotely operated vehicles (ROVs) or after being caught in trawl nets. The mission will take place at night, because that's when animals make a vertical migration from the depths to shallower water. The team is looking to study creatures that exhibit bioluminescence (generating light using a chemical reaction). The discovery of green fluorescent protein in the '60s allowed scientists to reveal the inner workings of cells in a non-invasive way, according to Vincent Pieribone, a professor at the Yale University School of Medicine and chief scientist of the Stephen J. Barlow Bluewater Expedition; identifying new bioluminescent proteins could potentially help in other areas of biomedical research, including cancer cell tagging.

Working in tandem with an ROV, the suit will be equipped with suction tools and a special containment vessel (still in development) that will allow the operator to gently capture fish and invertebrates and place them in front of the ROV's cameras to be photographed in high resolution. The suit is so dexterous that a user can pick up a dime off the floor of a pool—and it has to be, when working in areas where there might be 9000 feet of water below it. "If you drop something," Pieribone says, "that's a long way down." The Exosuit allows a diver to work for 4 to 5 hours on site, and is built to have 50 hours of emergency support.

The back of the Exosuit, which shows the life support system. Photo courtesy AMNH/Michael Lombardi.

The suit itself cost approximately $600,000 to make; add in instrumentation, and the total cost is somewhere around $1.3 million. In development for about 15 years, the Exosuit is a "quantum leap forward" from the Newtsuit of the 1980s (which was also manufactured by Nuytco and is still used today), Lombardi says.

Man Buys Two Metric Tons of LEGO Bricks; Sorts Them Via Machine Learning
May 21, 2017
iStock // Ekaterina Minaeva

Jacques Mattheij made a small, but awesome, mistake. He went on eBay one evening and bid on a bunch of bulk LEGO brick auctions, then went to sleep. Upon waking, he discovered that he was the high bidder on many, and was now the proud owner of two tons of LEGO bricks. (This is about 4400 pounds.) He wrote, "[L]esson 1: if you win almost all bids you are bidding too high."

Mattheij had noticed that bulk, unsorted bricks sell for something like €10/kilogram, whereas sets are roughly €40/kg and rare parts go for up to €100/kg. Much of the value of the bricks is in their sorting. If he could reduce the entropy of these bins of unsorted bricks, he could make a tidy profit. While many people do this work by hand, the problem is enormous—just the kind of challenge for a computer. Mattheij writes:

There are 38000+ shapes and there are 100+ possible shades of color (you can roughly tell how old someone is by asking them what lego colors they remember from their youth).
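The profit math behind all that sorting can be sketched in a few lines of Python, using the per-kilogram figures above and the two-metric-ton haul (the margin estimate is illustrative only; it ignores labor, shipping, and the fact that not every brick ends up in a set):

```python
# Rough value-of-sorting estimate, using the article's figures.
BULK_PRICE = 10    # EUR/kg, unsorted bulk bricks
SORTED_PRICE = 40  # EUR/kg, bricks organized into sets
KG = 2 * 1000      # two metric tons

cost = KG * BULK_PRICE
resale = KG * SORTED_PRICE
print(f"Bought for ~EUR {cost}, worth ~EUR {resale} sorted "
      f"(gross margin ~EUR {resale - cost})")
```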

In the following months, Mattheij built a proof-of-concept sorting system using, of course, LEGO. He broke the problem down into a series of sub-problems (including "feeding LEGO reliably from a hopper is surprisingly hard," one of those facts of nature that will stymie even the best system design). After tinkering with the prototype at length, he expanded it into a surprisingly complex system of conveyer belts (powered by a home treadmill), various pieces of cabinetry, and "copious quantities of crazy glue."

Here's a video showing the current system running at low speed:

The key part of the system was running the bricks past a camera paired with a computer running a neural net-based image classifier. That allows the computer (when sufficiently trained on brick images) to recognize bricks and thus categorize them by color, shape, or other parameters. Remember that as bricks pass by, they can be in any orientation, can be dirty, can even be stuck to other pieces. So having a flexible software system is key to recognizing—in a fraction of a second—what a given brick is, in order to sort it out. When a match is found, a jet of compressed air pops the piece off the conveyer belt and into a waiting bin.
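The camera-to-air-jet decision described above can be sketched as a simple routing function. This is not Mattheij's actual code—`classify` is a stub standing in for the trained neural net, and the category names and bin numbers are made up for the example:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str        # predicted part category
    confidence: float  # how sure the classifier is

def classify(frame) -> Detection:
    """Stand-in for the neural-net classifier; a real system would run
    the camera frame through a trained image-classification model."""
    return Detection(label="brick_2x4", confidence=0.97)  # hypothetical result

# Map of categories to air-jet/bin indices (illustrative only).
BINS = {"brick_2x4": 0, "plate_1x2": 1, "unknown": 2}

def route(frame, threshold=0.9) -> int:
    """Decide which air jet fires for the piece in this frame."""
    det = classify(frame)
    if det.confidence < threshold:
        return BINS["unknown"]  # low confidence: send to the reject bin
    return BINS.get(det.label, BINS["unknown"])

print(route(frame=None))  # → 0 (the 2x4 bin, given the stub above)
```

The confidence threshold matters because, as noted above, pieces arrive dirty, oddly oriented, or stuck together—uncertain guesses are better dumped into a reject bin for a second pass than mis-sorted.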

After much experimentation, Mattheij rewrote the software (several times in fact) to accomplish a variety of basic tasks. At its core, the system takes images from a webcam and feeds them to a neural network to do the classification. Of course, the neural net needs to be "trained" by showing it lots of images, and telling it what those images represent. Mattheij's breakthrough was allowing the machine to effectively train itself, with guidance: Running pieces through allows the system to take its own photos, make a guess, and build on that guess. As long as Mattheij corrects the incorrect guesses, he ends up with a decent (and self-reinforcing) corpus of training data. As the machine continues running, it can rack up more training, allowing it to recognize a broad variety of pieces on the fly.
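That self-reinforcing training loop can be sketched as follows. This is a toy model, not Mattheij's software: `predict` stands in for the neural net's guess, and the "human" correcting the guesses is simulated by a list of true labels:

```python
# Toy sketch of human-in-the-loop training data collection: the machine
# guesses, a human confirms or corrects, and every corrected example
# joins the training corpus.
training_corpus = []  # list of (image, true_label) pairs

def predict(image):
    """Stand-in for the neural net; a real model infers from pixels."""
    return "brick_2x4"

def collect(image, human_check):
    guess = predict(image)
    label = human_check(image, guess)  # human confirms or corrects
    training_corpus.append((image, label))
    return label

# Simulated session: the "human" supplies the true label for each piece.
truths = iter(["brick_2x4", "plate_1x2", "brick_2x4"])
label_fn = lambda img, guess: next(truths)
for img in ["img1", "img2", "img3"]:
    collect(img, label_fn)

print(len(training_corpus))  # → 3
```

The point of the design is that the machine does the photographing and guessing; the human only supplies corrections, so the labeled corpus grows almost for free as the sorter runs.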

Here's another video, focusing on how the pieces move on conveyer belts (running at slow speed so puny humans can follow). You can also see the air jets in action:

In an email interview, Mattheij told Mental Floss that the system currently sorts LEGO bricks into more than 50 categories. It can also be run in a color-sorting mode to bin the parts across 12 color groups. (Thus at present you'd likely do a two-pass sort on the bricks: once for shape, then a separate pass for color.) He continues to refine the system, with a focus on making its recognition abilities faster. At some point down the line, he plans to make the software portion open source. You're on your own as far as building conveyer belts, bins, and so forth.
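The two-pass scheme—shape first, then color within each shape bin—can be illustrated in a few lines (the part and color names here are invented for the example):

```python
from collections import defaultdict

# Hypothetical inventory: (shape, color) per piece.
pieces = [
    ("brick_2x4", "red"), ("brick_2x4", "blue"),
    ("plate_1x2", "red"), ("brick_2x4", "red"),
]

# Pass 1: bin by shape.
by_shape = defaultdict(list)
for shape, color in pieces:
    by_shape[shape].append(color)

# Pass 2: within each shape bin, count by color.
sorted_bins = {shape: defaultdict(int) for shape in by_shape}
for shape, colors in by_shape.items():
    for color in colors:
        sorted_bins[shape][color] += 1

print(dict(sorted_bins["brick_2x4"]))  # → {'red': 2, 'blue': 1}
```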

Check out Mattheij's two-part writeup for more information: it starts with an overview of the story, followed by a deep dive on the software. He's also tweeting about the project (among other things). And if you look around a bit, you'll find bulk LEGO brick auctions online—it's definitely a thing!
