Stampede Blue

How Does the Magic Yellow First-Down Line Work?


If you attend a Super Bowl party on Sunday, you’ll probably hear at least one casual football viewer ask, “How do they get that yellow first-down line on the field?” While “magic” is a fine answer in its own right, the real explanation is a bit more technologically intense. Let’s have a look at the background and mechanics behind every football fan’s shining beacon, the yellow first-down line.

Like the first-down line, football fans? You owe a tip of your cap to an unlikely source: hockey. According to Allen St. John’s 2009 book The Billion Dollar Game, the first-down line actually emerged from the ashes of one of sports broadcasting’s bigger debacles: the FoxTrax system for hockey, which was designed by a company called Sportvision. FoxTrax – which hockey fans no doubt remember as the much-maligned “technopuck” that debuted in 1996 – employed a system of cameras and sensors around a hockey rink to place a little blue halo around the puck.

FoxTrax wasn't a great fit for NHL broadcasts. Hockey purists hated the intrusion into their game, and casual fans didn’t flock to hockey just because the puck was suddenly easier to follow. However, the system inspired producers to think of new ways to insert computerized images into live sports broadcasts. The idea of using a line to mark the first down in football was a natural extension, and Sportvision debuted its 1st and Ten system during ESPN’s broadcast of a Bengals-Ravens tilt on September 27, 1998. A couple of months later, rival company Princeton Video Image unveiled its Yellow Down Line system during a Steelers-Lions broadcast on CBS. (Sportvision is still kicking, and ESPN acquired all of PVI’s intellectual property in December 2010.)

But How Does It Work?

It takes lots of computers, sensors, and smart technicians. Long before the game starts, technicians make a digital 3-D model of the field, including all of the yard lines. While a football field may look flat to the naked eye, it’s actually subtly curved with a crown in the middle to help rainwater flow away. Each field has its own unique contours, so before the season begins, broadcasters need to get a 3-D model of each stadium’s field.

These models of the field help sidestep the rest of the technological challenges inherent to putting a line on the field. On game day, each camera used in the broadcast contains sensors that record its location, tilt, pan, and zoom and transmit this data to the network’s graphics truck in the stadium’s parking lot. These readings allow the computers in the truck to process exactly where each camera is within the 3-D model and the perspective of each camera. (According to How Stuff Works, the computers recalculate the perspective 30 times per second as the camera moves.)
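The geometry those sensor readings feed can be sketched with a toy pinhole-camera model (this is an illustration, not Sportvision's actual code — the function, coordinate conventions, and numbers here are all hypothetical): given the camera's position, its pan and tilt angles, and a focal length standing in for zoom, you can project any 3-D point on the field model into image coordinates.

```python
import math

def project_to_image(point, cam_pos, pan, tilt, focal_px):
    """Project a 3-D field point (x, y, z) into pixel offsets from the
    image center, using a simple pinhole-camera model. `pan` and `tilt`
    are in radians; `focal_px` stands in for the zoom level."""
    # Translate the point into the camera's coordinate frame
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    # Rotate by pan (around the vertical axis)...
    cx = x * math.cos(pan) + y * math.sin(pan)
    cy = -x * math.sin(pan) + y * math.cos(pan)
    cz = z
    # ...then by tilt (around the horizontal axis)
    depth = cy * math.cos(tilt) + cz * math.sin(tilt)
    up = -cy * math.sin(tilt) + cz * math.cos(tilt)
    if depth <= 0:
        return None  # point is behind the camera
    # Perspective divide: farther points land closer to the image center
    u = focal_px * cx / depth
    v = focal_px * up / depth
    return (u, v)
```

Running this for every point along the first-down yard line — and redoing it each time the camera's pan/tilt/zoom readings change — tells the graphics computers where the line should sit in the current frame.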

After they get their hands on all of this information, the folks in the graphics truck know where to put the first-down line, but that’s only part of the task. When you watch games, you’ll notice that the first-down line appears to actually be painted on the field; if a player or official crosses the line, he doesn’t turn yellow. Instead, it looks like the player’s cleat is positioned on top of an actual painted line. The effect looks simple, but it’s tricky to achieve.

To integrate the line onto the field of play, the technicians and their computers put together two separate color palettes before each game. One palette contains the colors – usually greens and browns – that naturally occur on the field’s turf. These colors will automatically be converted into yellow when the line is drawn onto the field. All of the other colors that could show up on the field – things like the players and officials’ uniforms, shoes, and flesh, the ball itself, challenge and penalty flags – go into a separate palette. Colors that appear on this second palette are never converted into yellow when the first-down line is drawn. Thus, if a player’s foot is situated “on” the line, everything around his cleat will turn yellow, but the cleat itself will remain black. According to How Stuff Works, this drawing/colorizing process refreshes sixty times per second.
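A toy version of that palette test might look like the following — assuming a simple per-pixel tolerance match against the "field" palette (the broadcast systems' actual color matching is proprietary and surely more sophisticated; the function name and tolerance value are made up for illustration):

```python
def composite_line(pixel, in_line_mask, field_palette, tolerance=30):
    """Decide the output color for one pixel. If the pixel falls inside
    the drawn line's region AND its color matches the field palette
    (grass greens and dirt browns), recolor it yellow. Anything else —
    cleats, uniforms, the ball — is left untouched."""
    YELLOW = (255, 255, 0)
    if not in_line_mask:
        return pixel  # pixel isn't where the line is being drawn
    for ref in field_palette:
        # Match if every RGB channel is within the tolerance
        if all(abs(c - r) <= tolerance for c, r in zip(pixel, ref)):
            return YELLOW
    return pixel  # second-palette color: player, official, flag, etc.
```

Run over every pixel in the line's region, 60 times per second, this is what makes a cleat "on" the line stay black while the grass around it turns yellow.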

All this technology—and the people needed to run it—wasn’t cheap at first. It could cost broadcasters anywhere from $25,000 to $30,000 per game to put the yellow line on the field. Sportvision had to deploy a truck and a four-man crew with five racks of equipment. The cost has come down since then, and the process is now less labor intensive. One technician using one or two computers can run the system, according to Sportvision, and some games can even be done without anyone actually at the venue.

Now you can explain it to everyone at your Super Bowl party during one of the less-exciting $4 million commercials.

This post originally appeared in 2011.

iStock // Ekaterina Minaeva
Man Buys Two Metric Tons of LEGO Bricks; Sorts Them Via Machine Learning
May 21, 2017

Jacques Mattheij made a small, but awesome, mistake. He went on eBay one evening and bid on a bunch of bulk LEGO brick auctions, then went to sleep. Upon waking, he discovered that he was the high bidder on many, and was now the proud owner of two tons of LEGO bricks. (This is about 4400 pounds.) He wrote, "[L]esson 1: if you win almost all bids you are bidding too high."

Mattheij had noticed that bulk, unsorted bricks sell for something like €10/kilogram, whereas sets are roughly €40/kg and rare parts go for up to €100/kg. Much of the value of the bricks is in their sorting. If he could reduce the entropy of these bins of unsorted bricks, he could make a tidy profit. While many people do this work by hand, the problem is enormous—just the kind of challenge for a computer. Mattheij writes:

There are 38000+ shapes and there are 100+ possible shades of color (you can roughly tell how old someone is by asking them what lego colors they remember from their youth).

In the following months, Mattheij built a proof-of-concept sorting system using, of course, LEGO. He broke the problem down into a series of sub-problems (including "feeding LEGO reliably from a hopper is surprisingly hard," one of those facts of nature that will stymie even the best system design). After tinkering with the prototype at length, he expanded it into a surprisingly complex contraption of conveyer belts (powered by a home treadmill), various pieces of cabinetry, and "copious quantities of crazy glue."

Here's a video showing the current system running at low speed:

The key part of the system was running the bricks past a camera paired with a computer running a neural net-based image classifier. That allows the computer (when sufficiently trained on brick images) to recognize bricks and thus categorize them by color, shape, or other parameters. Remember that as bricks pass by, they can be in any orientation, can be dirty, can even be stuck to other pieces. So having a flexible software system is key to recognizing—in a fraction of a second—what a given brick is, in order to sort it out. When a match is found, a jet of compressed air pops the piece off the conveyer belt and into a waiting bin.
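Mattheij's real classifier is a trained neural network; as a much simpler stand-in, a nearest-centroid color match illustrates the "categorize by color" half of the job (the function and centroid values below are invented for illustration, not taken from his code):

```python
def classify_color(rgb, centroids):
    """Toy stand-in for the neural-net classifier: assign a brick's
    average color to whichever reference centroid is closest in RGB
    space. `centroids` maps a color name to a reference (r, g, b)."""
    def dist2(a, b):
        # Squared Euclidean distance — no need for the square root
        # when we only compare distances
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda name: dist2(rgb, centroids[name]))
```

The neural net earns its keep on the harder half — shape — where orientation, dirt, and stuck-together pieces make a simple distance metric useless.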

After much experimentation, Mattheij rewrote the software (several times in fact) to accomplish a variety of basic tasks. At its core, the system takes images from a webcam and feeds them to a neural network to do the classification. Of course, the neural net needs to be "trained" by showing it lots of images, and telling it what those images represent. Mattheij's breakthrough was allowing the machine to effectively train itself, with guidance: Running pieces through allows the system to take its own photos, make a guess, and build on that guess. As long as Mattheij corrects the incorrect guesses, he ends up with a decent (and self-reinforcing) corpus of training data. As the machine continues running, it can rack up more training, allowing it to recognize a broad variety of pieces on the fly.
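That human-in-the-loop training cycle can be sketched in a few lines (hypothetical function names; Mattheij's actual code differs): the machine guesses, the human only fixes wrong guesses, and every image lands in the corpus with a verified label.

```python
def self_training_step(image, model_predict, human_correct, corpus):
    """One pass of guided self-training. `model_predict` returns the
    machine's guess for an image; `human_correct` returns that guess
    unchanged if it was right, or the correct label if it was wrong.
    Either way, the verified (image, label) pair grows the corpus."""
    guess = model_predict(image)
    label = human_correct(image, guess)
    corpus.append((image, label))
    return label
```

Because correct guesses cost the human nothing, the effort per labeled example shrinks as the model improves — which is exactly the self-reinforcing property described above.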

Here's another video, focusing on how the pieces move on conveyer belts (running at slow speed so puny humans can follow). You can also see the air jets in action:

In an email interview, Mattheij told Mental Floss that the system currently sorts LEGO bricks into more than 50 categories. It can also be run in a color-sorting mode to bin the parts across 12 color groups. (Thus at present you'd likely do a two-pass sort on the bricks: once for shape, then a separate pass for color.) He continues to refine the system, with a focus on making its recognition abilities faster. At some point down the line, he plans to make the software portion open source. You're on your own as far as building conveyer belts, bins, and so forth.

Check out Mattheij's writeup in two parts for more information. It starts with an overview of the story, followed up with a deep dive on the software. He's also tweeting about the project (among other things). And if you look around a bit, you'll find bulk LEGO brick auctions online—it's definitely a thing!

Nick Briggs/Comic Relief
What Happened to Jamie and Aurelia From Love Actually?
May 26, 2017

Fans of the romantic comedy Love Actually recently got a bonus reunion in the form of Red Nose Day Actually, a short charity special that gave audiences a peek at where their favorite characters ended up almost 15 years later.

One of the most improbable pairings from the original film was between Jamie (Colin Firth) and Aurelia (Lúcia Moniz), who fell in love despite almost no shared vocabulary. Jamie is English, and Aurelia is Portuguese, and they know just enough of each other’s native tongues for Jamie to propose and Aurelia to accept.

A decade and a half on, they have both improved their knowledge of each other’s languages—if not perfectly, in Jamie’s case. But apparently, their love is much stronger than his grasp on Portuguese grammar, because they’ve got three bilingual kids and another on the way. (And still enjoy having important romantic moments in the car.)

In 2015, Love Actually script editor Emma Freud revealed via Twitter what happened between Karen and Harry (Emma Thompson and Alan Rickman, who passed away last year). Most of the other couples get happy endings in the short—even if Hugh Grant's character hasn't gotten any better at dancing.

[h/t TV Guide]
