The Forgotten Lens: 50mm

When my fifteenth birthday came around, I wanted just one thing: a real camera. My father gave me his treasured Minolta 201 camera body, a handful of filters, and three lenses. The lenses were a 35mm wide-angle, a 50mm "standard" lens, and a 135mm telephoto. "But you're only going to need the 50mm," Dad said. Why? "Because it mimics the human eye. It makes your photos look natural." Of course, I immediately grabbed the 35mm wide-angle, ignoring him. I liked the wide-angle because I could take photos of people without having to back up, or bother composing the frame (walking around to set up a photo? Total bummer!). I shot a lot of expired Agfachrome (on mega-discount at the local camera shop) with that combination, and even picked up a 28mm wider-angle lens, which suffered from chromatic aberration around the edges.

About a year into my teenage camera adventure, I decided to try out the 50mm that had come so highly recommended. And guess what? Dad was right -- that lens made things look "real" in a way that I hadn't expected. It was far better for taking pictures of people, making their faces look natural. Also, the 50mm happened to be a faster lens, which finally allowed me to explore and begin to understand depth of field -- something I hadn't done with my wide-angle, which I kept locked at f/3.5 (the best it could do). I was shocked to look back at my older photos and see how my wide-angle lens (and the f/3.5 aperture) affected the look of the photos -- it was a distinctive look, but I was no longer sure it was a good one.
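That jump from a slow wide-angle to a faster standard lens is easy to put numbers on. Below is a rough sketch in Python using the standard thin-lens depth-of-field formulas; the f/1.7 maximum aperture for the 50mm and the 0.03 mm circle of confusion for 35mm film are my assumptions, since the post only says the 50mm was "faster."

    # Rough depth-of-field comparison (thin-lens approximation, standard formulas).
    # The 0.03 mm circle of confusion and the f/1.7 aperture are assumptions.
    def depth_of_field(focal_mm, f_number, subject_m, coc_mm=0.03):
        """Return (near_limit_m, far_limit_m) of acceptable sharpness."""
        f = focal_mm / 1000.0   # focal length in metres
        c = coc_mm / 1000.0     # circle of confusion in metres
        s = subject_m
        hyperfocal = f * f / (f_number * c) + f
        near = s * (hyperfocal - f) / (hyperfocal + s - 2 * f)
        far = float("inf") if s >= hyperfocal else s * (hyperfocal - f) / (hyperfocal - s)
        return near, far

    # A 35mm lens at f/3.5 versus a 50mm at f/1.7, both focused on a subject 2 m away:
    print(depth_of_field(35, 3.5, 2.0))   # roughly 1.7 m to 2.4 m in focus
    print(depth_of_field(50, 1.7, 2.0))   # roughly 1.9 m to 2.1 m -- much shallower

Under those assumptions, the zone of sharpness shrinks from about 70 cm to about 15 cm, which is the subject-isolating look the wide-angle locked at f/3.5 couldn't give.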

Photographer Gary Voth has posted a lovely article on the 50mm lens: The Forgotten Lens. Here's a sample:

The 50mm lens is called a "normal" or "standard" lens because the way it renders perspective closely matches that of the human eye. Consequently, images made with a 50mm lens have a natural and uncontrived look. This is the lens that likely would have come with your camera had you bought it 10-15 years ago. Before falling to its current level of disfavor, the 50mm lens had a long and distinguished pedigree. For many years the defining documentary instrument of the 20th century was the small format rangefinder camera (Leica, Contax, Nikon, Canon) with 50mm lens. Some of the world's best-known photographers such as Henri Cartier-Bresson and Ralph Gibson made virtually their entire careers with this combination.

Check out the rest of the article for a primer on camera lenses, and why you might not want a super-zoom or a super-wide-angle lens.

Link via 43 Folders.

AI Algorithm Tells You the Ingredients in Your Meal Based on a Picture

Your food photography habit could soon be good for more than just updating your Instagram. As Gizmodo reports, a new AI algorithm is trained to analyze food photos and match them with a list of ingredients and recipes.

The tool was developed by researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL). To build it, they compiled information from sites like All Recipes and Food.com into a database dubbed Recipe1M, according to their paper. With more than a million annotated recipes at its disposal, a neural network then sifted through each one, learning about which ingredients are associated with which types of images along the way.
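The paper describes learning a joint embedding between recipe text and food images; purely as an illustration of the "which ingredients go with which images" idea, here is a minimal multi-label classifier sketch in PyTorch. The tiny encoder, the NUM_INGREDIENTS vocabulary size, and the random toy batch are invented for this sketch and are not the CSAIL team's architecture.

    # Illustrative only: a minimal "ingredients from an image" model, not Pic2Recipe itself.
    import torch
    import torch.nn as nn

    NUM_INGREDIENTS = 1000          # hypothetical vocabulary of ingredient labels

    class IngredientPredictor(nn.Module):
        def __init__(self, num_ingredients=NUM_INGREDIENTS):
            super().__init__()
            # A small convolutional image encoder (stand-in for a large pretrained CNN).
            self.encoder = nn.Sequential(
                nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            )
            # One output per ingredient: "is this ingredient present in the photo?"
            self.head = nn.Linear(64, num_ingredients)

        def forward(self, images):                  # images: (batch, 3, H, W)
            return self.head(self.encoder(images))  # logits: (batch, num_ingredients)

    model = IngredientPredictor()
    loss_fn = nn.BCEWithLogitsLoss()                # multi-label objective
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    # One toy training step on random tensors standing in for (photo, ingredient-set) pairs.
    images = torch.randn(8, 3, 224, 224)
    labels = torch.randint(0, 2, (8, NUM_INGREDIENTS)).float()
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()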

The result is Pic2Recipe, an algorithm that can deduce key details about a food item just by looking at its picture. Show it a picture of a cookie, for example, and it will tell you it likely contains sugar, butter, eggs, and flour. It will also recommend recipes for something similar pulled from the Recipe1M database.
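Pic2Recipe's actual recommendation step works in the learned joint embedding space; as a simplified stand-in, the sketch below scores a toy recipe database against the predicted ingredient probabilities with cosine similarity and returns the closest match. It assumes the hypothetical IngredientPredictor model from the previous sketch, and the three-recipe "database" is made up.

    # Sketch of the recommendation step: pick the stored recipe whose ingredient
    # vector best matches the ingredients predicted from the photo.
    import torch
    import torch.nn.functional as F

    def recommend_recipe(model, image, recipe_vectors, recipe_names):
        """recipe_vectors: (num_recipes, NUM_INGREDIENTS) multi-hot ingredient vectors."""
        with torch.no_grad():
            probs = torch.sigmoid(model(image.unsqueeze(0)))   # (1, NUM_INGREDIENTS)
        sims = F.cosine_similarity(probs, recipe_vectors)      # (num_recipes,)
        best = int(sims.argmax())
        return recipe_names[best], float(sims[best])

    # Toy usage with the IngredientPredictor defined above:
    recipe_names = ["sugar cookies", "fruit smoothie", "california roll"]
    recipe_vectors = torch.randint(0, 2, (3, NUM_INGREDIENTS)).float()
    photo = torch.randn(3, 224, 224)
    print(recommend_recipe(model, photo, recipe_vectors, recipe_names))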

Pic2Recipe is still a work in progress. While it has had success with simple recipes, more complicated items, such as smoothies and sushi rolls, seem to confuse the system. Overall, it suggests recipes with an accuracy rate of about 65 percent.

Researchers see their creation being used as a recipe search engine or as a tool for situations where nutritional information is lacking. “If you know what ingredients went into a dish but not the amount, you can take a photo, enter the ingredients, and run the model to find a similar recipe with known quantities, and then use that information to approximate your own meal,” lead author Nick Hynes told MIT News.

Before taking the project any further, the team plans to present its work at the Computer Vision and Pattern Recognition Conference in Honolulu later this month.

[h/t Gizmodo]
