
Why Dogs Hate Halloween


Dog blogs always get emailed lots of photos, and the Halloween season naturally brings a barrage of costumed pups. As one of the largest pet blogs on the web, Dogster decided to collect some of the most embarrassing, yet most endearing, photos sent to its inbox this holiday season. From Star Wars characters to pups in banana costumes, the results are priceless.


The French maid costume is pretty funny, but it raises an odd question recently discussed over at Consumerist: why would you want to put your dog in sexy clothes? I know I'm not the only one who finds this a little creepy. Then again, some people just like to unsettle others with weird and creepy costumes.

See Also: Our Readers' Favorite Halloween Costumes

AI Algorithm Tells You the Ingredients in Your Meal Based on a Picture

Your food photography habit could soon be good for more than just updating your Instagram. As Gizmodo reports, a new AI algorithm is trained to analyze food photos and match them with a list of ingredients and recipes.

The tool was developed by researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL). To build it, they compiled information from sites like Allrecipes and Food.com into a database dubbed Recipe1M, according to their paper. With more than a million annotated recipes at its disposal, a neural network then sifted through each one, learning along the way which ingredients are associated with which types of images.

The result is Pic2Recipe, an algorithm that can deduce key details about a food item just by looking at its picture. Show it a picture of a cookie, for example, and it will tell you it likely contains sugar, butter, eggs, and flour. It will also recommend recipes for something similar pulled from the Recipe1M database.
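The matching step behind a system like this can be pictured simply: the network turns a photo into a vector of numbers, and the recipe whose vector lies closest is returned. Here is a toy sketch of that retrieval idea in Python, with made-up three-number "embeddings" standing in for what a real neural network would produce (the vectors, recipe names, and function names are all illustrative, not from the MIT paper):

```python
import math

# Toy embeddings standing in for learned image/recipe vectors.
recipe_db = {
    "chocolate chip cookies": [0.9, 0.1, 0.2],
    "green smoothie": [0.1, 0.8, 0.3],
    "california roll": [0.2, 0.3, 0.9],
}

def cosine(a, b):
    # Cosine similarity: 1.0 means the vectors point the same way.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def closest_recipe(image_vec):
    # Rank every recipe by similarity to the photo's embedding
    # and return the best match.
    return max(recipe_db, key=lambda name: cosine(image_vec, recipe_db[name]))

photo = [0.85, 0.15, 0.25]  # pretend this came from an image model
print(closest_recipe(photo))  # prints "chocolate chip cookies"
```

The hard part, of course, is learning embeddings where photos of cookies actually land near cookie recipes; that's what training on a million annotated recipes buys you.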

Pic2Recipe is still a work in progress. While it has had success with simple recipes, more complicated items, such as smoothies or sushi rolls, seem to confuse the system. Overall, it suggests recipes with an accuracy rate of about 65 percent.

Researchers see their creation being used as a recipe search engine or as a tool for situations where nutritional information is lacking. “If you know what ingredients went into a dish but not the amount, you can take a photo, enter the ingredients, and run the model to find a similar recipe with known quantities, and then use that information to approximate your own meal,” lead author Nick Hynes told MIT News.

Before taking the project any further, the team plans to present its work at the Computer Vision and Pattern Recognition Conference in Honolulu later this month.

[h/t Gizmodo]

UV Photos Show the Areas We Miss When Applying Sunscreen

Sunscreen only works if you're actually wearing it. And it's all too easy to go through the motions of applying sunscreen while still leaving large swaths of skin unprotected. Even if you're applying the recommended shot glass of sunscreen before you head out into the world, parts of your skin may still be exposed to harmful rays. Just check out these UV images taken by researchers at the University of Liverpool, spotted by the UK's Metro.

The black-and-white images were taken with a UV camera so that any part of the skin covered by UV-blocking sunscreen would appear dark. Skin without sunscreen on it, by contrast, remains visible. The 57 volunteers in the study—which was recently presented at the British Association of Dermatologists' Annual Conference—were instructed to apply sunscreen to their face as usual.

A black-and-white UV photo of a woman’s blotchy sunscreen application

Some volunteers were more thorough than others, but as a whole, the group missed a median of 9.5 percent of their faces. As you might notice in the photos, men with beards tended to miss large portions of their faces, and people seemed to have trouble covering the full area around the mouth. The main problems, however, occurred around the eyes. Many people missed their eyelids, and more than three-quarters of the group missed the medial canthal region, the area between the bridge of the nose and the inner corner of the eye.

A UV photo of a man shows white patches of bare skin underneath dark-looking sunscreen.

The finding is significant because the area around the eyes is particularly susceptible to skin cancer. According to the abstract presented at the conference, 5 to 10 percent of skin cancers occur on the eyelids.

Knowing this doesn't necessarily help, though. When the participants were brought back for a second visit and given new instructions that included data on the cancer risk to eyelids, the results barely changed. People applied slightly more sunscreen around their eyelids, missing a median of 7.7 percent of the area instead of 13.5 percent, but almost everyone still missed their medial canthal area.

A woman turns her face to show sunscreen coverage in a UV image.

It's not a surprising finding, considering the fact that no one wants to get sunscreen in their eyes. Sunscreen manufacturers recommend that you keep it out of your eyes, and if it does run, you'll end up in tears. So it's not particularly useful to tell people they should be coating their eyelids in Coppertone.

To keep your face smooth and reduce your likelihood of sun damage, then, the message is clear: get some shades, unless you've got a UV-blocking eyeshadow on hand. Better yet, get yourself a hat, too.

[h/t Metro]

All images by Kareem Hassanin, courtesy Kevin Hamill
