
MIT Undergrads Invent Compact Device That Translates Text to Braille


For years, scientists have been using technology to leap across language barriers. We've seen earpieces that translate spoken conversations and gloves that decode sign language, but when it comes to converting text to braille in real time, there are few options available. A group of undergraduates from MIT is looking to change that with a device small enough to fit in your hand, Smithsonian reports.

Five of the six engineering students (Charlene Xia, Grace Li, Chen Wang, Jessica Shi, and Chandani Doshi—Tania Yu joined the project later) first collaborated on the project at MakeMIT’s hackathon as team 100% Enthusiasm in February of last year. The team won the contest with a braille-translating tool they called Tactile. Using an external webcam, Tactile converted printed text to braille. It displayed the translation one character at a time by poking combinations of pins through its plastic surface.

The team has come a long way since creating the initial prototype, with the latest version of Tactile featuring a built-in camera. Users place the compact box directly over the text they wish to translate and press a button to snap a picture. From there, Microsoft's Computer Vision API reads the printed words, and the device conveys the message in braille in six-character chunks. The entire process, from taking the picture to raising the pins, takes roughly the same amount of time as flipping a page.
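The team hasn't published Tactile's firmware, but the core translation step is easy to picture. Below is a minimal sketch in Python of how OCR output might be mapped to six-dot braille cells and fed to a six-character display. The letter-to-dot table follows standard Grade 1 braille; the function names and chunking logic are our illustration, not the team's actual code.

```python
# Illustrative sketch only: maps OCR'd text to six-dot braille cells and
# serves them in six-character chunks, mirroring how Tactile's display works.
# Dots use the standard braille numbering: 1-3 down the left column,
# 4-6 down the right.

LETTER_DOTS = {
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
    "f": {1, 2, 4}, "g": {1, 2, 4, 5}, "h": {1, 2, 5}, "i": {2, 4},
    "j": {2, 4, 5}, "k": {1, 3}, "l": {1, 2, 3}, "m": {1, 3, 4},
    "n": {1, 3, 4, 5}, "o": {1, 3, 5}, "p": {1, 2, 3, 4},
    "q": {1, 2, 3, 4, 5}, "r": {1, 2, 3, 5}, "s": {2, 3, 4},
    "t": {2, 3, 4, 5}, "u": {1, 3, 6}, "v": {1, 2, 3, 6},
    "w": {2, 4, 5, 6}, "x": {1, 3, 4, 6}, "y": {1, 3, 4, 5, 6},
    "z": {1, 3, 5, 6}, " ": set(),
}

CELLS_PER_CHUNK = 6  # Tactile raises six characters at a time

def to_cells(text):
    """Map text to braille cells, each a set of raised dot numbers."""
    return [LETTER_DOTS.get(ch, set()) for ch in text.lower()]

def chunks(cells, size=CELLS_PER_CHUNK):
    """Yield the display-sized chunks the device would show in sequence."""
    for i in range(0, len(cells), size):
        yield cells[i:i + size]

def render(cell):
    """Render one cell as a Unicode braille character for inspection."""
    mask = sum(1 << (dot - 1) for dot in cell)  # Unicode: dots 1-6 -> bits 0-5
    return chr(0x2800 + mask)

text = "hello world"  # stand-in for the Computer Vision API's OCR output
for chunk in chunks(to_cells(text)):
    print("".join(render(c) for c in chunk))
```

Running the sketch prints ⠓⠑⠇⠇⠕⠀ and then ⠺⠕⠗⠇⠙: the two six-cell "screens" a reader would feel in turn.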

Rendering shows the students' vision for Tactile. (Image: Brian Smale, Microsoft)

Tactile recently earned the women the Lemelson-MIT Student Prize and the $10,000 award that comes with it. They plan to use those funds to refine the product and get it commercial-ready within two years. When it hits shelves, the team hopes to sell the device for less than $200—a fraction of the cost of most high-tech braille translators currently on the market. They’ll also be working on ways to make Tactile smaller (right now it’s about the size of three smartphones sandwiched together) and more user-friendly (ideally it will scan an entire page rather than a few lines at a time, and display 18 characters instead of six).

Microsoft is one of the team’s biggest supporters. They’ve been accepted into Microsoft’s #MakeWhatsNext program, an initiative that offers legal assistance to women inventors seeking patents. “There cannot be enough investment in technology that will enable, empower and allow people with disabilities to go and do amazing things,” Jenny Lay-Flurrie, Microsoft’s chief accessibility officer, is quoted as saying on the program's webpage. “I can’t wait to see where this one goes—and I think the patent is a great next step.”

[h/t Smithsonian]

Ruined a Photo By Blinking? Facebook Can Fix It With AI

Next time you blink in an otherwise flawless photo, don't be so quick to hit the "delete" button on your phone. As The Verge reports, Facebook is testing a new feature that uses artificial intelligence to make closed eyes look naturally open.

Facebook engineers Brian Dolhansky and Cristian Canton Ferrer described the technology behind the AI in a paper published June 18. They used a type of machine learning model called a generative adversarial network, or GAN, which pits two neural networks against each other: one generates new imagery where there wasn't any before, while the other judges the results against a database of real pictures.
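Facebook's eye-opening model (an "exemplar GAN") is far more elaborate, but the adversarial training idea fits in a few lines. The sketch below, in Python with PyTorch, uses a toy 1-D "dataset" in place of photos; the network shapes and hyperparameters are illustrative choices, not the paper's.

```python
import torch
from torch import nn

# Toy GAN: a generator learns to produce samples a discriminator can't
# tell apart from "real" data (here, draws from N(3, 1) stand in for photos).

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))  # noise -> fake
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))  # sample -> logit

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()
ones, zeros = torch.ones(64, 1), torch.zeros(64, 1)

for step in range(2000):
    # Discriminator step: label real samples 1, generated samples 0.
    real = torch.randn(64, 1) + 3.0                # "real" data: N(3, 1)
    fake = G(torch.randn(64, 8)).detach()          # detach: don't update G here
    d_loss = bce(D(real), ones) + bce(D(fake), zeros)
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: try to fool the discriminator into saying "real".
    fake = G(torch.randn(64, 8))
    g_loss = bce(D(fake), ones)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print(G(torch.randn(1000, 8)).mean().item())  # drifts toward 3.0 as G improves
```

The generator never sees real data directly; it only improves because the discriminator keeps getting better at catching its fakes, and that back-and-forth is what lets GANs synthesize plausible imagery like a new pair of eyes.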

This type of AI has been used to design clothing and video game levels in the past. To get it to work with faces, Facebook engineers showed the system photos taken of people when their eyes were open. After "learning" the subject's eye shape, size, and color, the AI used that data to superimpose a new set of eyes over the blinking lids. The feature still has some trouble working with glasses, long bangs, and pictures taken at an angle, but when it does what it's supposed to, it's hard to tell the photo was ever retouched.

Faces with blinking and open eyes. (Image: Facebook)

Facebook isn't the first company to use AI to salvage photographs with closed eyes. In 2017, Adobe added an "Open Closed Eyes" feature to Photoshop Elements that also uses AI to generate a pair of eyes that match those of the blinking subject. For it to work, users first have to show the system several photos of the subject with their eyes open.

Facebook, which already holds a database of pictures of many of its users, seems like a perfect fit for this type of technology. The social media site is still testing it out, but based on the success of early experiments, they may consider making it available to users in the not-too-distant future. And because Facebook owns Instagram, it's possible that the eye-opening feature will eventually be applied to Instagram posts and Stories as well.

[h/t The Verge]

Apple Wants to Make It Easier for 911 Dispatchers to Figure Out Where You Are In an Emergency

A few weeks ago, I dialed 911 from a sidewalk in my neighborhood to alert the police of a lost child who had asked me for help. "What's your location?" the dispatcher asked. I had no idea; it was a small side street whose name I had never bothered to learn. I had to run to the end of the block and stare up at the street sign, and when the dispatcher wasn't familiar with the name, either, I had to spell it out, letter-by-letter.

Soon, it may not be quite so difficult to alert emergency services of your location. The Wall Street Journal reports that a forthcoming update to Apple's iOS will automatically send out your phone's location to emergency call centers when you're on the phone with 911.

The update is part of a partnership with RapidSOS, a technology company founded to make it easier for first responders to reach people in an emergency. It aims to make it as simple to find a 911 caller using a cell phone as it is to find one using a landline.

Landline systems can deliver your exact address to emergency services, but cell phone carriers currently convey only your approximate location, even less precisely than apps like Google Maps or Uber do. The estimate can be off by as much as a few hundred yards, which can make a substantial difference if you're waiting for life-saving care. The FCC has ruled that by 2021, all cell phone carriers must be able to locate emergency callers to within 165 feet, 80 percent of the time, but that deadline is still years away.

The new feature would come with iOS 12, which is expected to be released later this year. The location data your iPhone sends automatically would be separate from the data your cell phone carrier sends. It comes from Apple's HELO (Hybridized Emergency Location) system, which estimates your position from cell towers, GPS, and nearby Wi-Fi access points, and it travels to emergency call centers over RapidSOS's technology. RapidSOS isn't used by all 911 call centers in the U.S., but the company reports that the majority will be using it by the end of the year.
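Apple hasn't published how HELO weighs its sources, so the following is a generic illustration rather than Apple's algorithm: one common way to "hybridize" several rough position fixes is an inverse-variance weighted average, where more accurate sources pull the estimate harder. The fuse helper and sample coordinates below are hypothetical.

```python
import math

def fuse(fixes):
    """Combine (lat, lon, accuracy_radius_m) fixes, weighting by 1/accuracy^2.

    Averaging raw lat/lon is fine for nearby points; a real system would
    work in a projected coordinate frame.
    """
    weights = [1.0 / (acc ** 2) for _, _, acc in fixes]
    total = sum(weights)
    lat = sum(w * f[0] for w, f in zip(weights, fixes)) / total
    lon = sum(w * f[1] for w, f in zip(weights, fixes)) / total
    # Fused uncertainty shrinks as independent sources agree.
    return lat, lon, math.sqrt(1.0 / total)

fixes = [
    (40.7580, -73.9855, 300.0),  # cell tower: coarse, ~300 m radius
    (40.7589, -73.9851, 15.0),   # GPS: good outdoors
    (40.7588, -73.9849, 25.0),   # Wi-Fi access point match
]
print(fuse(fixes))
```

With these sample fixes, the fused point lands essentially on top of the GPS and Wi-Fi estimates, the coarse tower fix contributes almost nothing, and the combined uncertainty (about 13 meters) is tighter than any single source.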

In a press release, Apple promises that user data will only be available for emergency use, and that the responding 911 call center will only have access to your location data for the duration of your call.

I wasn't in a hurry when I called 911, and I had the time and the ability to jog down the street and find a sign to figure out where I was. In most emergency situations, the few extra seconds or minutes it could take to pinpoint your own location might be a matter of life and death. As more Americans give up their landlines and go wireless-only, better emergency services location tech will be vital.

[h/t MarketWatch]
