How To (Re)Design an ATM

In the fall of 2005, Wells Fargo hired a design firm to redesign the user interface on their ATMs. The old design was deemed clunky, partly because it had to deal with two kinds of machines: those with touchscreens, and those with buttons along the left and right sides of the screen. The new design would work only on the touchscreen models, introducing a new flexibility in layout and interaction design.

Designer Holger Struppek recounts his experience on the redesign project in his excellent article, "That design is money! A better ATM experience from Wells Fargo." The article is a series of anecdotes about designing a better ATM experience, which astounds me for two reasons: first, that an ATM experience can be good at all; and second, that the ATM screenshots shown in the piece actually exist in the wild (it's like banking in the not-too-distant future). I'm not a Wells Fargo customer, but this ATM design is tempting... my bank is still stuck in the '80s, apparently. Here's a sample from Struppek's article:

A great feature of the Wells Fargo ATM UI has always been the Quick Cash button. It allows you to quickly withdraw an often-used amount from your checking account with the press of one button. There is no need to go through the steps of selecting an account, selecting an amount, and confirming the transaction. However, few people knew that this feature could be customized with a different amount and account. The functionality was always there, but it required pressing the My ATM Preferences button, followed by a tedious multi-step procedure to change the settings.

We thought that the new UI could be better than that. Instead of just offering generic choices and complicated customization procedures, the ATM should learn by itself what individual customers do most often, and then make those things easier to accomplish.

The new UI still offers the Quick Cash feature, but in a much smarter way. Instead of one Quick Cash button, we introduced a whole column of shortcut buttons that behave somewhat like the History menu in a web browser. It is still possible to customize them through Set My ATM Preferences, but hardly necessary since they always reflect the most recent transactions.
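
Struppek's browser-history analogy maps neatly onto a small data structure. Below is a minimal sketch in Python of the recency idea the quote describes (not Wells Fargo's actual implementation): keep a short, deduplicated list of the most recent transactions and use it to populate the shortcut buttons.

```python
# A sketch of recency-driven shortcut buttons, like a browser's History menu.
# Transactions are illustrative tuples, e.g. ("withdraw", "checking", 60).
from collections import OrderedDict

class QuickShortcuts:
    def __init__(self, max_buttons=4):
        self.max_buttons = max_buttons
        self._recent = OrderedDict()  # transaction -> None; most recent last

    def record(self, transaction):
        """Call after every completed transaction."""
        self._recent.pop(transaction, None)   # dedupe: refresh its position
        self._recent[transaction] = None
        while len(self._recent) > self.max_buttons:
            self._recent.popitem(last=False)  # drop the oldest entry

    def buttons(self):
        """Shortcut buttons to display, most recent first."""
        return list(reversed(self._recent))

atm = QuickShortcuts()
for t in [("withdraw", "checking", 60), ("deposit", "savings", 100),
          ("withdraw", "checking", 60), ("withdraw", "checking", 100)]:
    atm.record(t)
print(atm.buttons())
# [('withdraw', 'checking', 100), ('withdraw', 'checking', 60), ('deposit', 'savings', 100)]
```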

Read the rest for an interesting story on user interface design.

Ruined a Photo By Blinking? Facebook Can Fix It With AI

Next time you blink in an otherwise flawless photo, don't be so quick to hit the "delete" button on your phone. As The Verge reports, Facebook is testing a new feature that uses artificial intelligence to make closed eyes look naturally open.

Facebook engineers Brian Dolhansky and Cristian Canton Ferrer described the technology behind the AI in a paper published June 18. They used a type of machine learning model called a generative adversarial network, or GAN. A GAN pits two neural networks against each other: one generates new imagery, and the other judges how realistic that imagery looks; trained this way on a database of pictures, the generator learns to produce convincing imagery where there wasn't any before.
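
To make the adversarial setup concrete, here is a toy GAN training loop in PyTorch. This is not Facebook's model: the "images" are just 32-dimensional random vectors, and the point is only to show the generator and discriminator training against each other.

```python
# Minimal GAN sketch: a generator learns to mimic "real" samples while a
# discriminator learns to tell real from generated. Toy data, not faces.
import torch
import torch.nn as nn

latent_dim, data_dim = 8, 32

# Generator: maps random noise to a fake sample.
G = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, data_dim))
# Discriminator: outputs a logit scoring how "real" a sample looks.
D = nn.Sequential(nn.Linear(data_dim, 64), nn.ReLU(), nn.Linear(64, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def real_batch(n=16):
    # Stand-in for a database of real pictures: a fixed Gaussian distribution.
    return torch.randn(n, data_dim) * 0.5 + 1.0

for step in range(200):
    # Train the discriminator: real -> 1, generated -> 0.
    real = real_batch()
    fake = G(torch.randn(real.size(0), latent_dim)).detach()
    loss_d = (bce(D(real), torch.ones(real.size(0), 1))
              + bce(D(fake), torch.zeros(fake.size(0), 1)))
    opt_d.zero_grad()
    loss_d.backward()
    opt_d.step()

    # Train the generator: try to fool the discriminator into outputting 1.
    fake = G(torch.randn(16, latent_dim))
    loss_g = bce(D(fake), torch.ones(16, 1))
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()
```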

This type of AI has been used to design clothing and video game levels in the past. To get it to work with faces, Facebook engineers showed the system photos of people taken when their eyes were open. After "learning" the subject's eye shape, size, and color, the AI used that data to superimpose a new set of eyes over the blinking lids. The feature still has trouble with glasses, long bangs, and pictures taken at an angle, but when it works as intended, it's hard to tell the photo was ever retouched.
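
As a rough picture of the data flow (open-eyed reference photo in, repaired photo out), here is a naive cut-and-paste baseline in NumPy. The real system learns this mapping with a GAN rather than copying pixels; this sketch only blends the eye regions from a reference photo into the blinking one, with hypothetical image sizes and box coordinates.

```python
# Naive baseline: feather-blend the eye regions of an open-eyed reference
# photo over the same regions of the blinking photo. Illustrative only.
import numpy as np

def paste_eyes(blinking, reference, eye_boxes, feather=0.7):
    """Blend eye regions from `reference` into `blinking`.

    eye_boxes: (top, left, height, width) boxes, assumed aligned across the
    two photos (a real pipeline would detect and align the faces first).
    """
    out = blinking.astype(float).copy()
    for (t, l, h, w) in eye_boxes:
        patch = reference[t:t + h, l:l + w].astype(float)
        # Feathered alpha mask: full strength in the middle, fading at edges.
        ys = np.minimum(np.arange(h), np.arange(h)[::-1]) / max(h // 2, 1)
        xs = np.minimum(np.arange(w), np.arange(w)[::-1]) / max(w // 2, 1)
        alpha = np.clip(np.outer(ys, xs) / feather, 0, 1)[..., None]
        out[t:t + h, l:l + w] = alpha * patch + (1 - alpha) * out[t:t + h, l:l + w]
    return out.astype(blinking.dtype)

# Hypothetical 64x64 RGB photos and eye-box coordinates.
blinking = np.zeros((64, 64, 3), dtype=np.uint8)
reference = np.full((64, 64, 3), 200, dtype=np.uint8)
fixed = paste_eyes(blinking, reference, [(24, 14, 10, 14), (24, 36, 10, 14)])
```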

[Image: Faces with blinking and open eyes. Credit: Facebook]

Facebook isn't the first company to use AI to salvage photographs with closed eyes. In 2017, Adobe added an "Open Closed Eyes" feature to Photoshop Elements that also uses AI to generate a pair of eyes that match those of the blinking subject. For it to work, users first have to show the system several photos of the subject with their eyes open.

Facebook, which already holds a database of pictures of many of its users, seems like a perfect fit for this type of technology. The social media site is still testing the feature, but based on the success of early experiments, it may make the tool available to users in the not-too-distant future. And because Facebook owns Instagram, it's possible that the eye-opening feature will eventually be applied to Instagram posts and Stories as well.

[h/t The Verge]

Apple Wants to Make It Easier for 911 Dispatchers to Figure Out Where You Are In an Emergency

A few weeks ago, I dialed 911 from a sidewalk in my neighborhood to alert the police of a lost child who had asked me for help. "What's your location?" the dispatcher asked. I had no idea; it was a small side street whose name I had never bothered to learn. I had to run to the end of the block and stare up at the street sign, and when the dispatcher wasn't familiar with the name either, I had to spell it out, letter by letter.

Soon, it may not be quite so difficult to alert emergency services of your location. The Wall Street Journal reports that a forthcoming update to Apple's iOS will automatically send out your phone's location to emergency call centers when you're on the phone with 911.

The update is part of a partnership with RapidSOS, a technology company founded to make it easier for first responders to reach people in an emergency. It aims to make it as simple to find a 911 caller using a cell phone as it is to find one using a landline.

Landline systems can deliver your exact address to emergency services, but cell phone carriers currently convey only your approximate location, with even less accuracy than apps like Google Maps or Uber manage. The estimate can be off by as much as a few hundred yards, which can make a substantial difference if you're waiting for life-saving care. The FCC has ruled that by 2021, all cell phone carriers must be able to locate emergency callers within 165 feet, 80 percent of the time, but that's years away.
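
To see what that benchmark means in practice, here is a small Python sketch (illustrative, not an FCC tool) that checks whether a batch of calls meets the 165-feet, 80-percent-of-the-time rule, using the haversine formula to measure how far each estimate landed from the caller's true position.

```python
# Check a set of 911 calls against the FCC accuracy benchmark:
# at least 80 percent of location estimates within 165 feet.
import math

def distance_ft(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two points, in feet."""
    earth_radius_ft = 20_902_000  # ~6,371 km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * earth_radius_ft * math.asin(math.sqrt(a))

def meets_fcc_benchmark(calls, threshold_ft=165, required_rate=0.80):
    """calls: list of ((true_lat, true_lon), (est_lat, est_lon)) pairs."""
    hits = sum(distance_ft(*true, *est) <= threshold_ft for true, est in calls)
    return hits / len(calls) >= required_rate

# Two hypothetical calls: one estimate ~110 ft off, one ~580 ft off.
calls = [((40.7128, -74.0060), (40.7130, -74.0057)),
         ((40.7128, -74.0060), (40.7144, -74.0060))]
print(meets_fcc_benchmark(calls))  # only 1 of 2 within 165 ft -> False
```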

The new update would come with iOS 12, which is expected to be released later this year. The location data your iPhone automatically sends would be different from the data your cell phone carrier sends. It will come from Apple's HELO (Hybridized Emergency Location), a system that estimates location based on cell towers, GPS, and Wi-Fi access, and it will be passed to emergency call systems using RapidSOS's technology. RapidSOS isn't used by all 911 call centers in the U.S., but the company reports that it will be used by the majority by the end of the year.
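
Apple hasn't published HELO's internals, but inverse-variance weighting is one standard way to fuse several noisy position fixes into a single, tighter estimate, so the sketch below uses it purely to illustrate the "hybridized" idea. Every coordinate and accuracy figure in it is made up.

```python
# Fuse GPS, Wi-Fi, and cell-tower position fixes by weighting each estimate
# by 1/sigma^2, so more accurate sources dominate. Illustrative only.

def fuse(estimates):
    """estimates: (lat, lon, sigma_m) tuples; smaller sigma = more trusted.

    Plain averaging of degrees is fine here because the fixes are close
    together; a production system would work in a local metric frame.
    """
    weights = [1 / s ** 2 for _, _, s in estimates]
    total = sum(weights)
    lat = sum(w * la for w, (la, _, _) in zip(weights, estimates)) / total
    lon = sum(w * lo for w, (_, lo, _) in zip(weights, estimates)) / total
    sigma = (1 / total) ** 0.5  # combined uncertainty of the fused fix
    return lat, lon, sigma

# Hypothetical fixes: GPS (tight), Wi-Fi (looser), cell tower (very coarse).
estimates = [(40.71280, -74.00600, 8.0),
             (40.71275, -74.00610, 30.0),
             (40.71330, -74.00520, 400.0)]
print(fuse(estimates))  # dominated by the GPS fix, with sigma under 8 m
```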

In a press release, Apple promises that user data will only be available for emergency use, and that the responding 911 call center will only have access to your location data for the duration of your call.

I wasn't in a hurry when I called 911, and I had the time and the ability to jog down the street and find a sign to figure out where I was. In most emergency situations, the few extra seconds or minutes it could take to pinpoint your own location might be a matter of life and death. As more Americans give up their landlines and go wireless-only, better emergency services location tech will be vital.

[h/t MarketWatch]
