
How Louisville Used GPS to Improve Residents' Asthma

iStock

Louisville, Kentucky, has some of the worst air pollution in the U.S., which is particularly bad news for the roughly 85,000 people in surrounding Jefferson County (about 11 percent of the population) who have been diagnosed with asthma.

The air quality situation in Louisville won’t be changing anytime soon, but as CityLab reports, a new study with sensor-equipped inhalers shows that technology can help people with asthma cope. The two-year AIR Louisville project involved the Louisville government, the Institute for Healthy Air, Water and Soil, and a respiratory health startup called Propeller, which makes inhaler sensors that track location and measure air pollutants, humidity, and temperature.

Propeller's inhaler-mounted sensors allowed the researchers to monitor the relationship between asthma attacks and environmental factors, providing new insight into how air quality can change from neighborhood to neighborhood. The sensors—already used by doctors, but never before deployed citywide—can measure levels of nitrogen oxides, sulfur, ozone, particulate matter, and pollen, all of which can affect the risk of an asthma attack. They also send Propeller data on when, where, and how many "puffs" patients take, tracking how often people resort to emergency medication.

Propeller also sent app notifications warning the Louisville participants of a greater risk of an asthma attack on bad air quality days, and showed them where and when the most asthma attacks were happening around the city.

An inhaler with a sensor on top of it lies next to a smartphone open to the Propeller app.
Propeller

The program illuminated just how much more asthma-triggering pollution the city’s west side (predominantly home to poor, African-American residents) faces compared to other neighborhoods. The data also showed that ozone provoked an uptick in asthma attacks throughout the city, particularly along highways. The study may end up influencing air quality regulations, since the researchers found that air pollutants became problematic for asthma sufferers even at levels below the legal limits.

The program had huge short-term benefits, too, beyond collecting research for city policies. By the time it ended in late June, the study had made a clear difference for the nearly 1200 people with asthma and chronic obstructive pulmonary disease (COPD) who took part. Participants' weekly average use of rescue inhalers dropped 82 percent by the 12-month follow-up, and they had twice as many symptom-free days. The majority of participants said they understood their asthma "very well" or "well," could better control it, and felt confident about avoiding a bad asthma attack.

Now that the program is over, the institutions involved are still working to launch new policies based on the results, like creating citywide asthma alerts and planting more trees.

[h/t CityLab]

technology
Ruined a Photo By Blinking? Facebook Can Fix It With AI
iStock

Next time you blink in an otherwise flawless photo, don't be so quick to hit the "delete" button on your phone. As The Verge reports, Facebook is testing a new feature that uses artificial intelligence to make closed eyes look naturally open.

Facebook engineers Brian Dolhansky and Cristian Canton Ferrer described the technology behind the feature in a paper published June 18. They used a type of machine learning called a generative adversarial network, or GAN, which works by studying a database of pictures and using that information to generate new imagery where there wasn't any before.

This type of AI has been used to design clothing and video game levels in the past. To get it to work with faces, Facebook engineers showed the system photos of people with their eyes open. After "learning" a subject's eye shape, size, and color, the AI used that data to superimpose a new set of eyes over the blinking lids. The feature still has trouble with glasses, long bangs, and pictures taken at an angle, but when it works as intended, it's hard to tell the photo was ever retouched.

Faces with blinking and open eyes.
Facebook

Facebook isn't the first company to use AI to salvage photographs with closed eyes. In 2017, Adobe added an "Open Closed Eyes" feature to Photoshop Elements that also uses AI to generate a pair of eyes that match those of the blinking subject. For it to work, users first have to show the system several photos of the subject with their eyes open.

Facebook, which already holds a database of pictures of many of its users, seems like a perfect fit for this type of technology. The social media site is still testing the feature, but based on the success of early experiments, it may make it available to users in the not-too-distant future. And because Facebook owns Instagram, the eye-opening feature could eventually be applied to Instagram posts and Stories as well.

[h/t The Verge]

Live Smarter
Apple Wants to Make It Easier for 911 Dispatchers to Figure Out Where You Are In an Emergency
iStock

A few weeks ago, I dialed 911 from a sidewalk in my neighborhood to alert the police to a lost child who had asked me for help. "What's your location?" the dispatcher asked. I had no idea; it was a small side street whose name I had never bothered to learn. I had to run to the end of the block and stare up at the street sign, and when the dispatcher wasn't familiar with the name either, I had to spell it out, letter by letter.

Soon, it may not be quite so difficult to alert emergency services of your location. The Wall Street Journal reports that a forthcoming update to Apple's iOS will automatically send out your phone's location to emergency call centers when you're on the phone with 911.

The update is part of a partnership with RapidSOS, a technology company founded to make it easier for first responders to reach people in an emergency. The goal is to make locating a 911 caller on a cell phone as simple as locating one on a landline.

Landline systems can deliver your exact address to emergency services, but cell phone carriers currently convey only your approximate location, with less accuracy than apps like Google Maps or Uber manage. The estimate might be off by as much as a few hundred yards, which can make a substantial difference if you're waiting for life-saving care. The FCC has ruled that by 2021, all cell phone carriers must be able to locate emergency callers within 165 feet, 80 percent of the time, but that's still years away.

The new feature would come with iOS 12, which is expected to be released later this year. The data automatically sent by your phone would be different from the data your cell phone carrier sends. It will use Apple's HELO (Hybridized Emergency Location), a system that estimates location based on cell towers, GPS, and Wi-Fi access points, sending that information to emergency call systems via RapidSOS's technology. RapidSOS isn't used by all 911 call centers in the U.S., but the company says it will be used by the majority by the end of the year.

In a press release, Apple promises that user data will only be available for emergency use, and that the responding 911 call center will only have access to your location data for the duration of your call.

I wasn't in a hurry when I called 911, and I had the time and the ability to jog down the street and find a sign to figure out where I was. In most emergency situations, the few extra seconds or minutes it could take to pinpoint your own location might be a matter of life and death. As more Americans give up their landlines and go wireless-only, better emergency services location tech will be vital.

[h/t MarketWatch]
