A 3300-Pound Flying Car Could Evacuate People Remotely

By 2020, we could finally have our flying car. A new drone created by the Israeli firm Urban Aeronautics can carry passengers, ferrying up to 1100 pounds at speeds of up to 115 miles per hour, according to Reuters.

Around the size of a regular car, the Cormorant UAV (named after the aquatic bird) completed its first solo flight in November. The flight was mostly successful, though there were some issues with onboard sensors. The drone hasn't yet met all FAA standards, but Urban Aeronautics CEO Rafi Yoeli notes that, with 39 patents already filed, the company is well ahead of the competition.

The vehicle is expected to cost about $14 million and is scheduled for a 2020 release.

The drone uses internal rotors rather than exposed propellers, so, unlike a helicopter, it can fly safely between buildings and among power lines. It could potentially be used as a kind of drone ambulance, evacuating the wounded from conflicts or disasters where sending pilots would be too dangerous, or delivering people into spaces too tight for a helicopter to navigate.

[h/t The Daily Mail]

Ruined a Photo By Blinking? Facebook Can Fix It With AI

Next time you blink in an otherwise flawless photo, don't be so quick to hit the "delete" button on your phone. As The Verge reports, Facebook is testing a new feature that uses artificial intelligence to make closed eyes look naturally open.

Facebook engineers Brian Dolhansky and Cristian Canton Ferrer described the technology behind the AI in a paper published June 18. They used a type of machine learning called a generative adversarial network, or GAN, which studies a database of existing pictures and uses that information to generate new imagery where there wasn't any before.

This type of AI has been used to design clothing and video game levels in the past. To get it to work with faces, Facebook engineers showed the system photos taken of people when their eyes were open. After "learning" the subject's eye shape, size, and color, the AI used that data to superimpose a new set of eyes over the blinking lids. The feature still has some trouble working with glasses, long bangs, and pictures taken at an angle, but when it does what it's supposed to, it's hard to tell the photo was ever retouched.

[Image: Faces with blinking and open eyes. Credit: Facebook]
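
For readers curious about the underlying technique, here is a minimal sketch of the adversarial idea: a generator fabricates data while a discriminator tries to tell fakes from real samples, and the two improve by competing. It uses PyTorch on toy one-dimensional numbers rather than face images; the network sizes, data, and training schedule are illustrative assumptions, not details of Facebook's actual model.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

def real_batch(n=64):
    # "Real" data: samples from a normal distribution the generator must imitate.
    return torch.randn(n, 1) * 1.5 + 4.0

# Generator maps random noise to fake samples; discriminator scores how "real" a sample looks.
generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    # 1) Train the discriminator to tell real samples from generated ones.
    real = real_batch()
    fake = generator(torch.randn(64, 8)).detach()
    d_loss = bce(discriminator(real), torch.ones(64, 1)) + bce(discriminator(fake), torch.zeros(64, 1))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # 2) Train the generator to produce samples the discriminator calls "real".
    fake = generator(torch.randn(64, 8))
    g_loss = bce(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()

# The generated samples' mean should drift toward the real data's mean (~4.0).
print(generator(torch.randn(1000, 8)).mean().item())
```

Facebook's system applies this same adversarial setup to image in-painting, generating plausible open eyes conditioned on other photos of the same person rather than random noise.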

Facebook isn't the first company to use AI to salvage photographs with closed eyes. In 2017, Adobe added an "Open Closed Eyes" feature to Photoshop Elements that also uses AI to generate a pair of eyes that match those of the blinking subject. For it to work, users first have to show the system several photos of the subject with their eyes open.

Facebook, which already holds a database of pictures of many of its users, seems like a perfect fit for this type of technology. The social media site is still testing the feature, but based on the success of early experiments, the company may make it available to users in the not-too-distant future. And because Facebook owns Instagram, it's possible that the eye-opening feature will eventually be applied to Instagram posts and Stories as well.

[h/t The Verge]

Apple Wants to Make It Easier for 911 Dispatchers to Figure Out Where You Are In an Emergency

A few weeks ago, I dialed 911 from a sidewalk in my neighborhood to alert the police to a lost child who had asked me for help. "What's your location?" the dispatcher asked. I had no idea; it was a small side street whose name I had never bothered to learn. I had to run to the end of the block and stare up at the street sign, and when the dispatcher wasn't familiar with the name either, I had to spell it out, letter by letter.

Soon, it may not be quite so difficult to share your location with emergency services. The Wall Street Journal reports that a forthcoming update to Apple's iOS will automatically send your phone's location to emergency call centers when you're on the phone with 911.

The update is part of a partnership with RapidSOS, a technology company founded to make it easier for first responders to reach people in an emergency. It aims to make it as simple to find a 911 caller using a cell phone as it is to find one using a landline.

Landline systems can deliver your exact address to emergency services, but cell phone carriers currently convey only your approximate location, often with less accuracy than apps like Google Maps or Uber manage. The estimate can be off by as much as a few hundred yards, which can make a substantial difference if you're waiting for life-saving care. The FCC has ruled that by 2021, all cell phone carriers must be able to locate emergency callers to within 165 feet at least 80 percent of the time, but that deadline is still years away.

The new update would come with iOS 12, which is expected to be released later this year. The data automatically sent by your iPhone would be different from the data your cell phone carrier sends. It will come from Apple's HELO (Hybridized Emergency Location), a system that estimates your position from cell towers, GPS, and Wi-Fi access points, and it will be passed to emergency call centers using RapidSOS's technology. RapidSOS isn't used by all 911 call centers in the U.S., but the company reports that the majority will be using it by the end of the year.
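
Apple hasn't published how HELO weighs those signals, but the general idea of "hybridizing" several coarse position fixes can be illustrated with a simple, purely hypothetical inverse-variance weighted average, in which more precise sources count for more. The coordinates, accuracy figures, and weighting scheme below are illustrative assumptions, not Apple's or RapidSOS's actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class Fix:
    source: str
    lat: float
    lon: float
    accuracy_m: float  # reported accuracy radius, in meters (smaller = more precise)

def fuse(fixes):
    # Weight each fix by the inverse of its error variance, so tighter fixes
    # (e.g. GPS) dominate looser ones (e.g. a cell-tower estimate).
    weights = [1.0 / (f.accuracy_m ** 2) for f in fixes]
    total = sum(weights)
    lat = sum(w * f.lat for w, f in zip(weights, fixes)) / total
    lon = sum(w * f.lon for w, f in zip(weights, fixes)) / total
    return lat, lon

fixes = [
    Fix("cell", 40.7410, -73.9900, accuracy_m=300.0),
    Fix("wifi", 40.7418, -73.9893, accuracy_m=40.0),
    Fix("gps",  40.7420, -73.9891, accuracy_m=10.0),
]
print(fuse(fixes))  # dominated by the GPS fix, nudged slightly by Wi-Fi and cell
```

The point of the sketch is only that combining independent estimates can yield a tighter fix than any single source, which is what lets a hybrid system beat the few-hundred-yard accuracy of a cell-tower estimate alone.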

In a press release, Apple promises that user data will only be available for emergency use, and that the responding 911 call center will only have access to your location data for the duration of your call.

I wasn't in a hurry when I called 911, and I had the time and the ability to jog down the street and find a sign to figure out where I was. In most emergency situations, the few extra seconds or minutes it takes to pinpoint your own location could be a matter of life and death. As more Americans give up their landlines and go wireless-only, better location technology for emergency services will be vital.

[h/t MarketWatch]
