
Take a Trippy Journey in This Machine-Generated Video

YouTube // Damien Henry

Coder Damien Henry created a 56-minute film from a single starting image and a machine learning algorithm. He trained the model on video shot from the window of a moving train, then fed it that first frame and had it generate what it thought the next frame might be. Each generated frame was fed back in to produce the one after it, and the process repeated for the entire film, resulting in this beautiful, abstract train ride:

The soundtrack is Steve Reich's classic "Music for 18 Musicians," and the pairing helps create a mesmerizing atmosphere. Watching the video, you see smears of light and dark eventually form into landscapes (drawing on the algorithm's knowledge of landscapes), but those landscapes are often messy and surreal, looking like blobs in a lava lamp or perhaps a robot's low-fi idea of what a landscape might look like. Because the film includes zero editing, it is purely a product of that first frame and the machine's training. It's beautiful.

In the YouTube description, Henry wrote (in part):

The results are low resolution, blurry, and not realistic most of the time. But it resonates with the feeling I have when I travel in a train. It means that the algorithm learned the patterns needed to create this feeling. Unlike classical computer generated content, these patterns are not chosen or written by a software engineer.

In this video, nobody made explicit that the foreground should move faster than the background: thanks to Machine Learning, the algorithm figured that out itself. The algorithm can find patterns that a software engineer may not have noticed, and is able to reproduce them in a way that would be difficult or impossible to code.

He also notes that the algorithm continues learning as the video is generated, with its learning system updated every 20 seconds, which accounts for the increase in realism as the video goes on.
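
To make that process concrete, here is a minimal, hypothetical sketch of that kind of loop in Python. It is not Henry's code: the predictor and the update step are throwaway placeholders, and the frame rate is an assumption. What it does show is the structure described above: each generated frame is fed back in as the next input, and the model is refreshed on a fixed 20-second schedule.

```python
# Hypothetical sketch of an autoregressive next-frame loop, NOT Henry's code.
# The "model" and its update rule are placeholders; only the control flow
# (feed each output back in, refresh the model every 20 seconds) reflects
# the process described in the article.
import numpy as np

FPS = 24  # assumed frame rate

def predict_next(model, frame):
    """Stand-in for the learned next-frame predictor."""
    shifted = np.roll(frame, -1, axis=1)          # crude sideways "train" motion
    return 0.9 * shifted + 0.1 * model["bias"]    # blend with what the model has seen

def update_model(model, recent_frames):
    """Stand-in for the periodic learning step."""
    model["bias"] = np.mean(recent_frames, axis=0)
    return model

def generate_film(first_frame, seconds, update_every_s=20):
    model = {"bias": np.zeros_like(first_frame)}
    update_every = update_every_s * FPS
    frames = [first_frame]
    for i in range(1, seconds * FPS):
        frames.append(predict_next(model, frames[-1]))   # output becomes the next input
        if i % update_every == 0:
            model = update_model(model, frames[-update_every:])
    return frames

# Tiny grayscale example; the real film runs 56 minutes.
film = generate_film(np.random.rand(32, 32), seconds=60)
```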

For a bit more from Henry on the project, check out his Twitter feed.

science
Can You 'Hear' These Silent GIFs?
iStock

GIFs are silent—otherwise they wouldn't be GIFs. But some people claim to hear distinct noises accompanying certain clips. Check out the GIF below as an example: Do you hear a boom every time the structure hits the ground? If so, you may belong to the 20 to 30 percent of people who experience "visual-evoked auditory response," also known as vEAR.

Researchers from City University London recently published a paper on the phenomenon online in the journal Cortex, the British Psychological Society's Research Digest reports. For their study, they recruited more than 4,000 volunteers and 126 paid participants and showed them 24 five-second video clips, none of which had audio. When asked to rate the auditory sensation of each video on a scale of 0 to 5, 20 percent of the paid participants rated at least half the videos a 3 or higher. The percentage was even higher for the volunteer group.

You can try out the researchers' survey yourself. It takes about 10 minutes.

The likelihood of visual-evoked auditory response, according to the researchers, directly relates to what the subject is looking at. "Some people hear what they see: Car indicator lights, flashing neon shop signs, and people's movements as they walk may all trigger an auditory sensation," they write in the study.

Images packed with meaning, like two cars colliding, are more likely to trigger the auditory illusion. But even more abstract images can produce the effect if they have high levels of something called "motion energy." Motion energy is what you see in the video above when the structure bounces and the camera shakes. It's why a video of a race car driving straight down a road might have less of an auditory impact than a clip of a flickering abstract pattern.
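
One very rough way to think about motion energy, under an assumed simplification rather than the measure used in the Cortex study, is as the total amount of pixel change from one frame to the next. The short Python sketch below scores a clip by summing squared frame-to-frame differences, so a shaky or flickering clip scores higher than a smooth pan.

```python
# Rough, assumed proxy for "motion energy": total squared pixel change
# between consecutive frames. An illustration only, not the measure used
# in the Cortex study.
import numpy as np

def motion_energy(frames):
    """frames: array of shape (num_frames, height, width), values in [0, 1]."""
    diffs = np.diff(frames, axis=0)     # change between consecutive frames
    return float(np.sum(diffs ** 2))    # large, abrupt changes score higher

# A smooth fade scores far lower than a flickering pattern:
smooth = np.stack([np.full((8, 8), i / 10) for i in range(10)])
flicker = np.stack([np.full((8, 8), float(i % 2)) for i in range(10)])
print(motion_energy(smooth) < motion_energy(flicker))  # True
```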

The researchers categorize vEAR as a form of synesthesia, a condition in which stimulation of one sense triggers a perception in another. Those with synesthesia might "see" patterns when music plays or "taste" certain colors. Synesthesia is generally rare, affecting just 4 percent of the population, but this new study suggests that "hearing-motion synesthesia" is much more prevalent.

[h/t BPS Research Digest]

Live Smarter
The Google Docs Audio Hack You Might Not Know About
iStock

To the uninitiated, Google Docs can take some warming up to. Though it may seem like any other word processor, Docs offers its fair share of nifty features that can make your life a whole lot easier. The only problem is that few people seem to know about them.

The Voice Typing function is one such example. As Quartz discovered, this tool can be used to drastically cut down on the time it takes to transcribe an interview or audio recording—a feature that professionals from many fields could benefit from. Voice Typing might also be useful to those who prefer to dictate what they want to write, as well as those with impairments that prevent them from typing.

Whatever the case may be, it's extremely easy to use. Just open a blank document, click on "Tools" at the top, and then select "Voice typing." A microphone icon will pop up, allowing you to choose your language. After you've done that, simply click the icon when you're ready to start speaking!

Unfortunately, it's unable to pick up an audio recording played through speakers, so you'll need to grab a pair of headphones, plug them into your phone or voice recorder, and dictate what's said as you listen along. Still, this eliminates the hassle of having to pause and rewind in order to let your fingers catch up to the audio—unless you're the champion of a speed typing contest, in which case you probably don't need this tutorial.
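
If you'd rather script the job than dictate along with a recording, one alternative outside Google Docs entirely is to run the audio file through a speech-to-text library. The sketch below uses Python's SpeechRecognition package and Google's free web speech API; the filename is a placeholder, and accuracy and length limits will vary.

```python
# Hypothetical alternative to dictating by hand: transcribe a saved audio
# file with the SpeechRecognition package (pip install SpeechRecognition).
# "interview.wav" is a placeholder; this is separate from the Google Docs
# Voice Typing feature described above.
import speech_recognition as sr

recognizer = sr.Recognizer()
with sr.AudioFile("interview.wav") as source:
    audio = recognizer.record(source)              # load the whole recording

try:
    print(recognizer.recognize_google(audio))      # free Google Web Speech API
except sr.UnknownValueError:
    print("Could not understand the audio.")
except sr.RequestError as err:
    print(f"Speech service error: {err}")
```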

According to Quartz, the transcription is "shockingly" accurate, even getting the spelling of last names right. For a how-to guide on the Voice Typing tool, check out Quartz's video below.

[h/t Quartz]
