Google subsidiary DeepMind and a new start-up called Osaro are building AI systems that can play Atari video games like Space Invaders, Video Pinball, and Breakout. Researchers at these companies aren’t trying to teach their AI systems to better understand the arcade culture of the '80s—rather, they hope that teaching these AI systems to navigate the digital world of video games will eventually help machines navigate complex human environments like warehouses and factories.
According to WIRED, the AI systems at both companies have gotten good enough that they can sometimes beat professional human players. Osaro’s system uses recurrent neural networks, which mimic the neurons in the human brain and have a form of short-term memory. That’s crucial for these kinds of games, where time is an important component: “You can’t really tell what’s going on in a game just by looking at a single frame,” Osaro CEO Itamar Arel told WIRED. “You need to look at a sequence of frames to know if, say, a ball is going left or right, if it’s accelerating or decelerating.”
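To make the idea concrete, here is a minimal sketch (not Osaro's actual code) of why a recurrent network helps: a single frame can't tell you which way the ball is moving, but a network that carries a hidden state across frames can. The frame size, network dimensions, and the two-class "left vs. right" task below are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class FrameSequenceClassifier(nn.Module):
    """Toy recurrent model that reads a short clip of frames, not a single screenshot."""
    def __init__(self, frame_pixels=84 * 84, hidden_size=128, num_classes=2):
        super().__init__()
        # The GRU's hidden state plays the role of the short-term memory described above.
        self.rnn = nn.GRU(input_size=frame_pixels, hidden_size=hidden_size,
                          batch_first=True)
        self.head = nn.Linear(hidden_size, num_classes)

    def forward(self, frames):
        # frames: (batch, time, pixels) -- a sequence of flattened game frames
        _, last_hidden = self.rnn(frames)
        # last_hidden summarizes the whole clip, which is enough to encode motion.
        return self.head(last_hidden.squeeze(0))

model = FrameSequenceClassifier()
clip = torch.rand(1, 8, 84 * 84)   # one clip of 8 flattened frames (dummy data)
print(model(clip).shape)           # torch.Size([1, 2]): scores for "left" vs. "right"
```

Feeding the model a clip rather than one frame is the whole point: the direction and speed of the ball only exist as differences between consecutive frames.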
Both DeepMind’s and Osaro’s AI systems learn the games through trial and error. The systems try out different moves until they figure out which ones work. They’re “rewarded” for successful moves with what WIRED calls “digital dopamine”—essentially, a form of positive reinforcement that lets the software know it’s on the right track. WIRED explains, “The name Osaro is a nod to this process. It’s short for Observation, State inference, Action, Reward, and—as the loop continues—Observation.”
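The sketch below shows that observation-action-reward loop in miniature. It is a hedged illustration, not DeepMind's or Osaro's implementation: a made-up three-column "catch the ball" game stands in for an Atari title, and a scalar reward plays the part of the "digital dopamine" the article describes. The environment, state encoding, and learning parameters are all assumptions chosen for brevity.

```python
import random
from collections import defaultdict

class ToyCatch:
    """A ball falls down a 3-wide grid; the paddle scores by being under it when it lands."""
    def reset(self):
        self.ball_col, self.ball_row, self.paddle_col = random.randrange(3), 0, 1
        return (self.ball_col, self.ball_row, self.paddle_col)   # the observation

    def step(self, action):                  # action: -1 move left, 0 stay, +1 move right
        self.paddle_col = min(2, max(0, self.paddle_col + action))
        self.ball_row += 1
        done = self.ball_row == 3
        reward = 1.0 if done and self.paddle_col == self.ball_col else 0.0
        return (self.ball_col, self.ball_row, self.paddle_col), reward, done

q = defaultdict(float)                       # value of each (state, action), learned by trial and error
actions, epsilon, alpha, gamma = (-1, 0, 1), 0.1, 0.5, 0.9

def choose(state):
    if random.random() < epsilon:            # explore: try a random move
        return random.choice(actions)
    return max(actions, key=lambda a: q[(state, a)])   # exploit moves that worked before

env = ToyCatch()
for episode in range(2000):
    state, done = env.reset(), False         # Observation
    while not done:
        action = choose(state)                               # Action
        next_state, reward, done = env.step(action)          # Reward + next Observation
        best_next = max(q[(next_state, a)] for a in actions)
        # The reward nudges the value of the move that led here -- the reinforcement step.
        q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
        state = next_state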
Ultimately, '80s video games are a simplified stand-in for real environments. DeepMind and Osaro want to build AI systems that can navigate the real world: In the future, the technology could power everything from factory and warehouse robots to self-driving cars. Shooting down space invaders is just the first step.