NIPS 2016 — Day 2 Highlights: Platform wars, RL and RNNs

Dec 7, 2016

Jeremy Karnowski & Ross Fadely, Insight Artificial Intelligence

Missed our highlights from Day 1 of NIPS 2016? Read here. Want to learn about applied Artificial Intelligence from leading practitioners in Silicon Valley or New York? Learn more about the Insight Artificial Intelligence Fellows Program.

Research Platform Battles Heat Up

Along with the rapidly growing interest in AI, there is a rapidly growing tension. On the one hand, top AI research labs are becoming more open, publishing results on arXiv and pushing code to GitHub. On the other, competition to become the dominant player is getting fiercer. From deep learning frameworks, to cloud computing platforms, to customized hardware, the battle is on over who will set the standard for the near (and possibly long-term) future of AI technologies.

DeepMind’s Lab used to train an AI agent to navigate a Labyrinth maze.

Day 2 at NIPS was no exception to these trends. At the end of the first invited talk of the day, DeepMind announced its new open-source Reinforcement Learning platform, DeepMind Lab. The aim of the new platform is to provide a way to build rich simulated environments that can serve as laboratories for AI research. It turns out DeepMind has been using the Lab internally for quite some time, and is now opening it up to the community. Exciting, right?

While DeepMind Lab may be one of the newest, it is not the only player in the AI research platform space. Perhaps the most popular general platform is OpenAI’s Gym, which has received significant interest from the community along with many research contributions. Just a few weeks ago, OpenAI announced its Universe platform, with the goal of offering more flexibility and extensibility than Gym.
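To give a feel for what these platforms offer, Gym boils an environment down to a simple observe-act-reward loop. Here is a minimal sketch running a random policy on the classic CartPole task; the four-tuple step API reflects Gym as we know it in late 2016.

```python
import gym

# Build a standard benchmark environment and run one episode
# with a random policy.
env = gym.make("CartPole-v0")
observation = env.reset()
done = False
total_reward = 0.0
while not done:
    action = env.action_space.sample()                  # random action
    observation, reward, done, info = env.step(action)  # one timestep
    total_reward += reward
print("episode return:", total_reward)
```

Every environment, from CartPole to Atari, exposes this same interface, which is exactly what makes a shared platform so useful for benchmarking agents.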

Going back a bit further, research giants Microsoft and Facebook have already carved out their places in the space. Earlier in 2016, Microsoft launched Project Malmo, an AI platform built on top of Minecraft. Similarly, Facebook open-sourced CommAI-env, a lower-level platform for building out AI research environments.

While it is unclear who (if anyone) will win out and become the dominant AI research platform, we are excited. Perhaps even more than the current battle among deep learning frameworks, the new AI platform efforts are fostering an environment of opportunity and openness that we think will bear fruit if it continues. We hope it does.

Other quick highlights

Less controversial, but equally exciting, were the impressive research results presented on Day 2. Themes continued around improvements in Reinforcement Learning and Deep Learning, as well as more broadly used machine learning techniques and their applications. Here is a shortlist that caught our eye:

  • The NIPS award-winning work on Value Iteration Networks was incredibly impressive. The key innovation is that these models include a differentiable “planning module,” which allows networks to make plans and generalize better to unseen domains (see the first sketch after this list).
  • Two fantastic results pushing Recurrent Neural Networks (RNNs) forward: Sequential Neural Models with Stochastic Layers and Phased LSTMs. The former combines ideas from State Space Models (formerly best in class for stochastic sequences like audio) and RNNs, leveraging the best of both worlds. The latter adds a “time gate” to LSTMs, which greatly improves optimization and performance on long sequence data (sketched below).
  • A team from Amazon talked about Bayesian Intermittent Demand Forecasting for Large Inventories (paper). During the talk they showed impressive forecasting, at scale, for problems with intermittent or bursty demand (think large distributed warehouse inventories).
  • K-means is a core algorithm for many data science applications, but finding good cluster centers often relies on having good initializations. Presenting his work Fast and Provably Good Seedings for k-Means (paper), Olivier Bachem showed how to get good centroid seeds orders of magnitude faster than the previous state of the art (k-means++). Even better, there is code: “pip install kmc2” and you are good to go (example below).
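To make the Value Iteration Networks idea concrete: value iteration is just repeated sum-and-max operations, so it can be unrolled as a differentiable module and trained end to end with the rest of a network. Below is a hypothetical tabular sketch of such a planning module; the shapes and names are our own illustrative choices (the paper actually implements the backups as convolutions over a 2-D state space), not the authors’ code.

```python
import numpy as np

def value_iteration_module(R, P, gamma=0.99, iters=40):
    """Unrolled value iteration as a differentiable 'planning module'.

    R: (A, S) reward for taking action a in state s.
    P: (A, S, S) transition probabilities P[a, s, s'].
    Each step is a sum and a max, both of which backprop cleanly.
    """
    V = np.zeros(R.shape[1])
    for _ in range(iters):
        Q = R + gamma * np.einsum('ast,t->as', P, V)  # Bellman backup
        V = Q.max(axis=0)                             # greedy over actions
    return V, Q
```

Because nothing in the loop is discrete with respect to the parameters that produce R and P, the whole planner can sit inside a larger network and be trained by backpropagation.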
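The Phased LSTM “time gate” is simple enough to write down. Each unit gets a learnable oscillation period, phase shift, and open ratio, and its cell state only updates during the brief open phase of each cycle, which is what helps on long sequences. This is a sketch based on our reading of the paper; the variable names are ours.

```python
import numpy as np

def phased_lstm_time_gate(t, tau, shift, r_on, alpha=1e-3):
    """Time gate k_t of a Phased LSTM unit (sketch).

    t: timestamp(s); tau: oscillation period; shift: phase offset;
    r_on: fraction of each period the gate is open;
    alpha: small leak so gradients still flow while 'closed'.
    """
    phi = ((t - shift) % tau) / tau  # phase within the cycle, in [0, 1)
    return np.where(
        phi < 0.5 * r_on, 2.0 * phi / r_on,           # gate opening
        np.where(phi < r_on, 2.0 - 2.0 * phi / r_on,  # gate closing
                 alpha * phi))                        # leaky closed phase

# The cell state then mixes as c_t = k * c_proposed + (1 - k) * c_previous,
# so most units update only rarely, and real-valued timestamps are handled
# naturally.
```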
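And the seeding code really is pip-installable. We have not re-checked the exact call signature here, so treat this as a sketch of the intended usage: draw seeds with kmc2, then hand them to any off-the-shelf k-means as its initialization.

```python
import numpy as np
import kmc2                               # pip install kmc2
from sklearn.cluster import MiniBatchKMeans

X = np.random.randn(10000, 16)            # toy data: 10k points in 16-D

# MCMC-based seeding: approximates k-means++ seeds without a full
# pass over the data for every new center.
seeds = kmc2.kmc2(X, 10)

# Use the seeds as a fixed initialization for ordinary k-means.
model = MiniBatchKMeans(n_clusters=10, init=seeds, n_init=1).fit(X)
print(model.cluster_centers_.shape)
```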
