TL;DR: ICLR accepted papers are out; TensorFlow Dev Summit livestream on February 15; TensorFlow Fold

The Wild Week in AI is a weekly AI & Deep Learning newsletter curated by @dennybritz.

News

ICLR 2017 Accepted Papers openreview.net

Out of 507 submissions, 15 oral presentations and 181 posters were accepted into the conference (see this chart). If you are looking for a way to build up your Deep Learning paper reading list, starting with the accepted papers is probably a good bet.

TensorFlow Dev Summit Livestream (February 15) events.withgoogle.com Watch the TensorFlow Dev Summit live on February 15, starting 9:30am PST. No worries if you can't make it - all talks are recorded and will be available online afterwards.

Machine Learning @Scale 2017 Videos code.facebook.com Machine Learning @Scale brought together data scientists, engineers, and researchers to discuss technical challenges in large-scale applied machine learning.

Posts, Articles, Tutorials

Performance of Distributed Deep Learning using ChainerMN chainer.org An interesting ImageNet classification benchmark using 1 to 128 GPUs on various Deep Learning frameworks. Chainer comes out on top, and TensorFlow trails behind the others. Note, however, that such evaluations come with caveats: how frameworks like TensorFlow are run in distributed mode, non-optimized model code, or outdated framework versions can all skew the results.

Model Mis-specification and Inverse Reinforcement Learning jsteinhardt.wordpress.com Learn about pitfalls in using IRL to infer human values, which are not directly observed and are represented by a reward/utility function.

Code, Projects & Data

YouTube-BoundingBoxes Dataset research.google.com YouTube-BoundingBoxes is a large-scale dataset of video URLs with densely sampled, high-quality single-object bounding box annotations. The dataset consists of approximately 380,000 15-20s video segments extracted from 240,000 different publicly visible YouTube videos. Also check out the paper.

PyTorch Implementation: seq2seq Translation github.com An implementation of the popular seq2seq with attention architecture in PyTorch. If you also need to catch up on PyTorch, this is a great starting point showcasing its important features.
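The core idea behind the attention mechanism in such seq2seq models is simple: at each decoding step, compute a similarity score between the decoder's query and every encoder state, turn the scores into a probability distribution via softmax, and take the weighted sum of the encoder states as the context. A minimal dot-product sketch in plain Python (function names here are illustrative, not taken from the linked repository):

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of raw scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot_product_attention(query, keys, values):
    """Weighted sum of `values`, weighted by query-key similarity.

    `query` is a vector; `keys` and `values` are equal-length lists of
    vectors (one per encoder position). Returns the context vector and
    the attention weights.
    """
    # Similarity of the query to each key (dot product).
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    # Normalize scores into attention weights that sum to 1.
    weights = softmax(scores)
    # Context = attention-weighted average of the value vectors.
    dim = len(values[0])
    context = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(dim)]
    return context, weights
```

With a query that strongly matches the first key, nearly all of the attention mass lands on the first value; real implementations do the same computation batched as matrix multiplications with learned projections.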

Highlighted Research Papers

[1701.09175] Skip Connections as Effective Symmetry-Breaking arxiv.org A novel explanation of the benefits of skip connections in training very deep neural networks. The authors argue that skip connections help break symmetries inherent in the loss landscapes of deep networks, leading to simplified landscapes.
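For readers unfamiliar with the construct the paper analyzes: a skip (residual) connection simply adds a block's input back to its output, so the block only has to learn a residual on top of the identity. A toy scalar sketch (this illustrates the mechanism only, not the paper's symmetry-breaking analysis):

```python
def layer(x, weight, bias):
    # A single scalar "layer": affine transform followed by ReLU.
    return max(0.0, weight * x + bias)

def plain_block(x, weight, bias):
    # Without a skip connection the block must learn the full mapping.
    return layer(x, weight, bias)

def residual_block(x, weight, bias):
    # Skip connection: the input is added back to the layer's output,
    # so with weight = bias = 0 the block is exactly the identity.
    return layer(x, weight, bias) + x
```

At zero weights the plain block collapses to zero while the residual block passes its input through unchanged, which is one intuition for why very deep residual networks remain trainable.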

Did you enjoy this issue?

If you don't want these updates anymore, please unsubscribe here. If you were forwarded this newsletter and you like it, you can subscribe here.