Decoupled Neural Interfaces using Synthetic Gradients

Authors: Max Jaderberg, Wojciech Marian Czarnecki, Simon Osindero, Oriol Vinyals, Alex Graves, David Silver, Koray Kavukcuoglu

When training neural networks, the modules (layers) are locked: they can only be updated after a full forward and backward pass. We remove this constraint by incorporating a learnt model of error gradients, Synthetic Gradients, which lets us update networks without full backpropagation. We show how this can be applied to feed-forward networks, allowing every layer to be trained asynchronously; to RNNs, extending the time over which models can remember; and to multi-network systems, enabling communication between networks.
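As a rough illustration of the idea (a minimal toy sketch, not the paper's implementation), the snippet below decouples one layer from backpropagation: a small linear model `M` predicts the layer's error gradient from its activations, the layer updates immediately using that prediction, and `M` itself is regressed onto the true gradient whenever it becomes available. The toy loss, learning rates, and layer sizes are all assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

W = rng.normal(0.0, 0.5, (4, 3))   # weights of the decoupled tanh layer
M = np.zeros((3, 3))               # synthetic-gradient model: linear map h -> predicted dL/dh

lr, sg_lr = 0.05, 0.5              # assumed learning rates for this toy
x_eval = rng.normal(size=(32, 4))  # fixed batch to monitor progress

def layer(x):
    return np.tanh(x @ W)

def loss_of(h):
    # Toy loss 0.5 * ||h||^2: drives activations toward zero, so dL/dh = h.
    return 0.5 * np.mean(np.sum(h ** 2, axis=1))

loss_init = loss_of(layer(x_eval))

for step in range(500):
    x = rng.normal(size=(16, 4))
    h = layer(x)

    # 1) Decoupled update: apply the *predicted* gradient immediately,
    #    without waiting for the true backpropagated error signal.
    g_hat = h @ M                              # synthetic dL/dh
    W -= lr * x.T @ (g_hat * (1 - h ** 2))    # chain g_hat through the tanh only

    # 2) Once the true gradient arrives (here dL/dh = h), regress the
    #    synthetic-gradient model onto it with a squared-error loss.
    g_true = h
    M -= sg_lr * h.T @ (h @ M - g_true) / len(h)

loss_final = loss_of(layer(x_eval))
```

In this sketch the layer's weight updates never use a backpropagated gradient, only the learnt prediction, which is what allows the layers of a larger network to be updated asynchronously.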

For further details and related work, please see the paper.

Check it out at ICML:

Monday 07 August, 10:30-10:48 @ Darling Harbour Theatre (Talk)

Monday 07 August, 18:30-22:00 @ Gallery #1 (Poster)