We present Mode-Adaptive Neural Networks, a novel method for real-time quadruped motion synthesis. The system is trained end-to-end on unstructured motion capture data, without requiring labels for phase or locomotion gaits. It can be used to create natural animations in games and films, and is, to our knowledge, the first such systematic approach whose output quality is suitable for practical use. The system is implemented in the Unity 3D engine and TensorFlow, and the work is published in ACM Transactions on Graphics (SIGGRAPH 2018).
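The core mechanism of a Mode-Adaptive Neural Network is a mixture-of-experts scheme: a small gating network maps a subset of the input features to softmax blend weights, which interpolate the parameters of several expert networks before the blended network processes the frame. The sketch below illustrates one such mode-adaptive layer; all dimensions, array names, and the single-linear-layer gating are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Illustrative sizes: K experts, layer input/output widths, gating input width.
K, D_IN, D_OUT, D_GATE = 4, 6, 5, 3

# One weight matrix and bias per expert.
expert_W = rng.standard_normal((K, D_OUT, D_IN))
expert_b = rng.standard_normal((K, D_OUT))

# Gating network (reduced here to a single linear map for brevity) turns
# gating features (e.g. end-effector velocities) into expert blend weights.
gate_W = rng.standard_normal((K, D_GATE))

def mode_adaptive_layer(x, gate_features):
    # Blend coefficients over the experts; softmax makes them sum to 1.
    alpha = softmax(gate_W @ gate_features)
    # Interpolate expert parameters, then apply the resulting layer.
    W = np.tensordot(alpha, expert_W, axes=1)   # shape (D_OUT, D_IN)
    b = alpha @ expert_b                        # shape (D_OUT,)
    return W @ x + b, alpha

y, alpha = mode_adaptive_layer(rng.standard_normal(D_IN),
                               rng.standard_normal(D_GATE))
print(y.shape, alpha.shape)
```

Because the blending happens in parameter space rather than on the outputs, each frame is processed by a single effective network whose weights vary smoothly with the gating features, which is what lets the model specialize to different gaits without gait labels.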