Marching neural network

Visualizing level surfaces of a neural network with raymarching

Artificial neural networks have become very popular in recent years, largely due to their success in image and speech recognition. Research in this area started more than 60 years ago, and many different network architectures were developed during the first decades of research, but the only architecture that became widely used in applications is the MLP (multilayer perceptron): a parametrized multilayer function trained (optimized) with variations of gradient descent. Later, application-specific architectures based on the MLP approach to training emerged (such as convolutional and recurrent networks).

It is often helpful to understand the behavior of a neural network by visualizing it. However, while the dependencies modelled in machine learning, and by neural networks in particular, are multidimensional, our visualization abilities are limited to three dimensions.

In the demo below you can play with a very small MLP with three inputs (x, y, z) and observe the resulting functions (remember, an MLP is just a function) to see how flexible it is.
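To make the "an MLP is just a function" point concrete, here is a minimal sketch of such a network in NumPy: it maps a 3D point (x, y, z) to a single scalar. The layer width, the tanh activation, and the random weights are illustrative assumptions, not the exact network used in the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Weights of a tiny one-hidden-layer MLP: 3 inputs -> 8 hidden units -> 1 output.
# Random weights stand in for trained ones; the structure is what matters here.
W1 = rng.normal(size=(3, 8))
b1 = rng.normal(size=8)
W2 = rng.normal(size=(8, 1))
b2 = rng.normal(size=1)

def mlp(point):
    """Evaluate the network at a 3D point; returns one number."""
    hidden = np.tanh(point @ W1 + b1)   # nonlinear hidden layer
    return float(hidden @ W2 + b2)      # linear output layer

value = mlp(np.array([0.5, -0.2, 1.0]))
```

Viewed this way, the network defines a scalar field over 3D space, and its level surfaces (the sets where the output equals some constant) are exactly what the demo renders.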

The visualization requires a reasonably recent browser and GPU acceleration. It works best on laptops and desktops, but also runs on modern mobile devices.
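The rendering idea named in the subtitle, raymarching a level surface, can be sketched independently of the network: step along a ray through the scalar field until the field value crosses the chosen level, then refine the crossing point. The sketch below uses a simple sphere-shaped field as a stand-in for the network output; the step size and bisection refinement are illustrative choices, and the real demo does this per pixel on the GPU.

```python
import numpy as np

def field(p):
    # Hypothetical scalar field standing in for the network output:
    # a unit sphere centered at the origin (zero level set at radius 1).
    return np.dot(p, p) - 1.0

def raymarch(origin, direction, level=0.0, step=0.01, max_t=10.0):
    """March along the ray until field(p) - level changes sign,
    then refine the crossing by bisection. Returns the hit distance t,
    or None if no level crossing is found within max_t."""
    direction = direction / np.linalg.norm(direction)
    t = 0.0
    prev = field(origin) - level
    while t < max_t:
        t += step
        val = field(origin + t * direction) - level
        if prev * val <= 0.0:               # sign change: surface crossed
            lo, hi = t - step, t
            for _ in range(30):             # bisect inside the last step
                mid = 0.5 * (lo + hi)
                if (field(origin + mid * direction) - level) * prev <= 0.0:
                    hi = mid
                else:
                    lo = mid
            return 0.5 * (lo + hi)
        prev = val
    return None

# Ray from z = -3 toward the origin hits the unit sphere at z = -1, so t ≈ 2.
t_hit = raymarch(np.array([0.0, 0.0, -3.0]), np.array([0.0, 0.0, 1.0]))
```

Substituting the network's output for `field` gives the picture the demo draws: the level surfaces of the MLP, rendered one ray per pixel.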