Neural Networks

The Wolfram Language has state-of-the-art capabilities for the construction, training and deployment of neural network machine learning systems. Many standard layer types are available and are assembled symbolically into a network, which can then immediately be trained and deployed on available CPUs and GPUs.

Automated Machine Learning

Classify — automatic training and classification using neural networks and other methods

Predict — automatic training and data prediction

FeatureExtraction — automatic feature extraction from image, text, numeric, etc. data

LearnDistribution — automatic learning of data distribution

ImageIdentify — fully trained image identification for common objects
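As a minimal sketch of the automated functions above, Classify can be trained directly on labeled examples (the data here is illustrative):

```wolfram
(* train a classifier on toy labeled examples *)
c = Classify[{1 -> "A", 2 -> "A", 9 -> "B", 10 -> "B"}];

(* apply the resulting ClassifierFunction to a new input *)
c[1.5]
```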

Prebuilt Material

NetModel — complete pre-trained net models

ResourceData — access to training data, networks, etc.
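NetModel retrieves a complete pre-trained network by name from the Wolfram Neural Net Repository (requires connectivity; the model name below is one repository entry, shown for illustration):

```wolfram
(* download a small pre-trained convolutional network for digit recognition *)
lenet = NetModel["LeNet Trained on MNIST Data"]
```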

Net Representation

NetGraph — symbolic representation of trained or untrained net graphs to be applied to data

NetChain — symbolic representation of a simple chain of net layers

NetPort — symbolic representation of a named input or output port for a layer

NetExtract — extract properties and weights etc. from nets

Information — give summary and detailed information about any net
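The two container constructors can be sketched as follows; the layer sizes here are arbitrary illustrative choices:

```wolfram
(* a linear chain of layers *)
chain = NetChain[{LinearLayer[16], Ramp, LinearLayer[2], SoftmaxLayer[]},
  "Input" -> 4];

(* the same topology expressed as an explicit graph of edges *)
graph = NetGraph[{LinearLayer[16], Ramp, LinearLayer[2], SoftmaxLayer[]},
  {1 -> 2, 2 -> 3, 3 -> 4}, "Input" -> 4];
```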

Net Operations

NetTrain — train parameters in a net from examples

NetInitialize — randomly initialize parameters for a net

NetPortGradient — differentiate a net with respect to a port

NetStateObject — store and reuse recurrent state in a net

NetTrainResultsObject — represent what happened in net training

NetMeasurements — measure the performance of a net on test data

NetEvaluationMode ▪ TargetDevice
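A minimal NetTrain round trip, with toy regression data chosen for illustration:

```wolfram
(* toy data sampling the function y = 2x *)
data = {{1.} -> {2.}, {2.} -> {4.}, {3.} -> {6.}};

net = NetChain[{LinearLayer[1]}, "Input" -> 1];
trained = NetTrain[net, data];

(* evaluate the trained net on a new input *)
trained[{4.}]
```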

Basic Layers

LinearLayer — trainable layer with dense connections computing w.x+b

ElementwiseLayer — apply a specified function to each element in a tensor

SoftmaxLayer — layer globally normalizing elements to the unit interval
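Layers without learnable parameters can be applied to data directly; a minimal sketch:

```wolfram
(* apply Tanh to every element of the input *)
layer = ElementwiseLayer[Tanh];
layer[{-1., 0., 1.}]
```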

Elementwise Computation Layers

ElementwiseLayer ▪ ParametricRampLayer ▪ ThreadingLayer ▪ ConstantTimesLayer ▪ ConstantPlusLayer

Structure Manipulation Layers

CatenateLayer ▪ PrependLayer ▪ AppendLayer ▪ FlattenLayer ▪ ReshapeLayer ▪ ReplicateLayer ▪ PaddingLayer ▪ PartLayer ▪ TransposeLayer ▪ ExtractLayer

Array Operation Layers

ConstantArrayLayer — embed a learned constant array into a NetGraph

SummationLayer ▪ TotalLayer ▪ AggregationLayer ▪ DotLayer ▪ OrderingLayer

Convolutional and Filtering Layers

ConvolutionLayer ▪ DeconvolutionLayer ▪ PoolingLayer ▪ ResizeLayer ▪ SpatialTransformationLayer
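A sketch of ConvolutionLayer usage; the channel and kernel sizes are illustrative, and NetInitialize supplies random weights:

```wolfram
(* 16 output channels, 3x3 kernels, on a 3-channel 32x32 input *)
conv = NetInitialize@ConvolutionLayer[16, {3, 3}, "Input" -> {3, 32, 32}];
Dimensions[conv[RandomReal[1, {3, 32, 32}]]]
```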

Recurrent Layers

BasicRecurrentLayer ▪ GatedRecurrentLayer ▪ LongShortTermMemoryLayer

Sequence-Handling Layers

UnitVectorLayer — embed integers into one-hot vectors

EmbeddingLayer — embed integers into trainable vector spaces

AttentionLayer — trainable layer for finding parts of a sequence to attend to

SequenceLastLayer ▪ SequenceReverseLayer ▪ SequenceMostLayer ▪ SequenceRestLayer ▪ AppendLayer ▪ PrependLayer
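For example, EmbeddingLayer maps integer codes to learned vectors (sizes illustrative):

```wolfram
(* 8-dimensional embeddings for a vocabulary of 100 integers *)
emb = NetInitialize@EmbeddingLayer[8, 100];

(* a length-3 sequence of integers becomes a sequence of 8-vectors *)
Dimensions[emb[{5, 17, 42}]]
```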

Training Optimization Layers

DropoutLayer ▪ ImageAugmentationLayer

BatchNormalizationLayer ▪ NormalizationLayer ▪ LocalResponseNormalizationLayer
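DropoutLayer is active only during training; at ordinary evaluation time it passes its input through unchanged:

```wolfram
drop = DropoutLayer[0.5];

(* identity at evaluation time; dropout applies when NetEvaluationMode -> "Train" *)
drop[{1., 2., 3., 4.}]
```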

Loss Layers

CrossEntropyLossLayer ▪ ContrastiveLossLayer ▪ CTCLossLayer

MeanSquaredLossLayer ▪ MeanAbsoluteLossLayer
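Loss layers compare an input port against a target port; a minimal sketch:

```wolfram
loss = MeanSquaredLossLayer[];

(* mean of the squared differences between input and target *)
loss[<|"Input" -> {1., 2.}, "Target" -> {1.5, 2.5}|>]
```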

Higher-Order Network Construction

NetMapOperator — map over a sequence

NetMapThreadOperator — map over multiple sequences

NetFoldOperator — recurrent network that folds in elements of a sequence

NetBidirectionalOperator — bidirectional recurrent network

NetNestOperator — apply the same operation multiple times
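For example, NetMapOperator applies a fixed net independently to each element of a sequence (sizes illustrative):

```wolfram
(* map a 3 -> 2 linear layer over a length-5 sequence of 3-vectors *)
mapOp = NetMapOperator[NetInitialize@LinearLayer[2, "Input" -> 3]];
Dimensions[mapOp[RandomReal[1, {5, 3}]]]
```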

Network Composition

NetChain — chain composition of net layers

NetGraph — graph of net layers

NetPairEmbeddingOperator — train a Siamese neural network

NetGANOperator — train generative adversarial networks (GAN)

Network Surgery

NetDrop ▪ NetTake ▪ NetAppend ▪ NetPrepend ▪ NetJoin

NetDelete ▪ NetInsert ▪ NetReplace ▪ NetReplacePart

NetFlatten ▪ NetRename
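A sketch of taking a prefix of an existing chain:

```wolfram
chain = NetChain[{LinearLayer[16], Ramp, LinearLayer[2], SoftmaxLayer[]},
  "Input" -> 4];

(* keep only the first two layers *)
NetTake[chain, 2]
```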

Weight Sharing

NetSharedArray — represent an array shared between several layers

NetInsertSharedArrays — convert all arrays in a net into shared arrays

Encoding & Decoding

NetEncoder — convert images, categories, etc. to net-compatible numerical arrays

"Audio" ▪ "AudioMelSpectrogram" ▪ "AudioMFCC" ▪ "AudioSpectrogram" ▪ "AudioSTFT" ▪ "Boolean" ▪ "Characters" ▪ "Class" ▪ "Function" ▪ "Image" ▪ "Image3D" ▪ "Tokens" ▫ "BPESubwordTokens" ▫ "UTF8"

NetDecoder — interpret net-generated numerical arrays as images, probabilities, etc.

"Boolean" ▪ "Characters" ▪ "Class" ▪ "CTCBeamSearch" ▪ "Image" ▪ "Function" ▪ "Image3D" ▪ "Tokens" ▫ "BPESubwordTokens"

Activation Functions

Ramp — rectified linear (ReLU)

ParametricRampLayer — parametric and leaky rectified linear (ReLU)

Tanh ▪ LogisticSigmoid ▪ Exp ▪ Log ▪ Sin ▪ Cos ▪ Sqrt ▪ Abs

Importing & Exporting

"WLNet" — Wolfram Language Net representation format

"MXNet" — MXNet net representation format

Import ▪ Export
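Round-tripping a net through the "WLNet" format (file name illustrative):

```wolfram
net = NetInitialize@NetChain[{LinearLayer[2], SoftmaxLayer[]}, "Input" -> 3];

Export["net.wlnet", net];
sameNet = Import["net.wlnet"]
```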

Managing Data & Training

NetMeasurements — measure the performance of a net on test data

BatchSize ▪ LearningRate ▪ LossFunction ▪ NetEvaluationMode ▪ RandomSeeding ▪ TargetDevice ▪ ValidationSet

TrainingProgressFunction ▪ TrainingProgressCheckpointing ▪ TrainingProgressReporting ▪ TrainingProgressMeasurements ▪ TrainingStoppingCriterion

LearningRateMultipliers — specify learning rate multiplier for subparts of a net

TrainingUpdateSchedule — control which subparts of a net are updated at each iteration of the training

DeleteMissing — remove missing data before training
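A sketch combining several of the training options above; the option values and toy data are illustrative:

```wolfram
(* toy data sampling the function y = 2x + 1 *)
data = Table[{x} -> {2 x + 1.}, {x, RandomReal[1, 200]}];
net = NetChain[{LinearLayer[1]}, "Input" -> 1];

trained = NetTrain[net, data,
  ValidationSet -> Scaled[0.2],
  BatchSize -> 32,
  MaxTrainingRounds -> 20]
```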

Reinforcement Learning Environments

"OpenAIGym", ... — access to video games and many other test environments