Building The Network

So now that we have finally cleaned up our data, we are ready to dive into the good stuff! Neural networks are an exciting area of machine learning and have grown massively over the past several years. However, they can also be quite complicated and involve a lot of maths. This notebook will illustrate the building of the network and the theory behind it, but won't go into too much detail on the low-level maths. For more information on the maths side of things, check out 3Blue1Brown's great video.

Below is the full project code. After the code is run and the results are displayed, the notebook will explain each of the sections.

What Is A Neural Network?

Like the name suggests, neural networks are designed after the architecture of our brains. Electrical signals are sent through layers of interconnected neurons in our brains and eventually form an output that our brain can understand and act on. Just like our brain's architecture, our network has several layers made up of perceptron nodes (neurons).

Each of these nodes simply holds a number. These numbers are multiplied by the weights connecting the nodes to the next layer. The data is fed into the first layer of the network, which has a node for every feature in the data set. These inputs are passed through each layer of the network with that layer's weights applied to them. Once the inputs make it through the entire network we are left with our output. In this case we have a single output node that will be either 1 or 0 (malignant or benign). This process is known as the forward pass.

The next step is the backward pass. This is where the learning actually occurs. We find out how well our network predicted the outcome, then move backward through the layers and update the weights in order to improve our predictions. Nodes that are considered more important will end up having a higher weight value.
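The forward pass described above can be sketched in a few lines of NumPy. The shapes and variable names here are illustrative assumptions (30 input features, a 12-node hidden layer, 1 output node, sigmoid activation), not the project's exact code:

```python
import numpy as np

# Sigmoid squashes each node's value into the range (0, 1)
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# One sample with 30 features flows through a 12-node hidden layer
# to a single output node. Weights are random for the sketch.
rng = np.random.default_rng(0)
x = rng.random(30)                      # input layer: one node per feature
w_hidden = rng.random((30, 12)) - 0.5   # weights: input -> hidden
w_output = rng.random((12, 1)) - 0.5    # weights: hidden -> output

hidden = sigmoid(x @ w_hidden)          # apply weights, then activation
output = sigmoid(hidden @ w_output)     # prediction in (0, 1)
```

Thresholding `output` at 0.5 then gives the final 0 or 1 class label.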

Building Blocks

Above is a very brief description of what is going on in the neural network. Now we are going to look at how to build it. The way I like to think of implementing a neural network is to split it up into several logical sections or building blocks. These are as follows:

1. Network Initialisation
2. Forward Pass
3. Backward Pass (Backpropagation)
4. Update Weights
5. Testing & Evaluation

1 — Initialisation

The first thing we need to do is initialise our network. Here we will determine the structure of the network, including the number of nodes for each layer (input, hidden and output), the learning rate, the number of epochs and the initial random weights for each of our layers.

Input Nodes

This is the number of features for each entry in our data set, in this case it will be 30.

Hidden Nodes

This is the number of nodes in our hidden layer. For this project I use 12 hidden nodes. This is an arbitrary number that I decided upon through experimenting with different values.

Output Nodes

This is simply how many outputs our network will have. In this case we are carrying out binary classification, so our network's output can be either 0 or 1. Because of this we will only use 1 node. If we were building a classifier with more than 2 classes (such as classifying different types of animals) we would have several output nodes.

Learning Rate

The learning rate determines how much we adjust our weights during the backward pass. A high value will let our network make progress quickly, but as learning progresses we might adjust our weights too much and increase the error instead of decreasing it. A low learning rate will learn gradually and be able to generalise better, but will take longer to converge and requires more training epochs. A learning rate that isn't too high or too low is recommended to begin with, and you can then adjust it based on the needs of the network. For this example I decided on a learning rate of 0.1, which is generally a good starting point.
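As a concrete illustration of how the learning rate scales an update, here is the standard gradient-descent rule applied to a single weight. The weight and gradient values are made up purely for this example:

```python
# Gradient-descent update: the learning rate scales how far each
# weight moves against its gradient.
learning_rate = 0.1

weight = 0.8
gradient = 0.5   # hypothetical gradient of the error w.r.t. this weight

# Step against the gradient, scaled by the learning rate
weight = weight - learning_rate * gradient
print(weight)    # 0.75 -- a small, controlled step
```

With a learning rate of 1.0 the same gradient would have moved the weight all the way to 0.3, a much larger and potentially destabilising jump.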

Epochs

The number of epochs determines how many times we will train our network on the data set; each epoch is one full pass over the training data.
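Putting the values from this section together, a minimal initialisation might look like the following sketch. The variable names, the uniform weight range and the epoch count of 100 are my own assumptions for illustration, not the project's exact choices:

```python
import numpy as np

# Hyperparameters from the sections above
n_input = 30         # one node per feature in the data set
n_hidden = 12        # chosen through experimentation
n_output = 1         # single node for binary classification
learning_rate = 0.1
epochs = 100         # illustrative value; tune for your data

# Small random initial weights centred on zero (one common scheme;
# the exact initialisation used by a given project may differ)
rng = np.random.default_rng(42)
w_input_hidden = rng.uniform(-0.5, 0.5, size=(n_input, n_hidden))
w_hidden_output = rng.uniform(-0.5, 0.5, size=(n_hidden, n_output))
```

Starting the weights small and centred on zero keeps the sigmoid activations away from their flat saturated regions, which helps the early epochs of training make progress.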