DEFINITION

Building a simple neural network to solve the XOR function is a common exercise, often required in coursework. So, I have given some examples and some basic neural networks used to solve the problem more easily, and there is a bonus program for you too.

SAMPLE

A network with one hidden layer containing two neurons should be enough to separate the XOR problem. Follow these steps:

The first neuron acts as an OR gate and the second one as a NAND gate. Sum the outputs of both neurons, and if the sum passes the threshold, the result is positive. You can use plain linear decision neurons for this, adjusting the biases to set the thresholds. The input weights of the NAND gate should be negative for the 0/1 inputs. The picture should make this clearer: the values on the connections are the weights, the values in the neurons are the biases, and the decision functions act as 0/1 decisions (the sign function also works in this case).
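The construction above can be sketched directly in code. This is a minimal hand-wired version (the function names and the specific weight/bias values are illustrative choices, not the only ones that work): one hidden neuron computes OR, the other computes NAND, and the output neuron ANDs them together using a 0/1 step decision.

```python
def step(x):
    """0/1 decision function: fires when the weighted sum passes 0."""
    return 1 if x >= 0 else 0

def xor(a, b):
    # Hidden neuron 1: OR gate (fires when a + b >= 0.5)
    or_node = step(a + b - 0.5)
    # Hidden neuron 2: NAND gate (note the negative input weights)
    nand_node = step(-a - b + 1.5)
    # Output neuron: AND of the two hidden neurons
    return step(or_node + nand_node - 1.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor(a, b))
```

Running the loop prints the full XOR truth table: 0 only when the inputs are equal.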

NOTE

If you are using basic gradient descent (with no other optimization, such as momentum) and a minimal network (2 inputs, 2 hidden neurons, 1 output neuron), then it is definitely possible to train it to learn XOR, but training can be quite tricky and unreliable.
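The note above can be sketched as follows. This is a minimal plain-gradient-descent trainer for the 2-2-1 sigmoid network; the learning rate, epoch count, and random seed are illustrative assumptions, and, as the note warns, convergence is not guaranteed from every random start.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
T = [0, 1, 1, 0]  # XOR targets

# Random initial weights: hidden layer (2 neurons, 2 inputs + bias each),
# output layer (2 inputs + bias).
w_h = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b_h = [random.uniform(-1, 1) for _ in range(2)]
w_o = [random.uniform(-1, 1) for _ in range(2)]
b_o = random.uniform(-1, 1)

lr = 0.5  # learning rate (illustrative)
for epoch in range(20000):
    for (x1, x2), t in zip(X, T):
        # Forward pass
        h = [sigmoid(w_h[j][0] * x1 + w_h[j][1] * x2 + b_h[j]) for j in range(2)]
        y = sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + b_o)
        # Backward pass (squared-error loss, sigmoid derivative y*(1-y))
        d_y = (y - t) * y * (1 - y)
        d_h = [d_y * w_o[j] * h[j] * (1 - h[j]) for j in range(2)]
        for j in range(2):
            w_o[j] -= lr * d_y * h[j]
            w_h[j][0] -= lr * d_h[j] * x1
            w_h[j][1] -= lr * d_h[j] * x2
            b_h[j] -= lr * d_h[j]
        b_o -= lr * d_y

for (x1, x2), t in zip(X, T):
    h = [sigmoid(w_h[j][0] * x1 + w_h[j][1] * x2 + b_h[j]) for j in range(2)]
    y = sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + b_o)
    print(x1, x2, "->", round(y, 2), "(target", t, ")")
```

With an unlucky initialization the outputs may settle near 0.5 for some inputs rather than splitting cleanly; that is exactly the "tricky and unreliable" behaviour described above.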

NEURAL NETS USED (ADDITIONAL)

Back Propagation Solution Network

Because of the nature of the activation function, the activity on the output node can never reach exactly 0 or 1. We therefore treat values less than 0.1 as equal to 0, and values greater than 0.9 as equal to 1.

If the network seems to be stuck, it has hit what is called a 'local minimum'. Keep an eye on the bias of the hidden node and wait: it will eventually head towards zero, and as it approaches zero the network will escape the local minimum and shortly complete. This happens because of a 'momentum term' used in the calculation of the weight updates.
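The momentum term mentioned above can be sketched like this (the function name and the learning-rate/momentum values are illustrative assumptions): each weight update adds a fraction of the previous update, so repeated pushes in the same direction build up speed, which is what eventually carries the network out of a shallow local minimum.

```python
def momentum_step(weight, gradient, prev_update, lr=0.5, mu=0.9):
    """One gradient-descent step with a momentum term.

    update = -lr * gradient + mu * previous_update
    """
    update = -lr * gradient + mu * prev_update
    return weight + update, update

# Three identical gradient pushes: the steps grow because of momentum.
w, prev = 0.0, 0.0
for grad in [1.0, 1.0, 1.0]:
    w, prev = momentum_step(w, grad, prev)
print(round(w, 3))  # -> -2.805 (steps of -0.5, -0.95, -1.355)
```

Without the `mu * prev_update` term the three steps would each be -0.5, ending at -1.5; momentum accumulates them into larger moves.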

Conditional Back propagation Network

This network can learn any logical relationship expressible as a truth table of this sort. In the following, you can change the desired output and train the network to produce it.
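The point above can be made concrete: over two binary inputs, a "truth table of this sort" is just a 4-entry target vector, so switching gates means switching targets and retraining the same network. The gate names below are the standard ones; the dictionary layout is an illustrative choice.

```python
INPUTS = [(0, 0), (0, 1), (1, 0), (1, 1)]

# Each gate is a 4-entry target vector over the inputs above.
TRUTH_TABLES = {
    "AND":  [0, 0, 0, 1],
    "OR":   [0, 1, 1, 1],
    "XOR":  [0, 1, 1, 0],
    "NAND": [1, 1, 1, 0],
}

for name, targets in TRUTH_TABLES.items():
    print(name, dict(zip(INPUTS, targets)))
```

To train for a different gate, you would feed the corresponding target vector to the same backpropagation loop in place of the XOR targets.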

Back propagation for Any Binary Logical Function

This network makes use of binary values and needs fewer iterative steps, so it is handy and reaches a solution more quickly.

MLP (Multi-Layered Perceptron)

This is a more complex network that uses a larger number of basic nodes, and here the feature value is set.

XOR solved by MLP (Multi-Layered Perceptron)

Feature detector: two ones