
Nt1310 Unit 4 Paper


output_k = Σ_i x_i w_ik   (4.10)

where x_i is input i and w_ik is the weight connecting input i to neuron k. Set all weights to small random values, positive and negative, usually in the range −1 to +1. Apply one training sample to the input layer, that is x_0 ... x_(N−1), and note the corresponding desired output vector y_0 ... y_(M−1). Calculate the outputs from the first hidden layer:

x¹_k = f(Σ_i x_i w_ik)  for each neuron k, 0 ≤ k ≤ P−1   (4.11)

where P is the number of neurons in the first hidden layer and f is the activation function. The outputs x¹_k are fed to the next hidden layer:

x²_k = f(Σ_i x¹_i w_ik)  for each neuron k, 0 ≤ k ≤ Q−1   (4.12)

where Q is the number of neurons in the second hidden layer. These outputs are fed to the output layer, and the actual outputs of the output layer are then compared with the desired outputs from equation (4.11).
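The forward pass described by equations (4.10)–(4.12) can be sketched as follows. This is a minimal illustration, assuming a sigmoid activation f and hypothetical layer sizes N, P, Q, M; none of these specific values come from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_weights(n_in, n_out):
    # Small random weights in the range -1 to +1, as the text describes
    return rng.uniform(-1.0, 1.0, size=(n_in, n_out))

def sigmoid(z):
    # Assumed activation function f; the text does not name one here
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical sizes: N inputs, P and Q hidden neurons, M outputs
N, P, Q, M = 4, 5, 3, 2
W1, W2, W3 = init_weights(N, P), init_weights(P, Q), init_weights(Q, M)

x = rng.uniform(size=N)       # one training sample x_0 ... x_(N-1)
x1 = sigmoid(x @ W1)          # first hidden layer, Eq. (4.11)
x2 = sigmoid(x1 @ W2)         # second hidden layer, Eq. (4.12)
y_hat = sigmoid(x2 @ W3)      # output layer, compared with desired y
```

Each matrix product computes the weighted sums Σ_i x_i w_ik for all neurons k of a layer at once.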

The bias weight is updated in the same way as the other weights, using δ_k with the input x_i fixed at 1 (Sivanandam et al., 2011; Kevin et al., 2007). The following steps are used to design the back-propagation neural network algorithm for the proposed research work. The first step is to set the input and output data sets. The second step is to set the number of hidden layers and the output activation functions. The third step is to set the training functions and training parameters, and finally to run the network. The neural network is composed of an input layer of 12 neurons representing the X values, an output layer of 2 neurons representing the Y values, and a hidden layer of 10 neurons.
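The training steps above, including the bias treated as an ordinary weight whose input is fixed at 1, can be sketched for the stated 12-10-2 architecture. The target vector, learning rate, and sample values here are hypothetical placeholders, not taken from the research work.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def add_bias(x):
    # The bias is an extra weight whose input is always 1
    return np.concatenate([x, [1.0]])

# 12 inputs -> 10 hidden neurons -> 2 outputs, as stated in the text
W1 = rng.uniform(-1, 1, size=(13, 10))   # +1 row for the bias weight
W2 = rng.uniform(-1, 1, size=(11, 2))

x = rng.uniform(size=12)                 # hypothetical input sample
y = np.array([0.0, 1.0])                 # hypothetical desired output
eta = 0.5                                # assumed learning rate

for _ in range(1000):
    # Forward pass through the single hidden layer
    h = sigmoid(add_bias(x) @ W1)
    y_hat = sigmoid(add_bias(h) @ W2)

    # Backward pass: delta terms for the output and hidden layers
    delta_out = (y - y_hat) * y_hat * (1 - y_hat)
    delta_hid = (W2[:-1] @ delta_out) * h * (1 - h)

    # Weight updates; the bias row is updated with input 1 like any other
    W2 += eta * np.outer(add_bias(h), delta_out)
    W1 += eta * np.outer(add_bias(x), delta_hid)
```

Because `add_bias` appends a constant 1, the last row of each weight matrix receives the update η·δ_k·1, which is exactly the bias update rule described in the text.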
