
Source Code for Module Bio.NeuralNetwork.BackPropagation.Network

"""Represent Neural Networks.

This module contains classes to represent Generic Neural Networks that
can be trained.

Many of the ideas in this and other modules were taken from
Neil Schemenauer's bpnn.py, available from:

http://www.enme.ucalgary.ca/~nascheme/python/bpnn.py

My sincerest thanks to him for making this available for me to work from,
and my apologies for anything I mangled.
"""
# standard library
import math

class BasicNetwork:
    """Represent a Basic Neural Network with three layers.

    This deals with a Neural Network containing three layers:

    o Input Layer

    o Hidden Layer

    o Output Layer
    """
    def __init__(self, input_layer, hidden_layer, output_layer):
        """Initialize the network with the three layers."""
        self._input = input_layer
        self._hidden = hidden_layer
        self._output = output_layer
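The layer objects themselves are defined elsewhere in the package (presumably Bio.NeuralNetwork.BackPropagation.Layer). Judging only from the calls made in this module, the input layer must expose update() and backpropagate(), and the output layer must expose get_error() and a values mapping keyed by node number. A minimal sketch of that assumed interface, using hypothetical stub classes:

```python
class StubOutputLayer:
    """Hypothetical stand-in for the real output layer class."""
    def __init__(self):
        self.values = {}

    def get_error(self, real_value, node):
        # half the squared difference, matching the error
        # accumulated in BasicNetwork.train()
        predicted_value = self.values[node]
        return 0.5 * (real_value - predicted_value) ** 2


class StubInputLayer:
    """Hypothetical stand-in for the real input layer class."""
    def __init__(self, output_layer):
        self._output = output_layer

    def update(self, inputs):
        # A real input layer feeds activations forward through the
        # hidden layer; this stub just copies the inputs straight into
        # the output layer's values, keyed from 1 (train() passes
        # node + 1 to get_error, so node numbering starts at 1).
        for position, value in enumerate(inputs):
            self._output.values[position + 1] = value

    def backpropagate(self, outputs, learning_rate, momentum):
        # A real layer would adjust its weights here.
        pass


output_layer = StubOutputLayer()
input_layer = StubInputLayer(output_layer)
input_layer.update([0.25, 0.75])
```

This is only an interface sketch under the assumptions above, not the real layer implementation.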

    def train(self, training_examples, validation_examples,
              stopping_criteria, learning_rate, momentum):
        """Train the neural network to recognize particular examples.

        Arguments:

        o training_examples -- A list of TrainingExample classes that will
        be used to train the network.

        o validation_examples -- A list of TrainingExample classes that
        are used to validate the network as it is trained. These examples
        are not used for training, so they provide an independent way of
        checking how the training is doing. Normally, when the error
        from these examples starts to rise, it is time to stop training.

        o stopping_criteria -- A function that, when passed the number of
        iterations, the training error, and the validation error, will
        determine when to stop learning.

        o learning_rate -- The learning rate of the neural network.

        o momentum -- The momentum of the network, which describes how much
        of the previous weight change to use.
        """
        num_iterations = 0
        while True:
            num_iterations += 1
            training_error = 0.0
            for example in training_examples:
                # update the predicted values for all of the nodes
                # based on the current weights and the inputs.
                # This propagates over the entire network from the input.
                self._input.update(example.inputs)

                # calculate the error via back propagation
                self._input.backpropagate(example.outputs,
                                          learning_rate, momentum)

                # get the errors in our predictions
                for node in range(len(example.outputs)):
                    training_error += \
                        self._output.get_error(example.outputs[node],
                                               node + 1)

            # get the current testing error for the validation examples
            validation_error = 0.0
            for example in validation_examples:
                predictions = self.predict(example.inputs)

                for prediction_num in range(len(predictions)):
                    real_value = example.outputs[prediction_num]
                    predicted_value = predictions[prediction_num]
                    validation_error += \
                        0.5 * math.pow((real_value - predicted_value), 2)

            # see if we have gone far enough to stop
            if stopping_criteria(num_iterations, training_error,
                                 validation_error):
                break
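The stopping_criteria argument is any callable that takes (num_iterations, training_error, validation_error) and returns a true value when training should halt. A minimal sketch with illustrative thresholds (the cutoff values below are not from the source):

```python
def make_stopping_criteria(max_iterations=1000, target_error=0.1):
    """Build a stopping_criteria callable for BasicNetwork.train().

    Stops once the training error falls below target_error or after
    max_iterations passes; both cutoffs are illustrative defaults.
    """
    def stopping_criteria(num_iterations, training_error, validation_error):
        if num_iterations >= max_iterations:
            return True
        return training_error < target_error

    return stopping_criteria
```

In practice the validation_error is the value to watch: a criterion that remembers the best validation error seen so far and stops once it starts climbing implements the early stopping the docstring describes.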

    def predict(self, inputs):
        """Predict outputs from the neural network with the given inputs.

        This uses the current neural network to predict outputs; no
        training of the neural network is done here.
        """
        # update the predicted values for these inputs
        self._input.update(inputs)

        # collect the output values in node order
        output_keys = sorted(self._output.values.keys())

        outputs = []
        for output_key in output_keys:
            outputs.append(self._output.values[output_key])
        return outputs
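The error accumulated over the validation examples is half the squared difference, summed across every output node of every example. Pulled out as a standalone helper (a hypothetical function, not part of the module):

```python
import math


def half_sum_squared_error(real_values, predicted_values):
    """Half the sum of squared differences, as accumulated in train()."""
    error = 0.0
    for real_value, predicted_value in zip(real_values, predicted_values):
        error += 0.5 * math.pow(real_value - predicted_value, 2)
    return error


# e.g. half_sum_squared_error([1.0, 0.0], [0.5, 0.5]) gives
# 0.5 * 0.25 + 0.5 * 0.25 = 0.25
```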