Class FeedForwardNetwork
- All Implemented Interfaces:
Serializable
A Network contains an InputLayer, an
OutputLayer and zero or more HiddenLayers. The
InputLayer and OutputLayer are
automatically created by the Network constructor. The
InputNodes are added using the
getInputLayer().createInputs(nInputs) method. Output Perceptrons
are added using the getOutputLayer().createPerceptrons(nOutputs), and HiddenLayers
can be created using the createHiddenLayer().createPerceptrons(nPerceptrons) method.
The InputLayer contains InputNodes. The
HiddenLayers and OutputLayers contain
Perceptron nodes. These Nodes are created
using factory methods in the Layers.
The Network also contains Links between
Nodes. Links are created by methods in this class.
Each Link has a weight and gradient value.
Each Perceptron node has a bias value. When the
Network is trained, the weight and bias values are
used as initial guesses. After the Network is trained the
weight, gradient and bias values are
set to the values computed by the training.
A feed forward network is a network in which links are only allowed from one layer to a following layer.
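The following sketch builds a small network using the factory methods described above: three inputs, one hidden layer of four perceptrons, and two outputs (the layer sizes are arbitrary and chosen only for illustration).

import com.imsl.datamining.neural.FeedForwardNetwork;
import com.imsl.datamining.neural.HiddenLayer;

public class FeedForwardNetworkExample {
    public static void main(String[] args) {
        FeedForwardNetwork network = new FeedForwardNetwork();

        // Add InputNodes to the automatically created InputLayer.
        network.getInputLayer().createInputs(3);

        // Create a HiddenLayer and populate it with Perceptrons.
        HiddenLayer hidden = network.createHiddenLayer();
        hidden.createPerceptrons(4);

        // Add output Perceptrons to the automatically created OutputLayer.
        network.getOutputLayer().createPerceptrons(2);

        // Link each Node in every Layer to each Node in the next Layer.
        network.linkAll();

        System.out.println("Links created: " + network.getNumberOfLinks());
    }
}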
Constructor Summary
Constructor | Description
FeedForwardNetwork() | Creates a new instance of FeedForwardNetwork.
Method Summary
Modifier and Type | Method | Description
HiddenLayer | createHiddenLayer() | Creates a HiddenLayer.
Link | findLink(Node from, Node to) | Returns the Link between two Nodes.
Link[] | findLinks(Node to) | Returns all of the Links to a given Node.
double[] | forecast(double[] x) | Computes a forecast using the Network.
double[][] | getForecastGradient(double[] xData) | Returns the derivatives of the outputs with respect to the weights.
HiddenLayer[] | getHiddenLayers() | Returns the HiddenLayers in this network.
InputLayer | getInputLayer() | Returns the InputLayer.
Link[] | getLinks() | Returns all of the Links in this Network.
int | getNumberOfInputs() | Returns the number of inputs to the Network.
int | getNumberOfLinks() | Returns the number of Links in the Network.
int | getNumberOfOutputs() | Returns the number of outputs from the Network.
int | getNumberOfWeights() | Returns the number of weights in the Network.
OutputLayer | getOutputLayer() | Returns the OutputLayer.
Perceptron[] | getPerceptrons() | Returns the Perceptrons in this Network.
double[] | getWeights() | Returns the weights for the Links in this network.
Link | link(Node from, Node to) | Establishes a Link between two Nodes.
Link | link(Node from, Node to, double weight) | Establishes a Link between two Nodes with a specified weight.
void | linkAll() | For each Layer in the Network, links each Node in the Layer to each Node in the next Layer.
void | linkAll(Layer from, Layer to) | Links all of the Nodes in one Layer to all of the Nodes in another Layer.
void | remove(Link link) | Removes a Link from the network.
void | setEqualWeights(double[][] xData) | Initializes network weights using equal weighting.
void | setRandomWeights(double[][] xData, Random random) | Initializes network weights using random weights.
void | setWeights(double[] weights) | Sets the weights for the Links in this Network.
protected void | validateLink(Node from, Node to) | Checks that a Link between two Nodes is valid.

Methods inherited from class com.imsl.datamining.neural.Network:
computeStatistics
-
Constructor Details
-
FeedForwardNetwork
public FeedForwardNetwork()
Creates a new instance of FeedForwardNetwork.
-
-
Method Details
-
getInputLayer
Returns the InputLayer.
- Specified by: getInputLayer in class Network
- Returns: The neural network InputLayer.
-
getOutputLayer
Returns the OutputLayer.
- Specified by: getOutputLayer in class Network
- Returns: The neural network OutputLayer.
-
createHiddenLayer
Creates a HiddenLayer.
- Specified by: createHiddenLayer in class Network
- Returns: A HiddenLayer object which specifies a neural network hidden layer.
-
getHiddenLayers
Returns the HiddenLayers in this network.
- Returns: An array of the HiddenLayers in this network.
-
link
Establishes a Link between two Nodes. Any existing Link between these Nodes is removed.
- Parameters: from - The origination Node. to - The destination Node.
- Returns: A Link between the two Nodes.
-
link
Establishes a Link between two Nodes with a specified weight.
- Parameters: from - The origination Node. to - The destination Node. weight - A double which specifies the weight to be given the Link.
- Returns: A Link between the two Nodes.
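As an alternative to linkAll(), Links can also be created one at a time with these two methods. The fragment below is a hedged sketch: it assumes that createInputs and createPerceptrons return the Nodes they create as arrays (a detail not documented on this page), so the variable types InputNode[] and Perceptron[] are assumptions to be checked against the InputLayer and OutputLayer documentation.

// Minimal two-layer network, linked node by node.
FeedForwardNetwork net = new FeedForwardNetwork();
InputNode[] in = net.getInputLayer().createInputs(2);          // assumed return type
Perceptron[] out = net.getOutputLayer().createPerceptrons(1);  // assumed return type

Link plain = net.link(in[0], out[0]);          // Link with the default initial weight
Link weighted = net.link(in[1], out[0], 0.5);  // Link with an explicit initial weight of 0.5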
-
linkAll
Links all of the Nodes in one Layer to all of the Nodes in another Layer.
- Parameters: from - The origination Layer. to - The destination Layer.
-
linkAll
public void linkAll()
For each Layer in the Network, links each Node in the Layer to each Node in the next Layer.
-
getLinks
Returns all of the Links in this Network.
-
findLinks
Returns all of the Links to a given Node.
- Parameters: to - A Node whose Links are to be determined.
- Returns: An array of Links containing all of the Links to the given Node.
-
findLink
Returns the Link between two Nodes.
- Parameters: from - The origination Node. to - The destination Node.
- Returns: A Link between the two Nodes, or null if no such Link exists.
-
remove
Removes a Link from the network.
- Parameters: link - The Link to be deleted from the network.
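Continuing the fragment above, a short sketch of locating and removing a Link; findLink returns null when no Link exists, so the result should be checked before calling remove.

// Look up the Link from in[0] to out[0]; null means no such Link exists.
Link existing = net.findLink(in[0], out[0]);
if (existing != null) {
    net.remove(existing);            // delete the Link from the network
}

// All remaining Links feeding the output Perceptron.
Link[] incoming = net.findLinks(out[0]);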
-
getNumberOfInputs
public int getNumberOfInputs()
Returns the number of inputs to the Network.
- Specified by: getNumberOfInputs in class Network
- Returns: An int containing the number of inputs to the Network.
-
getNumberOfOutputs
public int getNumberOfOutputs()
Returns the number of outputs from the Network.
- Specified by: getNumberOfOutputs in class Network
- Returns: An int containing the number of outputs from the Network.
-
getNumberOfLinks
public int getNumberOfLinks()
Returns the number of Links in the Network.
- Specified by: getNumberOfLinks in class Network
- Returns: An int which contains the number of Links in the Network.
-
getPerceptrons
Returns the Perceptrons in this Network.
- Specified by: getPerceptrons in class Network
- Returns: An array of the Perceptrons in this network.
-
getWeights
public double[] getWeights()
Returns the weights for the Links in this network.
- Specified by: getWeights in class Network
- Returns: An array of doubles containing the weights. The array contains the weights for each Link followed by the Perceptron bias values. The Link weights appear in the order in which the Links were created. The bias values follow, first for the Perceptrons in the HiddenLayers and then for the Perceptrons in the OutputLayer, each group in the order in which the Perceptrons were created.
-
setWeights
public void setWeights(double[] weights)
Sets the weights for the Links in this Network.
- Specified by: setWeights in class Network
- Parameters: weights - A double array containing the weights in the same order as getWeights().
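A minimal sketch of the getWeights/setWeights round trip, using the network built in the sketch under the class description; the 0.9 scaling is arbitrary and only illustrates that the array passed to setWeights must keep the ordering returned by getWeights.

double[] weights = network.getWeights();   // Link weights first, then bias values
for (int i = 0; i < weights.length; i++) {
    weights[i] *= 0.9;                     // arbitrary adjustment for illustration
}
network.setWeights(weights);               // same ordering as getWeights()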
-
setRandomWeights
Initializes network weights using random weights.
The RandomWeights algorithm assigns equal weights to all Perceptrons, except those in the first Layer connected to the InputLayer. Like the EqualWeights algorithm, Perceptrons not in the first Layer are assigned weights 1/k, where k is the number of inputs connected to that Perceptron.
The weights of the Perceptrons in the first Layer are initially assigned values from the uniform random distribution on the interval [-0.5, +0.5]. These are then scaled using the training patterns. The random weights for a Perceptron are divided by s, the standard deviation of the potential for that Perceptron calculated using the initial random values. Its bias value is set to -avg/s, where avg is the average potential for that Perceptron. This makes the average potential for the Perceptrons in this first Layer approximately 0 and its standard deviation equal to 1.
This reduces the possibility of saturation during network training resulting from very large or small values for the Perceptron's potential. During training, random noise is added to these initial values at each training stage. If the EpochTrainer is used, noise is added to these initial values at the start of each epoch.
- Parameters: xData - An input double matrix containing training patterns. The number of columns in xData must equal the number of Perceptrons in the InputLayer. random - A Random object.
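A hedged usage sketch for the network built under the class description: xData is a made-up training-pattern matrix whose three columns match the three-node InputLayer, and the fixed seed is only for reproducibility of the illustration.

double[][] xData = {
    {0.0, 0.1, 0.2},
    {0.3, 0.4, 0.5},
    {0.6, 0.7, 0.8}
};
network.setRandomWeights(xData, new java.util.Random(123457L));

setEqualWeights(xData) can be called in the same way when a deterministic, equally weighted starting point is preferred.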
-
setEqualWeights
public void setEqualWeights(double[][] xData)
Initializes network weights using equal weighting.
The equal weights approach starts by assigning equal values to the inputs of each Perceptron. If a Perceptron has 4 inputs, then this method starts by assigning the value 1/4 to each of the Perceptron's input weights. The bias weight is initially assigned a value of zero.
The weights for the first Layer of Perceptrons, either the first HiddenLayer if the number of Layers is greater than 1 or the OutputLayer, are scaled using the training patterns. Scaling is accomplished by dividing the initial weights for the first Layer by the standard deviation, s, of the potential for that Perceptron. The bias weight is set to -avg/s, where avg is the average potential for that Perceptron. This makes the average potential for the Perceptrons in this first Layer approximately 0 and its standard deviation equal to 1.
This reduces the possibility of saturation during network training resulting from very large or small values for the Perceptron's potential. During training, random noise is added to these initial values at each training stage. If the EpochTrainer is used, noise is added to these initial values at the start of each epoch.
- Parameters: xData - An input double matrix containing training patterns. The number of columns in xData must equal the number of Perceptrons in the InputLayer.
-
getNumberOfWeights
public int getNumberOfWeights()
Returns the number of weights in the Network.
- Specified by: getNumberOfWeights in class Network
- Returns: An int which contains the number of weights in the Network.
-
validateLink
Checks that a Link between two Nodes is valid.
In a FeedForwardNetwork, a Link must go from a Node in one Layer to a Node in a later Layer. Intermediate Layers can be skipped, but a Link cannot go backward.
- Parameters: from - The origination Node. to - The destination Node.
- Throws: IllegalArgumentException - thrown if the Link is not valid.
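To illustrate the rule, the fragment below reuses the node references from the link example above; it is assumed here that link(...) applies this validation internally, since validateLink itself is protected.

// Valid: a forward Link (intermediate Layers could also be skipped).
net.link(in[0], out[0]);

// Invalid: a backward Link from the OutputLayer to the InputLayer.
try {
    net.link(out[0], in[0]);
} catch (IllegalArgumentException e) {
    System.out.println("Rejected backward Link: " + e.getMessage());
}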
-
forecast
public double[] forecast(double[] x)
Computes a forecast using the Network.
-
getForecastGradient
public double[][] getForecastGradient(double[] xData)
Returns the derivatives of the outputs with respect to the weights.
- Specified by: getForecastGradient in class Network
- Parameters: xData - A double array which specifies the input values at which the gradient is to be evaluated.
- Returns: A double array containing the gradient values. The value of gradient[i][j] is \(dy_i/dw_j\), where \(y_i\) is the i-th output and \(w_j\) is the j-th weight.
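A short sketch of evaluating the network built under the class description: x supplies one value per InputNode, y holds the two outputs, and gradient follows the gradient[i][j] indexing described above.

double[] x = {0.1, 0.2, 0.3};                  // one value per InputNode
double[] y = network.forecast(x);              // the two Network outputs

double[][] gradient = network.getForecastGradient(x);
// gradient[i][j] is dy_i/dw_j for output i and weight j.
System.out.println(java.util.Arrays.toString(y));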
-