Package | Description |
---|---|
com.imsl.datamining.neural | Neural networks. |
Modifier and Type | Field and Description |
---|---|
static Activation | Activation.LINEAR The identity activation function, \(g(x) = x\). |
static Activation | Activation.LOGISTIC The logistic activation function, \(g(x) = \frac{1}{1+e^{-x}}\). |
static Activation | Activation.LOGISTIC_TABLE The logistic activation function computed using a table lookup. |
static Activation | Activation.SOFTMAX The softmax activation function, \(g(x_i) = \frac{e^{x_i}}{\sum_j e^{x_j}}\). |
static Activation | Activation.SQUASH The squash activation function, \(g(x) = \frac{x}{1+\lvert x \rvert}\). |
static Activation | Activation.TANH The hyperbolic tangent activation function, \(g(x) = \tanh x = \frac{e^x - e^{-x}}{e^x + e^{-x}}\). |
Modifier and Type | Method and Description |
---|---|
Activation | Perceptron.getActivation() Returns the activation function. |
Modifier and Type | Method and Description |
---|---|
Perceptron | HiddenLayer.createPerceptron(Activation activation, double bias) Creates a Perceptron in this Layer with the specified activation function and bias. |
Perceptron | OutputLayer.createPerceptron(Activation activation, double bias) Creates a Perceptron in this Layer with the specified activation function and bias. |
Perceptron[] | HiddenLayer.createPerceptrons(int n, Activation activation, double bias) Creates n Perceptrons in this Layer with the specified activation function and bias. |
Perceptron[] | OutputLayer.createPerceptrons(int n, Activation activation, double bias) Creates n Perceptrons in this Layer with the specified activation function and bias. |
void | Perceptron.setActivation(Activation activation) Sets the activation function. |
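To make concrete what the activation and bias arguments above control, here is a minimal standalone model of a perceptron: it applies its activation function g to the bias plus the weighted sum of its inputs. This is an illustrative sketch, not the JMSL `Perceptron` class; all names and the weight handling are hypothetical.

```java
import java.util.function.DoubleUnaryOperator;

// Illustrative model (not the JMSL API): output = g(bias + w·x).
public class PerceptronSketch {
    private DoubleUnaryOperator activation; // e.g. the logistic or tanh function
    private final double bias;
    private final double[] weights;

    PerceptronSketch(DoubleUnaryOperator activation, double bias, double[] weights) {
        this.activation = activation;
        this.bias = bias;
        this.weights = weights;
    }

    // Analogous in spirit to Perceptron.setActivation(Activation)
    void setActivation(DoubleUnaryOperator g) { this.activation = g; }

    double fire(double[] inputs) {
        double z = bias;
        for (int i = 0; i < weights.length; i++) z += weights[i] * inputs[i];
        return activation.applyAsDouble(z); // apply g to the biased weighted sum
    }

    public static void main(String[] args) {
        // Perceptron with tanh activation, bias 0.0, and unit weights
        PerceptronSketch p = new PerceptronSketch(Math::tanh, 0.0, new double[]{1.0, 1.0});
        System.out.println(p.fire(new double[]{0.5, -0.5})); // tanh(0) = 0.0
        // Swap in the identity (LINEAR-style) activation
        p.setActivation(x -> x);
        System.out.println(p.fire(new double[]{0.5, -0.5})); // 0.0
    }
}
```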
Copyright © 2020 Rogue Wave Software. All rights reserved.