public interface Activation extends Serializable

Standard activation functions are defined as static members of this interface. New activation functions can be defined by implementing a method, g(double x), returning the value of the function, and a method, derivative(double x, double y), returning the derivative of g evaluated at x, where y = g(x).
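For example, a user-defined activation can be created by implementing this interface directly. The following is a minimal sketch (ReLU is an illustrative choice, not one of the static members of this interface, and the class name is hypothetical):

```java
// Minimal sketch of a user-defined activation implementing this interface.
// ReLU is an illustrative choice; it is not one of the static members below.
public class ReluActivation implements Activation {

    // Returns the value of the activation function, g(x) = max(0, x).
    @Override
    public double g(double x) {
        return Math.max(0.0, x);
    }

    // Returns g'(x); the precomputed y = g(x) is available but not needed for ReLU.
    @Override
    public double derivative(double x, double y) {
        return x > 0.0 ? 1.0 : 0.0;
    }
}
```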
Modifier and Type | Field and Description
---|---
static Activation | LINEAR The identity activation function, \(g(x) = x\).
static Activation | LOGISTIC The logistic activation function, \(g(x)=\frac{1}{1+e^{-x}}\).
static Activation | LOGISTIC_TABLE The logistic activation function computed using a table.
static Activation | SOFTMAX The softmax activation function.
static Activation | SQUASH The squash activation function, \(g(x) = \frac{x}{1+\lvert x\rvert}\).
static Activation | TANH The hyperbolic tangent activation function, \(g(x)=\tanh{x}=\frac{e^x-e^{-x}}{e^x+e^{-x}}\).
Modifier and Type | Method and Description
---|---
double | derivative(double x, double y) Returns the value of the derivative of the activation function.
double | g(double x) Returns the value of the activation function.
static final Activation LINEAR
static final Activation LOGISTIC
static final Activation LOGISTIC_TABLE
This version of the logistic function differs from the exact version by at most 4.0e-9. Networks trained using this activation should not use Activation.LOGISTIC for forecasting; forecasting should be done using the specific function supplied during training (see the sketch after this field list).
static final Activation TANH
static final Activation SQUASH
static final Activation SOFTMAX
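As a rough illustration of the LOGISTIC_TABLE note above, the following sketch (class name hypothetical) compares the two logistic variants using only the g method documented below; the printed differences should not exceed about 4.0e-9:

```java
// Sketch comparing the exact and table-based logistic activations.
// Per the note above, they agree to within 4.0e-9, but a network should
// forecast with the same variant it was trained with.
public class LogisticTableCheck {
    public static void main(String[] args) {
        for (double x = -5.0; x <= 5.0; x += 0.5) {
            double exact = Activation.LOGISTIC.g(x);
            double table = Activation.LOGISTIC_TABLE.g(x);
            System.out.printf("x = %5.2f   |exact - table| = %.2e%n",
                              x, Math.abs(exact - table));
        }
    }
}
```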
double g(double x)

Parameters:
x - a double specifying the point at which the activation function is to be evaluated.

Returns:
a double containing the value of the activation function at x.
double derivative(double x, double y)

Parameters:
x - a double specifying the point at which the activation function is to be evaluated.
y - a double specifying y = g(x), the value of the activation function at x. This parameter is not mathematically required, but can sometimes be used to compute the derivative more quickly.

Returns:
a double containing the value of the derivative of the activation function at x.
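As a usage sketch (class name hypothetical), the y argument lets an implementation such as LOGISTIC reuse the forward-pass value, since \(g'(x) = y(1-y)\) requires no further exponentials:

```java
// Sketch of a typical forward/backward use of an activation.
public class DerivativeExample {
    public static void main(String[] args) {
        double x = 0.5;
        double y = Activation.LOGISTIC.g(x);                // forward pass: y = g(x)
        double dydx = Activation.LOGISTIC.derivative(x, y); // reuses y, e.g. y * (1 - y)
        System.out.printf("g(%.2f) = %.6f, g'(%.2f) = %.6f%n", x, y, x, dydx);
    }
}
```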
Copyright © 2020 Rogue Wave Software. All rights reserved.