Interface Activation

All Superinterfaces:
Serializable

public interface Activation extends Serializable
Interface implemented by perceptron activation functions.

Standard activation functions are defined as static members of this interface. A new activation function can be defined by implementing a method, g(double x), that returns the value of the function at x, and a method, derivative(double x, double y), that returns the derivative of g at x, where y = g(x).
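
As a minimal sketch of this contract, the class below implements the ELU function, \(g(x)=x\) for \(x>0\) and \(g(x)=e^x-1\) otherwise. The class name EluActivation is hypothetical; only the two method signatures come from this interface. Note how the precomputed y = g(x) lets derivative avoid a second exponential.

    public class EluActivation implements Activation {

        // g(x) = x for x > 0, e^x - 1 otherwise; expm1 is accurate near 0.
        @Override
        public double g(double x) {
            return x > 0.0 ? x : Math.expm1(x);
        }

        // For x > 0 the slope is 1; for x <= 0, g'(x) = e^x = y + 1,
        // so the precomputed y = g(x) saves a second call to exp.
        @Override
        public double derivative(double x, double y) {
            return x > 0.0 ? 1.0 : y + 1.0;
        }
    }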

  • Field Summary

    Fields

    static final Activation LINEAR
    The identity activation function, g(x) = x.

    static final Activation LOGISTIC
    The logistic activation function, \(g(x)=\frac{1}{1+e^{-x}}\).

    static final Activation LOGISTIC_TABLE
    The logistic activation function computed using a table.

    static final Activation SOFTMAX
    The softmax activation function.

    static final Activation SQUASH
    The squash activation function, \(g(x)=\frac{x}{1+|x|}\).

    static final Activation TANH
    The hyperbolic tangent activation function, \(g(x)=\tanh x=\frac{e^x-e^{-x}}{e^x+e^{-x}}\).
  • Method Summary

    double derivative(double x, double y)
    Returns the value of the derivative of the activation function.

    double g(double x)
    Returns the value of the activation function.
  • Field Details

    • LINEAR

      static final Activation LINEAR
      The identity activation function, g(x) = x.
    • LOGISTIC

      static final Activation LOGISTIC
      The logistic activation function, \(g(x)=\frac{1}{1+e^{-x}} \).
    • LOGISTIC_TABLE

      static final Activation LOGISTIC_TABLE
      The logistic activation function computed using a table. This is an approximation to the logistic function that is faster to compute.

      This version of the logistic function differs from the exact version by at most 4.0e-9; a sketch that checks this bound numerically appears at the end of the field details below.

      Networks trained using this activation should not use Activation.LOGISTIC for forecasting. Forecasting should be done with the same activation function that was supplied during training.

    • TANH

      static final Activation TANH
      The hyperbolic tangent activation function, \(g(x)=\tanh x=\frac{e^x-e^{-x}}{e^x+e^{-x}}\).
    • SQUASH

      static final Activation SQUASH
      The squash activation function, \(g(x)=\frac{x}{1+|x|}\).
    • SOFTMAX

      static final Activation SOFTMAX
      The softmax activation function, \(\mathrm{softmax}_i=\frac{e^{Z_i}}{\sum_{j=1}^{C}e^{Z_j}}\).
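
      As a hedged check of the 4.0e-9 bound quoted for LOGISTIC_TABLE above, the following sketch compares the two logistic variants using only the g(double) method declared by this interface. The class name LogisticTableCheck, the sampling range, and the step size are illustrative choices, not part of this API.

        public class LogisticTableCheck {
            public static void main(String[] args) {
                double maxDiff = 0.0;
                // Sample both logistic implementations on a uniform grid.
                for (double x = -10.0; x <= 10.0; x += 1.0e-3) {
                    double exact = Activation.LOGISTIC.g(x);
                    double table = Activation.LOGISTIC_TABLE.g(x);
                    maxDiff = Math.max(maxDiff, Math.abs(exact - table));
                }
                // Per the documentation, this should print a value of at most 4.0e-9.
                System.out.println("max |LOGISTIC - LOGISTIC_TABLE| = " + maxDiff);
            }
        }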
  • Method Details

    • g

      double g(double x)
      Returns the value of the activation function.
      Parameters:
      x - A double which specifies the point at which the activation function is to be evaluated.
      Returns:
      A double containing the value of the activation function at x.
    • derivative

      double derivative(double x, double y)
      Returns the value of the derivative of the activation function.
      Parameters:
      x - A double which specifies the point at which the activation function is to be evaluated.
      y - A double which specifies y = g(x), the value of the activation function at x. This parameter is not mathematically required, but can sometimes be used to more quickly compute the derivative.
      Returns:
      A double containing the value of the derivative of the activation function at x.
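
      A short usage sketch of this contract, assuming the built-in TANH field: the caller evaluates g first and then passes the result back as the y argument of derivative. The class name ActivationExample is illustrative.

        public class ActivationExample {
            public static void main(String[] args) {
                double x = 0.5;
                // Evaluate the activation once.
                double y = Activation.TANH.g(x);
                // Reuse y: for tanh, g'(x) = 1 - tanh(x)^2 can be computed
                // from y alone, which is why derivative accepts it.
                double dy = Activation.TANH.derivative(x, y);
                System.out.println("g(0.5) = " + y + ", g'(0.5) = " + dy);
            }
        }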