Enum Class GradientBoosting.LossFunctionType

java.lang.Object
  java.lang.Enum<GradientBoosting.LossFunctionType>
    com.imsl.datamining.GradientBoosting.LossFunctionType
All Implemented Interfaces:
Serializable, Comparable<GradientBoosting.LossFunctionType>, java.lang.constant.Constable
Enclosing class:
GradientBoosting

public static enum GradientBoosting.LossFunctionType extends Enum<GradientBoosting.LossFunctionType>
The loss function type as specified by the error measure.
  • Enum Constant Details

    • LEAST_SQUARES

      public static final GradientBoosting.LossFunctionType LEAST_SQUARES
      The loss criterion is least squares error. That is, the loss function is

      \( L({y_i,f(x_i)}) = \sum (y_i - f(x_i))^2 \)

    • LEAST_ABSOLUTE_DEVIATION

      public static final GradientBoosting.LossFunctionType LEAST_ABSOLUTE_DEVIATION
      The loss criterion is least absolute deviation error. That is, the loss function is

      \( L({y_i,f(x_i)}) = \sum |y_i - f(x_i)| \)

    • HUBER_M

      public static final GradientBoosting.LossFunctionType HUBER_M
      The loss criterion is the Huber-M loss, a weighted combination of squared error and absolute deviation error with parameter \(\alpha\). That is, the loss function is

      \( L({y_i,f(x_i)}) = \sum \Psi(y_i,f(x_i)) \)

      where

      $$ \Psi(y,f(x)) = \left\{ \begin{array}{ll} (y-f(x))^2 & {\rm for}\;|y-f(x)| \le \delta \\ 2\delta\left(|y-f(x)| - \delta/2\right) & {\rm for}\;|y-f(x)| > \delta \end{array} \right.$$

      and where \(\delta\) is the \(\alpha\)-empirical quantile of the residuals \(\{y_i-f(x_i),\ i=1,\ldots,n\}\).

    • ADABOOST

      public static final GradientBoosting.LossFunctionType ADABOOST
      The loss criterion is the AdaBoost.M1 (exponential) criterion. That is, the loss function is

      \( L({y_i,f(x_i)}) = \sum \exp\left( -(2y_i-1)f(x_i)\right) \)

    • MULTINOMIAL_DEVIANCE

      public static final GradientBoosting.LossFunctionType MULTINOMIAL_DEVIANCE
      The loss criterion is the (K-class) multinomial negative log-likelihood, or multinomial deviance. That is, the loss function is

      \(L({y_i,f(x_i)})=-2 \sum_i \sum_k y_{ik}\log(p_{ik})\)

      where

      \(p_{ik}=\frac{\exp(f_k(x_i))}{\sum_k \exp(f_k(x_i))}\)

    • BERNOULLI

      public static final GradientBoosting.LossFunctionType BERNOULLI
      The loss criterion is the binomial, or Bernoulli, negative log-likelihood, or deviance (see the example following the enum constant details). That is, the loss function is

      \( L({y_i,f(x_i)}) = -2 \sum \left( y_i f(x_i) - \log(1+\exp(f(x_i))) \right) \)

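  • Example: Evaluating the Loss Formulas

    The following standalone sketch evaluates three of the loss formulas above (least squares, Huber-M, and Bernoulli deviance) for a small vector of responses y and fitted values f. It only illustrates the formulas as written; it is not the IMSL implementation, and the class and method names (LossFormulaSketch, leastSquares, huberM, bernoulliDeviance) are hypothetical.

      // Standalone illustration of the loss formulas documented above.
      // Not the IMSL implementation; all names here are illustrative only.
      public class LossFormulaSketch {

          /** Least squares loss: the sum of squared residuals. */
          static double leastSquares(double[] y, double[] f) {
              double sum = 0.0;
              for (int i = 0; i < y.length; i++) {
                  double r = y[i] - f[i];
                  sum += r * r;
              }
              return sum;
          }

          /**
           * Huber-M loss: quadratic for small residuals, linear beyond the threshold delta.
           * Here delta is passed in directly; in the documentation above it is the
           * alpha-empirical quantile of the residuals.
           */
          static double huberM(double[] y, double[] f, double delta) {
              double sum = 0.0;
              for (int i = 0; i < y.length; i++) {
                  double a = Math.abs(y[i] - f[i]);
                  sum += (a <= delta) ? a * a : 2.0 * delta * (a - delta / 2.0);
              }
              return sum;
          }

          /** Bernoulli deviance: -2 * sum( y*f - log(1 + exp(f)) ), with y in {0, 1}. */
          static double bernoulliDeviance(double[] y, double[] f) {
              double sum = 0.0;
              for (int i = 0; i < y.length; i++) {
                  sum += y[i] * f[i] - Math.log(1.0 + Math.exp(f[i]));
              }
              return -2.0 * sum;
          }

          public static void main(String[] args) {
              double[] y = {0.0, 1.0, 1.0, 0.0};
              double[] f = {-0.5, 1.2, 0.3, -1.1};
              System.out.println("Least squares:      " + leastSquares(y, f));
              System.out.println("Huber-M (delta=1):  " + huberM(y, f, 1.0));
              System.out.println("Bernoulli deviance: " + bernoulliDeviance(y, f));
          }
      }

    In an actual application the fitted values come from the boosted model itself; the constants of this enum only select which of these criteria GradientBoosting minimizes during fitting.
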
  • Method Details

    • values

      public static GradientBoosting.LossFunctionType[] values()
      Returns an array containing the constants of this enum class, in the order they are declared.
      Returns:
      an array containing the constants of this enum class, in the order they are declared
    • valueOf

      public static GradientBoosting.LossFunctionType valueOf(String name)
      Returns the enum constant of this class with the specified name. The string must match exactly an identifier used to declare an enum constant in this class. (Extraneous whitespace characters are not permitted.)
      Parameters:
      name - the name of the enum constant to be returned.
      Returns:
      the enum constant with the specified name
      Throws:
      IllegalArgumentException - if this enum class has no constant with the specified name
      NullPointerException - if the argument is null
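
  • Example: Listing and Looking Up Constants

    The following sketch uses only the standard enum methods documented above: values() to enumerate the available loss functions and valueOf(String) for an exact-name lookup. How the chosen constant is then supplied to the enclosing GradientBoosting class (for example, through its loss-function setter) should be taken from that class's documentation; it is not shown here.

      import com.imsl.datamining.GradientBoosting;

      public class LossFunctionTypeLookup {
          public static void main(String[] args) {
              // values(): all constants, in declaration order.
              for (GradientBoosting.LossFunctionType t : GradientBoosting.LossFunctionType.values()) {
                  System.out.println(t.ordinal() + ": " + t.name());
              }

              // valueOf(String): the name must match the declared identifier exactly.
              GradientBoosting.LossFunctionType huber =
                      GradientBoosting.LossFunctionType.valueOf("HUBER_M");
              System.out.println("Selected loss: " + huber);

              // A non-matching name (wrong case, extra whitespace) throws IllegalArgumentException.
              try {
                  GradientBoosting.LossFunctionType.valueOf("huber_m");
              } catch (IllegalArgumentException e) {
                  System.out.println("No such constant: " + e.getMessage());
              }
          }
      }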