gradient_boosting

Performs stochastic gradient boosting of decision trees.

Synopsis

#include <imsls.h>

float *imsls_f_gradient_boosting (int n_rows, int n_cols, float xy[], int response_col_idx, int var_type[], …, 0)

The type double function is imsls_d_gradient_boosting.

Required Arguments

int n_rows (Input)
The number of rows in xy.

int n_cols (Input)
The number of columns in xy.

float xy[] (Input)
Array of size n_rows × n_cols containing the data.

int response_col_idx (Input)
The column index of xy containing the response variable.

int var_type[] (Input)
Array of length n_cols indicating the type of each variable.

var_type[i]   Description
0             Categorical
1             Ordered Discrete (e.g., Low, Med, High)
2             Quantitative or Continuous
3             Ignore this variable

Note: When the variable type is specified as Categorical (var_type[i] = 0), the numbering of the categories must begin at 0. For example, if there are three categories, they must be represented as 0, 1, and 2 in the xy array.

The number of classes for a categorical response variable is determined by the largest class value found in the data. A warning message is displayed if any class level in 0, 1, …, n_classes - 1 has a count of zero in the data.
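To illustrate this coding requirement, the following minimal sketch (hypothetical data, not one of the examples below) codes a three-level categorical predictor in column 1 as 0, 1, and 2, and marks it in var_type. Since the function returns NULL on error, the result is checked before it is freed.

#include <imsls.h>

int main(){
    /* Hypothetical data: column 0 = continuous response,
       column 1 = categorical predictor with three levels, which must be
       coded 0, 1, 2, and column 2 = continuous predictor. */
    float xy[6][3] = {
        { 1.2, 0.0, 0.5 },
        { 0.7, 1.0, 1.3 },
        { 1.9, 2.0, 0.2 },
        { 0.4, 1.0, 2.2 },
        { 1.6, 0.0, 0.9 },
        { 0.8, 2.0, 1.7 }
    };
    int var_type[] = { 2, 0, 2 };
    float *fitted_values;

    fitted_values = imsls_f_gradient_boosting(6, 3, &xy[0][0], 0, var_type, 0);
    if (fitted_values != NULL)
        imsls_free(fitted_values);
}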

Return Value

A pointer to an array of length n_test containing the predicted values. If test data is provided (see the optional argument IMSLS_TEST_DATA), the predictions are made on the test data; otherwise, the predicted values are the fitted values on the training data (n_test = n_rows by default). If an error occurs, NULL is returned.

Synopsis with Optional Arguments

#include <imsls.h>

float *imsls_f_gradient_boosting (int n_rows, int n_cols, float xy[], int response_col_idx, int var_type[],

IMSLS_TEST_DATA, int n_test, float xy_test[],

IMSLS_TEST_DATA_WEIGHTS, float weights_test[],

IMSLS_WEIGHTS, float weights[],

IMSLS_N_SAMPLE, int sample_size,

IMSLS_SAMPLE_PROPORTION, float sample_p,

IMSLS_SHRINKAGE, float shrinkage,

IMSLS_MAX_ITER, int max_iter,

IMSLS_LOSS_FCN, int loss_fcn_type,

IMSLS_ALPHA, float huber_alpha,

IMSLS_CONTROL, int params[],

IMSLS_RANDOM_SEED, int seed,

IMSLS_PRINT, int print_level,

IMSLS_LOSS_VALUE, float *loss_value,

IMSLS_TEST_LOSS_VALUE, float *test_loss_value,

IMSLS_FITTED_VALUES, float **fitted_values,

IMSLS_FITTED_VALUES_USER, float fitted_values[],

IMSLS_PROBABILITIES, float **probs,

IMSLS_PROBABILITIES_USER, float probs[],

IMSLS_FITTED_PROBABILITIES, float **fitted_probs,

IMSLS_FITTED_PROBABILITIES_USER, float fitted_probs[],

IMSLS_RETURN_TREES, Imsls_f_decision_tree ***boosted_trees,

IMSLS_RETURN_MODEL, Imsls_f_gradient_boosting_model **gb_model,

IMSLS_RETURN_USER, float predictions[],

0)

Optional Arguments

IMSLS_TEST_DATA, int n_test, float xy_test[] (Input)
xy_test is an array of size n_test × n_cols containing test data for which predictions are requested. When this optional argument is present, the number of observations n_test must be greater than 0. The response variable may have missing values in xy_test, but it must occupy the same column as in xy, and the predictors must occupy the same columns as in xy. If test data is not provided but predictions are requested, xy is used and the predictions are the fitted values.
Default: n_test = n_rows, xy_test = xy.

IMSLS_TEST_DATA_WEIGHTS, float weights_test[] (Input)
An array of size n_test containing the frequencies or weights for each observation in xy_test. This argument is ignored if IMSLS_TEST_DATA is not present.
Default: weights_test[i] = 1.0.

IMSLS_WEIGHTS, float weights[] (Input)
An array of length n_rows containing frequencies or weights for each observation in xy.
Default: weights[i] = 1.0.

IMSLS_N_SAMPLE, int sample_size (Input)
The number of examples to be drawn randomly from the training data in each iteration.
Default: sample_size = sample_p*n_rows.

IMSLS_SAMPLE_PROPORTION, float sample_p (Input)
The proportion of the training examples to be drawn randomly from the training data in each iteration.
Default: sample_p = 0.5.

IMSLS_SHRINKAGE, float shrinkage (Input)
The shrinkage parameter used in the boosting algorithm. The parameter must lie in the closed interval [0, 1].
Default: shrinkage = 1.0 (no shrinkage).

IMSLS_MAX_ITER, int max_iter (Input)
The number of iterations. This value is equivalent to M in the boosting algorithm described below.
Default: max_iter = 50.

IMSLS_LOSS_FCN, int loss_fcn_type (Input)
An integer specifying the loss function to use in the algorithm for regression problems (loss_fcn_type = 0, 1, 2) or binary classification problems (loss_fcn_type = 3, 4). See the Description section for the loss function in the multinomial case (categorical response variables with more than two outcomes).
Default: loss_fcn_type = 0.

 

Least Squares (loss_fcn_type = 0)
The loss function is the sum of squared errors:

    L(y, f) = \sum_{i=1}^{N} (y_i - f(x_i))^2

Least Absolute Deviation (loss_fcn_type = 1)
The loss function is the sum of absolute errors:

    L(y, f) = \sum_{i=1}^{N} |y_i - f(x_i)|

Huber M (loss_fcn_type = 2)
The loss function is the weighted mixture of squared error and absolute error:

    L(y, f) = \sum_{i=1}^{N} h(y_i, f(x_i))

where

    h(y_i, f(x_i)) = \begin{cases} \tfrac{1}{2}\,(y_i - f(x_i))^2 & \text{if } |y_i - f(x_i)| \le \delta \\ \delta\,(|y_i - f(x_i)| - \delta/2) & \text{otherwise} \end{cases}

and where δ is the α empirical quantile of the absolute errors |y_i - f(x_i)|.

Adaboost (loss_fcn_type = 3)
The loss function is the AdaBoost.M1 criterion (with y_i coded 0 or 1):

    L(y, f) = \sum_{i=1}^{N} \exp(-(2y_i - 1) f(x_i))

Bernoulli or binomial deviance (loss_fcn_type = 4)
The loss function is the binomial or Bernoulli negative log-likelihood (with y_i coded 0 or 1):

    L(y, f) = -\sum_{i=1}^{N} \left[ y_i \log(p_i) + (1 - y_i) \log(1 - p_i) \right], \quad p_i = \frac{1}{1 + \exp(-2 f(x_i))}

IMSLS_ALPHA, float huber_alpha (Input)
The quantile level α used to compute δ in the Huber-M loss function.
Default: huber_alpha = 0.05.

IMSLS_CONTROL, int params[] (Input)
Array of length 5 containing parameters to control the size and other characteristics of the decision trees.

params[i]   Name         Action
0           min_n_node   Do not split a node if one of its child nodes will have fewer than min_n_node observations.
1           min_split    Do not split a node if the node has fewer than min_split observations.
2           max_x_cats   Allow up to max_x_cats categories or levels for categorical variables.
3           max_size     Stop growing the tree once it has reached max_size nodes.
4           max_depth    Stop growing the tree once it has reached max_depth levels.

Default: params[] = {10, 21, 10, 4, 10}.

IMSLS_RANDOM_SEED, int seed (Input)
Sets the seed of the random number generator used in sampling. Using the same seed in repeated calls will result in the same output. If seed = 0, the random seed is set by the system clock and repeated calls result in slightly different results.
Default: seed = 0.

IMSLS_PRINT, int print_level (Input)

print_level   Action
0             No printing
1             Print final results only
2             Print intermediate and final results

Default: print_level = 0.

IMSLS_LOSS_VALUE, float *loss_value (Output)
The final value of the loss function after M iterations of the algorithm.

IMSLS_TEST_LOSS_VALUE, float *test_loss_value (Output)
The final value of the loss function after M iterations of the algorithm on the test data.

IMSLS_FITTED_VALUES, float **fitted_values (Output)
Address of a pointer to an array of length n_rows containing the fitted values on the training data xy after M iterations of the algorithm.

IMSLS_FITTED_VALUES_USER, float fitted_values[] (Output)
Storage for the array of the fitted values for the training data is provided by the user.

IMSLS_PROBABILITIES, float **probs (Output)
Address of a pointer to an array of size n_test × n_classes containing the predicted class probabilities for each observation in the test data (classification problems only). By default, n_test = n_rows.

IMSLS_PROBABILITIES_USER, float probs[] (Output)
Storage for the array of the predicted class probabilities is provided by the user.

IMSLS_FITTED_PROBABILITIES, float **fitted_probs (Output)
Address of a pointer to an array of size n_rows × n_classes containing the fitted class probabilities on the training data for classification problems.

IMSLS_FITTED_PROBABILITIES_USER, float fitted_probs[] (Output)
Storage for the array of the fitted class probabilities is provided by the user.

IMSLS_RETURN_TREES, Imsls_f_decision_tree ***boosted_trees (Output)
Address of a pointer to an array of length max_iter (M in the algorithm description below) containing the collection of trees generated during the algorithm. To release this space, use imsls_f_bagged_trees_free.

IMSLS_RETURN_MODEL, Imsls_f_gradient_boosting_model **gb_model (Output)
Address of a pointer to a structure containing the trained or fitted gradient boosting model. Elements include the sequence of boosted trees and other parameters used in the algorithm. To release this space, use imsls_f_gradient_boosting_model_free (see the sketch following this list).

IMSLS_RETURN_USER, float predictions[] (Output)
Storage for the array of predicted values is provided by the user.
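As noted above for IMSLS_RETURN_MODEL, the fitted model can be captured and released; a minimal sketch follows (hypothetical data; it assumes imsls_f_gradient_boosting_model_free takes the model pointer directly, in the style of the other IMSL free routines):

#include <imsls.h>

int main(){
    /* Hypothetical data: continuous response in column 0, a categorical
       predictor coded from 0 in column 1, and a continuous predictor. */
    float xy[6][3] = {
        { 1.1, 0.0, 0.4 },
        { 0.9, 1.0, 1.1 },
        { 2.0, 2.0, 0.3 },
        { 0.5, 1.0, 2.0 },
        { 1.7, 0.0, 0.8 },
        { 0.8, 2.0, 1.6 }
    };
    int var_type[] = { 2, 0, 2 };
    Imsls_f_gradient_boosting_model *gb_model = NULL;
    float *fitted_values;

    fitted_values = imsls_f_gradient_boosting(6, 3, &xy[0][0], 0, var_type,
        IMSLS_RETURN_MODEL, &gb_model,
        0);

    /* ... the model could be stored here for later reuse ... */

    if (fitted_values != NULL)
        imsls_free(fitted_values);
    if (gb_model != NULL)
        imsls_f_gradient_boosting_model_free(gb_model); /* assumed call form */
}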

Description

Stochastic gradient boosting is an optimization algorithm for minimizing residual errors to improve the accuracy of predictions. This function implements the algorithm by Friedman (1999). For further discussion, see Hastie et al. (2009).

In the following, x_i is the vector of predictor variable values and y_i is the response variable value for the observation in row i. The function f_m(x_i), evaluated at x_i, is the predicted value at iteration m; the algorithm updates it iteratively to minimize the loss function L(y, f). Specifically, the algorithm is:

Initialize the predictor function to the constant

    f_0(x) = \arg\min_{c} \sum_{i=1}^{N} L(y_i, c)

For each iteration m = 1, 2, …, M:

1. Calculate the pseudo-residuals

    r_{im} = -\left[ \frac{\partial L(y_i, f(x_i))}{\partial f(x_i)} \right]_{f = f_{m-1}}, \quad i = 1, \ldots, N

2. Fit a regression tree to the pseudo-residuals r_{im} and use the resulting model to predict the observations in the training data. The resulting terminal nodes define J_m terminal regions R_{jm} for the response. Compute

    \gamma_{jm} = \arg\min_{\gamma} \sum_{x_i \in R_{jm}} L(y_i, f_{m-1}(x_i) + \gamma), \quad j = 1, \ldots, J_m

3. Update the prediction function for each observation x_i,

    f_m(x_i) = f_{m-1}(x_i) + \lambda \sum_{j=1}^{J_m} \gamma_{jm} I\{x_i \in R_{jm}\}

where λ ∈ [0, 1] is a shrinkage parameter (λ = 1 means no shrinking, whereas λ = 0 gives just f_M = f_0).

After M iterations, the function fM() forms the basis of the predictions for the response variable.
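To make these steps concrete, the following self-contained toy program carries out the three steps for squared error loss with a single predictor and depth-1 trees (stumps). It is only an illustration of the update equations above, not the IMSL implementation: it omits the random subsampling of the training data that makes the algorithm "stochastic", and the trees here have just two terminal regions.

#include <stdio.h>

#define N 8
#define M 100          /* boosting iterations */
#define LAMBDA 0.1     /* shrinkage */

/* A decision stump on one predictor: two terminal regions. */
typedef struct { double split, gamma_left, gamma_right; } Stump;

/* Fit a stump to the residuals r by minimizing squared error over
   candidate splits; the region means are the gamma minimizers. */
static Stump fit_stump(const double x[], const double r[], int n)
{
    Stump best;
    double best_sse = 1e300;
    int s, i;
    best.split = x[0];
    best.gamma_left = 0.0;
    best.gamma_right = 0.0;
    for (s = 0; s < n; s++){
        double sum_l = 0.0, sum_r = 0.0, g_l, g_r, sse = 0.0;
        int n_l = 0, n_r = 0;
        for (i = 0; i < n; i++){
            if (x[i] <= x[s]) { sum_l += r[i]; n_l++; }
            else              { sum_r += r[i]; n_r++; }
        }
        if (n_l == 0 || n_r == 0) continue;
        g_l = sum_l / n_l;
        g_r = sum_r / n_r;
        for (i = 0; i < n; i++){
            double g = (x[i] <= x[s]) ? g_l : g_r;
            sse += (r[i] - g) * (r[i] - g);
        }
        if (sse < best_sse){
            best_sse = sse;
            best.split = x[s];
            best.gamma_left = g_l;
            best.gamma_right = g_r;
        }
    }
    return best;
}

int main(){
    /* Hypothetical 1-D training data. */
    double x[N] = { 1, 2, 3, 4, 5, 6, 7, 8 };
    double y[N] = { 1.2, 1.0, 1.5, 1.4, 3.9, 4.2, 4.0, 4.3 };
    double f[N], r[N], mean = 0.0;
    int i, m;

    /* f_0 = the constant minimizing squared error: the mean of y */
    for (i = 0; i < N; i++) mean += y[i];
    mean /= N;
    for (i = 0; i < N; i++) f[i] = mean;

    for (m = 0; m < M; m++){
        Stump t;
        /* step 1: pseudo-residuals; for squared error, r = y - f */
        for (i = 0; i < N; i++) r[i] = y[i] - f[i];
        /* step 2: fit a tree (here a stump) to the residuals */
        t = fit_stump(x, r, N);
        /* step 3: shrunken update f_m = f_{m-1} + lambda * gamma_jm */
        for (i = 0; i < N; i++)
            f[i] += LAMBDA * ((x[i] <= t.split) ? t.gamma_left : t.gamma_right);
    }

    for (i = 0; i < N; i++)
        printf("y = %4.1f   f_M = %6.3f\n", y[i], f[i]);
}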

Specifically:

QUANTITATIVE_CONTINUOUS
For the regression problem, the predicted value at a new observation vector x_i is

    \hat{y}_i = f_M(x_i)

CATEGORICAL with 2 outcomes (binomial)
For a classification problem with 2 outcomes, the predicted probability is

    \hat{p}_i = \frac{1}{1 + \exp(-2 f_M(x_i))}

Then the predicted value is

    \hat{y}_i = I\{\hat{p}_i > 0.5\}

where I{} is the indicator function.

CATEGORICAL with 3 or more outcomes (multinomial)
For a classification problem with K ≥ 3 outcomes, the predicted probabilities for k = 1, …, K are

    \hat{p}_{ik} = \frac{\exp(f_{M,k}(x_i))}{\sum_{l=1}^{K} \exp(f_{M,l}(x_i))}

Then the predicted value is

    \hat{y}_i = \arg\max_{k} \hat{p}_{ik}

 

For regression problems, the algorithm uses the squared error loss by default. For classification problems with two categories, the Bernoulli or binomial loss function is the default (see optional argument IMSLS_LOSS_FCN). For a categorical response with three or more categories, the multinomial deviance (described below) is used.

For a categorical response with K categories, the loss function is the multinomial negative log-likelihood, or multinomial deviance:

    L(y, f) = -\sum_{i=1}^{N} \sum_{k=1}^{K} I\{y_i = k\} \log p_k(x_i)

where

    p_k(x_i) = \frac{\exp(f_k(x_i))}{\sum_{l=1}^{K} \exp(f_l(x_i))}
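As a concrete reading of these two formulas, the short program below (hypothetical probabilities, not IMSL output) evaluates the multinomial negative log-likelihood from a row-major array laid out like the probs output, with class labels in 0, …, K - 1:

#include <stdio.h>
#include <math.h>

/* Multinomial negative log-likelihood, -sum_i log p_{y_i}(x_i), where
   probs is n x k row-major and each row holds the class probabilities
   (summing to 1) for one observation. */
static double multinomial_deviance(int n, int k, const double probs[],
                                   const int y[])
{
    int i;
    double loss = 0.0;
    for (i = 0; i < n; i++)
        loss -= log(probs[i*k + y[i]]);
    return loss;
}

int main(){
    /* Three observations, K = 3 classes (hypothetical values). */
    double probs[] = { 0.7, 0.2, 0.1,
                       0.1, 0.6, 0.3,
                       0.2, 0.3, 0.5 };
    int y[] = { 0, 1, 2 };

    printf("deviance = %f\n", multinomial_deviance(3, 3, probs, y));
}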

Examples

Example 1

This example uses stochastic gradient boosting to obtain fitted values for a continuous response variable in a small data set with six predictor variables.

 

#include <imsls.h>

#include <stdio.h>

 

#define ROW 61

#define COL 7

 

int main(){

 

float XY[ROW][COL] = {

{ 4.45617685, 0.8587425048, 1.2705688183, 0.0, 0.0, 1.0, 0.836626959 },

{ 3.01895357, 0.8928761308, 1.3886538362, 2.0, 1.0, 2.0, 2.155131825 },

{ 5.16899757, 0.7385954093, 1.5773203815, 0.0, 4.0, 2.0, 0.075368922 },

{ -0.23062048, 0.6227398487, 0.0228797458, 3.0, 4.0, 2.0, 0.070793233 },

{ 2.43144968, 0.8519553537, 1.2141886768, 2.0, 4.0, 2.0, 0.762200702 },

{ 2.28255119, 0.5578103897, 0.9185446175, 2.0, 4.0, 2.0, 0.085492814 },

{ 4.51650903, 0.4178302658, 1.3686663737, 0.0, 0.0, 0.0, 2.573941051 },

{ 5.42996967, 0.9829705667, 0.7817731784, 0.0, 5.0, 1.0, 0.865016054 },

{ 0.99551212, 0.3859238869, 0.2746516233, 3.0, 4.0, 0.0, 1.908151819 },

{ 1.23525017, 0.4165328839, 1.3154437956, 3.0, 4.0, 2.0, 2.752358041 },

{ 1.51599306, 0.2008399745, 0.9003028921, 3.0, 0.0, 2.0, 1.437127559 },

{ 2.72854297, 0.2072261081, 1.2282209327, 2.0, 5.0, 2.0, 0.68596562 },

{ 3.06956138, 0.9067490781, 0.8283077031, 2.0, 0.0, 2.0, 2.862403627 },

{ 1.81659279, 0.4506153886, 1.2822537781, 3.0, 4.0, 2.0, 1.710525684 },

{ 3.75978142, 0.2638894715, 0.4995447062, 0.0, 1.0, 1.0, 1.077172402 },

{ 5.72383445, 0.7682430062, 1.4758595745, 0.0, 3.0, 1.0, 2.365233736 },

{ 3.78155015, 0.6888140934, 0.4809393724, 0.0, 0.0, 1.0, 1.061246069 },

{ 3.60023233, 0.8470419827, 1.6149122352, 1.0, 1.0, 0.0, 0.01120048 },

{ 4.30238917, 0.9484412405, 1.6122899544, 1.0, 4.0, 2.0, 0.782038861 },

{ -0.19206757, 0.7674867723, 0.01665624, 3.0, 5.0, 2.0, 2.924944949 },

{ 3.03246318, 0.8747456241, 1.6051767552, 2.0, 1.0, 0.0, 2.233971364 },

{ 1.56652306, 0.0947128241, 1.470864601, 3.0, 0.0, 1.0, 1.851705944 },

{ 2.77490671, 0.1347932827, 1.3693161067, 1.0, 2.0, 0.0, 0.795709459 },

{ 1.05042043, 0.258093959, 0.4679728113, 3.0, 5.0, 0.0, 2.897785557 },

{ 2.73366469, 0.152943752, 0.5244769375, 1.0, 4.0, 2.0, 2.712871963 },

{ 1.78996951, 0.7921472492, 0.4686144991, 2.0, 4.0, 1.0, 1.295327727 },

{ 1.10343272, 0.123231777, 0.563989053, 2.0, 4.0, 1.0, 0.510414582 },

{ 1.70883743, 0.1931027549, 1.8561577178, 3.0, 5.0, 1.0, 0.165721288 },

{ 2.17977731, 0.316932481, 1.3376214528, 2.0, 2.0, 0.0, 2.366607214 },

{ 2.46127675, 0.9601344266, 0.2090187217, 1.0, 3.0, 1.0, 0.846218965 },

{ 1.92249547, 0.1104206559, 1.739415036, 3.0, 0.0, 0.0, 0.652622544 },

{ 5.81907137, 0.7049566596, 1.6238740934, 0.0, 3.0, 0.0, 1.685337845 },

{ 2.04774497, 0.0480224835, 0.7510998738, 2.0, 5.0, 2.0, 1.400641323 },

{ 4.54023907, 0.0557708007, 1.0864350675, 0.0, 1.0, 1.0, 1.630408823 },

{ 3.66100874, 0.2939440177, 0.9709178614, 0.0, 1.0, 0.0, 0.06970193 },

{ 4.39253655, 0.0982369843, 1.2492676578, 0.0, 2.0, 2.0, 0.138188998 },

{ 3.23303353, 0.3775206071, 0.2937129182, 0.0, 0.0, 2.0, 1.070823081 },

{ 3.13800098, 0.7891691434, 1.90897633, 2.0, 3.0, 0.0, 1.240732062 },

{ 1.49034639, 0.2456938969, 0.9157859818, 3.0, 5.0, 0.0, 0.850803277 },

{ 0.09486277, 0.1240615626, 0.3891524528, 3.0, 5.0, 0.0, 2.532516038 },

{ 3.74460501, 0.0181218453, 1.4921644945, 1.0, 2.0, 1.0, 1.92839241 },

{ 3.24158796, 0.9203409508, 1.1644667462, 2.0, 3.0, 1.0, 1.956283022 },

{ 1.97796767, 0.5977597698, 0.5501609747, 2.0, 5.0, 2.0, 0.39384095 },

{ 4.15214037, 0.1433333508, 1.4292114358, 1.0, 0.0, 0.0, 1.114095218 },

{ 0.7799787, 0.8539819908, 0.7039108537, 3.0, 0.0, 1.0, 1.468978726 },

{ 2.01869009, 0.8919721926, 1.1436212659, 3.0, 4.0, 1.0, 2.09256257 },

{ 0.56311561, 0.0899261576, 0.7989077698, 3.0, 5.0, 0.0, 0.195650739 },

{ 4.74296429, 0.9625684835, 1.5732420743, 0.0, 3.0, 2.0, 2.685061853 },

{ 2.97981809, 0.5511086562, 1.6053283028, 2.0, 5.0, 2.0, 0.906810926 },

{ 2.82187135, 0.3869563073, 0.9321342241, 1.0, 5.0, 1.0, 0.756223386 },

{ 5.24390592, 0.3500950718, 1.7769328682, 0.0, 3.0, 2.0, 1.328165314 },

{ 3.17307157, 0.8798056154, 1.4647966106, 2.0, 5.0, 1.0, 0.561835038 },

{ 0.78246075, 0.1472158518, 0.4658273738, 2.0, 0.0, 0.0, 1.317240539 },

{ 1.57827027, 0.3415432149, 0.7513634153, 2.0, 2.0, 0.0, 1.502675544 },

{ 0.84104905, 0.1501226462, 0.9332020828, 3.0, 1.0, 2.0, 1.083374695 },

{ 2.63627352, 0.1707233109, 1.1676406977, 2.0, 3.0, 0.0, 2.236639737 },

{ 1.30863625, 0.2616807753, 0.8342161868, 3.0, 2.0, 2.0, 1.778402721 },

{ 2.7313073, 0.9616109401, 1.596915911, 3.0, 3.0, 1.0, 0.303127344 },

{ 3.56848173, 0.4072918599, 1.5345127448, 1.0, 2.0, 2.0, 1.47452504 },

{ 5.40152982, 0.7796053565, 1.3659530994, 0.0, 4.0, 1.0, 0.484531098 },

{ 3.94901823, 0.5052344366, 1.9319026601, 1.0, 2.0, 0.0, 2.504392843 },

};

    int i;
    int response_col_idx = 0;
    int var_type[] = { 2, 2, 2, 0, 0, 0, 2 };
    float *fitted_values = NULL;
    float loss_value;

    fitted_values = imsls_f_gradient_boosting(ROW, COL, &XY[0][0],
        response_col_idx, var_type,
        IMSLS_RANDOM_SEED, 123457,
        IMSLS_LOSS_VALUE, &loss_value,
        0);

    printf("Fitted values vs actuals:\n");
    for (i = 0; i < ROW; i++){
        printf("\t%5.3f %5.3f\n", fitted_values[i], XY[i][response_col_idx]);
    }

    printf("\nLoss value: %5.5f\n", loss_value);
    imsls_free(fitted_values);
}

Output

 

Fitted values vs actuals:

4.956 4.456

2.908 3.019

5.105 5.169

0.229 -0.231

2.124 2.431

2.338 2.283

4.333 4.517

5.273 5.430

0.734 0.996

1.491 1.235

1.359 1.516

2.611 2.729

2.275 3.070

1.875 1.817

3.233 3.760

6.246 5.724

3.676 3.782

3.981 3.600

3.742 4.302

0.716 -0.192

3.367 3.032

1.975 1.567

3.105 2.775

0.471 1.050

2.386 2.734

1.307 1.790

1.370 1.103

1.709 1.709

2.371 2.180

3.386 2.461

1.404 1.922

5.822 5.819

2.207 2.048

4.028 4.540

3.831 3.661

4.824 4.393

3.451 3.233

3.451 3.138

2.285 1.490

0.471 0.095

3.728 3.745

3.204 3.242

1.474 1.978

4.291 4.152

0.937 0.780

1.709 2.019

0.534 0.563

5.608 4.743

3.376 2.980

2.749 2.822

5.656 5.244

3.192 3.173

1.659 0.782

1.851 1.578

0.435 0.841

2.880 2.636

1.869 1.309

2.201 2.731

3.517 3.568

5.263 5.402

3.584 3.949

 

Loss value: 0.08452

Example 2

This example uses stochastic gradient boosting to obtain probability estimates for a binary response variable with four predictor variables. An estimate of P[Y = 0] is obtained for each example in the training data as well as for a small test data set.

Estimates of P[Y = 0] greater than 0.5 lead to a prediction of Y = 0, while estimates less than or equal to 0.5 lead to a prediction of Y = 1.
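A minimal sketch of this rule, applied to hypothetical P[Y = 0] values stored the way the probabilities output stores them (class 0 at offset i*n_classes):

#include <stdio.h>

int main(){
    /* Hypothetical P[Y = 0] estimates, one per observation. */
    float p0[] = { 0.83f, 0.22f, 0.51f };
    int i, y_hat;

    for (i = 0; i < 3; i++){
        /* predict Y = 0 when P[Y = 0] > 0.5, otherwise Y = 1 */
        y_hat = (p0[i] > 0.5f) ? 0 : 1;
        printf("P[Y=0] = %4.2f -> predicted Y = %d\n", p0[i], y_hat);
    }
}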

 

#include <imsls.h>

#include <stdio.h>

 

#define TROW 100

#define TCOL 5

#define TSTROW 10

 

int main(){

float training_data[TROW][TCOL] = {

{ 0.0, 0.4223019897, 1.7540411302, 3.0, 0.763836258 },

{ 0.0, 0.0907259332, 0.8722643796, 2.0, 1.859006285 },

{ 0.0, 0.1384744535, 0.838324877, 1.0, 0.249729405 },

{ 1.0, 0.5435024537, 1.2359190206, 4.0, 0.831992314 },

{ 0.0, 0.8359154933, 1.8527500411, 1.0, 1.089201049 },

{ 1.0, 0.3577950741, 0.3652825342, 3.0, 2.204364955 },

{ 1.0, 0.6799094002, 0.6610595905, 3.0, 1.44730419 },

{ 0.0, 0.5821297709, 1.6180879478, 1.0, 2.957565282 },

{ 1.0, 0.8229457375, 1.0201675948, 3.0, 2.872570117 },

{ 0.0, 0.0633462721, 0.4140600134, 1.0, 0.63906323 },

{ 1.0, 0.1019134156, 0.0677204356, 3.0, 1.493447564 },

{ 0.0, 0.1551713238, 1.541201456, 3.0, 1.90219884 },

{ 1.0, 0.8273822817, 0.2114979578, 3.0, 2.855730173 },

{ 0.0, 0.7955570114, 1.8757067556, 2.0, 2.930132627 },

{ 0.0, 0.6537275917, 1.2139678737, 2.0, 1.535853243 },

{ 1.0, 0.1243124125, 1.5130919744, 4.0, 2.733670775 },

{ 0.0, 0.2163864174, 0.7051185896, 2.0, 2.755841087 },

{ 0.0, 0.2522670308, 1.2821007571, 2.0, 0.342119491 },

{ 0.0, 0.8677104027, 1.9003869346, 2.0, 2.454376481 },

{ 1.0, 0.8670932774, 0.7993045617, 4.0, 2.732812615 },

{ 0.0, 0.5384287981, 0.1856947718, 1.0, 1.838702635 },

{ 0.0, 0.7236269342, 0.4993310347, 1.0, 1.030699128 },

{ 0.0, 0.0789361731, 1.011216166, 1.0, 2.539607478 },

{ 1.0, 0.7631686032, 0.0536725423, 2.0, 1.401761686 },

{ 0.0, 0.1157020777, 0.0123261618, 1.0, 2.098372295 },

{ 1.0, 0.1451248352, 1.9153951635, 3.0, 0.492650534 },

{ 1.0, 0.8497178114, 1.80941298, 4.0, 2.653985489 },

{ 0.0, 0.8027864883, 1.2631045617, 3.0, 2.716214291 },

{ 0.0, 0.798560373, 0.6872106791, 2.0, 2.763023936 },

{ 1.0, 0.1816879204, 0.4323868025, 4.0, 0.098090197 },

{ 1.0, 0.6301239238, 0.3670980479, 3.0, 0.02313788 },

{ 1.0, 0.0411311248, 0.0173408454, 3.0, 1.994786958 },

{ 1.0, 0.0427366099, 0.8114635572, 3.0, 2.966069741 },

{ 1.0, 0.4107826762, 0.1929467283, 4.0, 0.573832348 },

{ 0.0, 0.9441903098, 0.0729898885, 1.0, 1.710992303 },

{ 1.0, 0.3597549822, 0.2799857073, 2.0, 0.969428934 },

{ 0.0, 0.3741368004, 1.6052779425, 2.0, 1.866030486 },

{ 0.0, 0.3515911719, 0.3383029872, 1.0, 2.639469598 },

{ 0.0, 0.9184092905, 1.7116801264, 1.0, 1.380178652 },

{ 1.0, 0.77803064, 1.9830028405, 3.0, 1.834021992 },

{ 0.0, 0.573786814, 0.0258851023, 1.0, 1.52130144 },

{ 1.0, 0.3279244492, 0.6977945678, 4.0, 1.322451157 },

{ 0.0, 0.7924819048, 0.3694838509, 1.0, 2.369654865 },

{ 0.0, 0.9787846403, 1.1470323382, 2.0, 0.037156113 },

{ 1.0, 0.6910662795, 0.1019420708, 2.0, 2.58588334 },

{ 0.0, 0.1367050812, 0.6635301332, 2.0, 0.368273583 },

{ 0.0, 0.2826360366, 1.4468787988, 1.0, 2.705811968 },

{ 0.0, 0.4524727969, 0.7885378413, 2.0, 0.851228449 },

{ 0.0, 0.5118664701, 1.061143666, 1.0, 0.249325278 },

{ 0.0, 0.9965170731, 0.2068265025, 2.0, 0.9210639 },

{ 1.0, 0.7801500652, 1.565742691, 4.0, 1.827419217 },

{ 0.0, 0.2906187973, 1.7036567871, 2.0, 2.842997725 },

{ 0.0, 0.1753704017, 0.7124397112, 2.0, 1.262811961 },

{ 1.0, 0.7796778064, 0.3478030777, 3.0, 0.90719801 },

{ 1.0, 0.3889356288, 1.1771452101, 4.0, 1.298438454 },

{ 0.0, 0.9374473374, 1.1879778663, 1.0, 1.854424331 },

{ 1.0, 0.1939157653, 0.093336341, 4.0, 0.166025681 },

{ 1.0, 0.2023756928, 0.0623724433, 3.0, 0.536441906 },

{ 0.0, 0.1691352043, 1.1587338657, 2.0, 2.15494096 },

{ 1.0, 0.0921523357, 0.2247394961, 3.0, 2.006995301 },

{ 0.0, 0.819186907, 0.0392292971, 1.0, 1.282159743 },

{ 0.0, 0.9458126165, 1.5268264762, 1.0, 1.960050194 },

{ 0.0, 0.1373939656, 1.8025095677, 2.0, 0.633624267 },

{ 0.0, 0.0555424779, 0.5022063241, 2.0, 0.639495004 },

{ 1.0, 0.3581428374, 1.4436954968, 3.0, 1.408938169 },

{ 1.0, 0.1189418568, 0.8011626904, 4.0, 0.210266769 },

{ 1.0, 0.5782070206, 1.58215921, 3.0, 2.648622607 },

{ 0.0, 0.460689794, 0.0704823257, 1.0, 1.45671379 },

{ 0.0, 0.6959878858, 0.2245675903, 2.0, 1.849515461 },

{ 0.0, 0.1930288749, 0.6296302159, 2.0, 2.597390946 },

{ 0.0, 0.4912149447, 0.0713489084, 1.0, 0.426487798 },

{ 0.0, 0.3496920248, 1.0135462089, 1.0, 2.962295362 },

{ 1.0, 0.7716284667, 0.5387295927, 4.0, 0.736709363 },

{ 1.0, 0.3463061263, 0.7819578522, 4.0, 1.597238498 },

{ 1.0, 0.6897138762, 1.2793166582, 4.0, 2.376281484 },

{ 0.0, 0.2818824656, 1.4379718141, 3.0, 2.627468417 },

{ 0.0, 0.5659798421, 1.6243568249, 1.0, 1.624809581 },

{ 0.0, 0.7965560518, 0.3933029529, 2.0, 0.415849269 },

{ 0.0, 0.9156922165, 1.0465683565, 1.0, 2.802914008 },

{ 0.0, 0.8299879942, 1.2237155279, 1.0, 2.611676934 },

{ 0.0, 0.0241912066, 1.9213823564, 1.0, 0.659596571 },

{ 0.0, 0.0948590154, 0.3609640412, 1.0, 1.287687748 },

{ 0.0, 0.230467916, 1.9421709292, 3.0, 2.290064565 },

{ 0.0, 0.2209760561, 0.4812708795, 1.0, 1.862393057 },

{ 0.0, 0.4704530933, 0.2644400774, 1.0, 1.960189529 },

{ 1.0, 0.1986645423, 0.48924731, 2.0, 0.333790415 },

{ 0.0, 0.9201823308, 1.4247304946, 1.0, 0.367654009 },

{ 1.0, 0.8118424334, 0.1017034058, 2.0, 2.001390385 },

{ 1.0, 0.1347265388, 0.1362061207, 3.0, 1.151431168 },

{ 0.0, 0.9884603191, 1.5700038988, 2.0, 0.717332943 },

{ 0.0, 0.1964012324, 0.4306495111, 1.0, 1.689056823 },

{ 1.0, 0.4031848807, 1.1251849262, 4.0, 1.977734922 },

{ 1.0, 0.0341882701, 0.3717348906, 4.0, 1.830587439 },

{ 0.0, 0.5073120815, 1.7860476542, 3.0, 0.142862822 },

{ 0.0, 0.6363195451, 0.6631249222, 2.0, 1.211148724 },

{ 1.0, 0.1642774614, 1.1963615627, 3.0, 0.843113448 },

{ 0.0, 0.0945515088, 1.8669327218, 1.0, 2.417198514 },

{ 0.0, 0.2364508687, 1.4035215094, 2.0, 2.964026097 },

{ 1.0, 0.7490112646, 0.1778408242, 4.0, 2.343119453 },

{ 1.0, 0.5193473259, 0.3090019161, 3.0, 1.300277323 }

};

 

float test_data[TSTROW][TCOL] = {

{ 0.0, 0.0093314846, 0.0315045565, 1.0, 2.043737003 },

{ 0.0, 0.0663379349, 0.0822378928, 2.0, 1.202557951 },

{ 1.0, 0.9728333529, 0.8778284262, 4.0, 0.205940753 },

{ 1.0, 0.7655418115, 0.3292853828, 4.0, 2.940793653 },

{ 1.0, 0.1610695978, 0.3832762009, 4.0, 1.96753633 },

{ 0.0, 0.0849463812, 1.4988451041, 2.0, 2.307902221 },

{ 0.0, 0.7932621511, 1.2098399368, 1.0, 0.886761862 },

{ 0.0, 0.1336030525, 0.2794256401, 2.0, 2.672175208 },

{ 0.0, 0.4758480834, 0.0441179522, 1.0, 0.399722717 },

{ 1.0, 0.1137434335, 0.922533263, 3.0, 1.927635631 }

};

 

    int i;
    int n_classes = 2;
    int var_type[] = { 0, 2, 2, 0, 2 };
    /* min_n_node, min_split, max_x_cats, max_size, max_depth */
    int tree_control_params[] = { 10, 21, 10, 4, 10 };
    int response_col_idx = 0;
    float *predicted_values = NULL;
    float *fitted_values = NULL;
    float *probabilities = NULL;
    float *fitted_probabilities = NULL;
    float loss_value;
    float test_loss_value;

    predicted_values = imsls_f_gradient_boosting(TROW, TCOL,
        &training_data[0][0], response_col_idx, var_type,
        IMSLS_SHRINKAGE, 0.05,
        IMSLS_RANDOM_SEED, 123457,
        IMSLS_LOSS_VALUE, &loss_value,
        IMSLS_TEST_LOSS_VALUE, &test_loss_value,
        IMSLS_CONTROL, tree_control_params,
        IMSLS_TEST_DATA, TSTROW, &test_data[0][0],
        IMSLS_FITTED_VALUES, &fitted_values,
        IMSLS_PROBABILITIES, &probabilities,
        IMSLS_FITTED_PROBABILITIES, &fitted_probabilities,
        0);

    printf("Training data fitted prob[Y=0] and actuals:\n");
    for (i = 0; i < TROW; i++){
        printf("\t%3.2f %3.0f\n", fitted_probabilities[i*n_classes],
            training_data[i][response_col_idx]);
    }
    printf("\nTraining data loss_value=%f\n\n", loss_value);

    printf("Test data predicted prob[Y=0] and actuals:\n");
    for (i = 0; i < TSTROW; i++){
        printf("\t%3.2f %3.0f\n", probabilities[i*n_classes],
            test_data[i][response_col_idx]);
    }
    printf("\nTest data loss value=%f\n", test_loss_value);

    imsls_free(predicted_values);
    imsls_free(fitted_values);
    imsls_free(probabilities);
    imsls_free(fitted_probabilities);
}

Output

 

Training data fitted prob[Y=0] and actuals:

0.35 0

0.82 0

0.87 0

0.25 1

0.90 0

0.24 1

0.26 1

0.90 0

0.30 1

0.84 0

0.23 1

0.35 0

0.24 1

0.85 0

0.84 0

0.26 1

0.82 0

0.85 0

0.85 0

0.22 1

0.83 0

0.85 0

0.87 0

0.75 1

0.83 0

0.35 1

0.26 1

0.35 0

0.81 0

0.18 1

0.24 1

0.23 1

0.30 1

0.17 1

0.83 0

0.76 1

0.85 0

0.83 0

0.90 0

0.35 1

0.83 0

0.21 1

0.84 0

0.83 0

0.75 1

0.81 0

0.90 0

0.82 0

0.87 0

0.76 0

0.26 1

0.85 0

0.82 0

0.24 1

0.24 1

0.89 0

0.16 1

0.23 1

0.83 0

0.24 1

0.83 0

0.90 0

0.85 0

0.78 0

0.35 1

0.22 1

0.35 1

0.83 0

0.76 0

0.78 0

0.83 0

0.87 0

0.18 1

0.22 1

0.26 1

0.35 0

0.90 0

0.77 0

0.87 0

0.89 0

0.90 0

0.83 0

0.35 0

0.84 0

0.83 0

0.77 1

0.90 0

0.75 1

0.23 1

0.85 0

0.84 0

0.22 1

0.18 1

0.35 0

0.81 0

0.32 1

0.90 0

0.85 0

0.16 1

0.24 1

 

Training data loss_value=0.650631

 

Test data predicted prob[Y=0] and actuals:

0.83 0

0.75 0

0.22 1

0.17 1

0.18 1

0.85 0

0.89 0

0.76 0

0.83 0

0.30 1

 

Test data loss value=0.440048

Example 3

This example uses the same data as Example 2, but switches the response variable to the 4th column (column index 3) of the training data. Because the response is categorical with more than two categories, the multinomial loss function is used.

Note: The response variable is treated as having five categorical levels because its largest value is 4 and category numbering starts at 0. Since the level 0 does not occur in the data, a warning message is printed.

 

#include <imsls.h>

#include <stdio.h>

 

#define TROW 100

#define TCOL 5

#define TSTROW 10

 

int main(){

 

float training_data[TROW][TCOL] = {

{ 0.0, 0.4223019897, 1.7540411302, 3.0, 0.763836258 },

{ 0.0, 0.0907259332, 0.8722643796, 2.0, 1.859006285 },

{ 0.0, 0.1384744535, 0.838324877, 1.0, 0.249729405 },

{ 1.0, 0.5435024537, 1.2359190206, 4.0, 0.831992314 },

{ 0.0, 0.8359154933, 1.8527500411, 1.0, 1.089201049 },

{ 1.0, 0.3577950741, 0.3652825342, 3.0, 2.204364955 },

{ 1.0, 0.6799094002, 0.6610595905, 3.0, 1.44730419 },

{ 0.0, 0.5821297709, 1.6180879478, 1.0, 2.957565282 },

{ 1.0, 0.8229457375, 1.0201675948, 3.0, 2.872570117 },

{ 0.0, 0.0633462721, 0.4140600134, 1.0, 0.63906323 },

{ 1.0, 0.1019134156, 0.0677204356, 3.0, 1.493447564 },

{ 0.0, 0.1551713238, 1.541201456, 3.0, 1.90219884 },

{ 1.0, 0.8273822817, 0.2114979578, 3.0, 2.855730173 },

{ 0.0, 0.7955570114, 1.8757067556, 2.0, 2.930132627 },

{ 0.0, 0.6537275917, 1.2139678737, 2.0, 1.535853243 },

{ 1.0, 0.1243124125, 1.5130919744, 4.0, 2.733670775 },

{ 0.0, 0.2163864174, 0.7051185896, 2.0, 2.755841087 },

{ 0.0, 0.2522670308, 1.2821007571, 2.0, 0.342119491 },

{ 0.0, 0.8677104027, 1.9003869346, 2.0, 2.454376481 },

{ 1.0, 0.8670932774, 0.7993045617, 4.0, 2.732812615 },

{ 0.0, 0.5384287981, 0.1856947718, 1.0, 1.838702635 },

{ 0.0, 0.7236269342, 0.4993310347, 1.0, 1.030699128 },

{ 0.0, 0.0789361731, 1.011216166, 1.0, 2.539607478 },

{ 1.0, 0.7631686032, 0.0536725423, 2.0, 1.401761686 },

{ 0.0, 0.1157020777, 0.0123261618, 1.0, 2.098372295 },

{ 1.0, 0.1451248352, 1.9153951635, 3.0, 0.492650534 },

{ 1.0, 0.8497178114, 1.80941298, 4.0, 2.653985489 },

{ 0.0, 0.8027864883, 1.2631045617, 3.0, 2.716214291 },

{ 0.0, 0.798560373, 0.6872106791, 2.0, 2.763023936 },

{ 1.0, 0.1816879204, 0.4323868025, 4.0, 0.098090197 },

{ 1.0, 0.6301239238, 0.3670980479, 3.0, 0.02313788 },

{ 1.0, 0.0411311248, 0.0173408454, 3.0, 1.994786958 },

{ 1.0, 0.0427366099, 0.8114635572, 3.0, 2.966069741 },

{ 1.0, 0.4107826762, 0.1929467283, 4.0, 0.573832348 },

{ 0.0, 0.9441903098, 0.0729898885, 1.0, 1.710992303 },

{ 1.0, 0.3597549822, 0.2799857073, 2.0, 0.969428934 },

{ 0.0, 0.3741368004, 1.6052779425, 2.0, 1.866030486 },

{ 0.0, 0.3515911719, 0.3383029872, 1.0, 2.639469598 },

{ 0.0, 0.9184092905, 1.7116801264, 1.0, 1.380178652 },

{ 1.0, 0.77803064, 1.9830028405, 3.0, 1.834021992 },

{ 0.0, 0.573786814, 0.0258851023, 1.0, 1.52130144 },

{ 1.0, 0.3279244492, 0.6977945678, 4.0, 1.322451157 },

{ 0.0, 0.7924819048, 0.3694838509, 1.0, 2.369654865 },

{ 0.0, 0.9787846403, 1.1470323382, 2.0, 0.037156113 },

{ 1.0, 0.6910662795, 0.1019420708, 2.0, 2.58588334 },

{ 0.0, 0.1367050812, 0.6635301332, 2.0, 0.368273583 },

{ 0.0, 0.2826360366, 1.4468787988, 1.0, 2.705811968 },

{ 0.0, 0.4524727969, 0.7885378413, 2.0, 0.851228449 },

{ 0.0, 0.5118664701, 1.061143666, 1.0, 0.249325278 },

{ 0.0, 0.9965170731, 0.2068265025, 2.0, 0.9210639 },

{ 1.0, 0.7801500652, 1.565742691, 4.0, 1.827419217 },

{ 0.0, 0.2906187973, 1.7036567871, 2.0, 2.842997725 },

{ 0.0, 0.1753704017, 0.7124397112, 2.0, 1.262811961 },

{ 1.0, 0.7796778064, 0.3478030777, 3.0, 0.90719801 },

{ 1.0, 0.3889356288, 1.1771452101, 4.0, 1.298438454 },

{ 0.0, 0.9374473374, 1.1879778663, 1.0, 1.854424331 },

{ 1.0, 0.1939157653, 0.093336341, 4.0, 0.166025681 },

{ 1.0, 0.2023756928, 0.0623724433, 3.0, 0.536441906 },

{ 0.0, 0.1691352043, 1.1587338657, 2.0, 2.15494096 },

{ 1.0, 0.0921523357, 0.2247394961, 3.0, 2.006995301 },

{ 0.0, 0.819186907, 0.0392292971, 1.0, 1.282159743 },

{ 0.0, 0.9458126165, 1.5268264762, 1.0, 1.960050194 },

{ 0.0, 0.1373939656, 1.8025095677, 2.0, 0.633624267 },

{ 0.0, 0.0555424779, 0.5022063241, 2.0, 0.639495004 },

{ 1.0, 0.3581428374, 1.4436954968, 3.0, 1.408938169 },

{ 1.0, 0.1189418568, 0.8011626904, 4.0, 0.210266769 },

{ 1.0, 0.5782070206, 1.58215921, 3.0, 2.648622607 },

{ 0.0, 0.460689794, 0.0704823257, 1.0, 1.45671379 },

{ 0.0, 0.6959878858, 0.2245675903, 2.0, 1.849515461 },

{ 0.0, 0.1930288749, 0.6296302159, 2.0, 2.597390946 },

{ 0.0, 0.4912149447, 0.0713489084, 1.0, 0.426487798 },

{ 0.0, 0.3496920248, 1.0135462089, 1.0, 2.962295362 },

{ 1.0, 0.7716284667, 0.5387295927, 4.0, 0.736709363 },

{ 1.0, 0.3463061263, 0.7819578522, 4.0, 1.597238498 },

{ 1.0, 0.6897138762, 1.2793166582, 4.0, 2.376281484 },

{ 0.0, 0.2818824656, 1.4379718141, 3.0, 2.627468417 },

{ 0.0, 0.5659798421, 1.6243568249, 1.0, 1.624809581 },

{ 0.0, 0.7965560518, 0.3933029529, 2.0, 0.415849269 },

{ 0.0, 0.9156922165, 1.0465683565, 1.0, 2.802914008 },

{ 0.0, 0.8299879942, 1.2237155279, 1.0, 2.611676934 },

{ 0.0, 0.0241912066, 1.9213823564, 1.0, 0.659596571 },

{ 0.0, 0.0948590154, 0.3609640412, 1.0, 1.287687748 },

{ 0.0, 0.230467916, 1.9421709292, 3.0, 2.290064565 },

{ 0.0, 0.2209760561, 0.4812708795, 1.0, 1.862393057 },

{ 0.0, 0.4704530933, 0.2644400774, 1.0, 1.960189529 },

{ 1.0, 0.1986645423, 0.48924731, 2.0, 0.333790415 },

{ 0.0, 0.9201823308, 1.4247304946, 1.0, 0.367654009 },

{ 1.0, 0.8118424334, 0.1017034058, 2.0, 2.001390385 },

{ 1.0, 0.1347265388, 0.1362061207, 3.0, 1.151431168 },

{ 0.0, 0.9884603191, 1.5700038988, 2.0, 0.717332943 },

{ 0.0, 0.1964012324, 0.4306495111, 1.0, 1.689056823 },

{ 1.0, 0.4031848807, 1.1251849262, 4.0, 1.977734922 },

{ 1.0, 0.0341882701, 0.3717348906, 4.0, 1.830587439 },

{ 0.0, 0.5073120815, 1.7860476542, 3.0, 0.142862822 },

{ 0.0, 0.6363195451, 0.6631249222, 2.0, 1.211148724 },

{ 1.0, 0.1642774614, 1.1963615627, 3.0, 0.843113448 },

{ 0.0, 0.0945515088, 1.8669327218, 1.0, 2.417198514 },

{ 0.0, 0.2364508687, 1.4035215094, 2.0, 2.964026097 },

{ 1.0, 0.7490112646, 0.1778408242, 4.0, 2.343119453 },

{ 1.0, 0.5193473259, 0.3090019161, 3.0, 1.300277323 }

};

 

float test_data[TSTROW][TCOL] = {

{ 0.0, 0.0093314846, 0.0315045565, 1.0, 2.043737003 },

{ 0.0, 0.0663379349, 0.0822378928, 2.0, 1.202557951 },

{ 1.0, 0.9728333529, 0.8778284262, 4.0, 0.205940753 },

{ 1.0, 0.7655418115, 0.3292853828, 4.0, 2.940793653 },

{ 1.0, 0.1610695978, 0.3832762009, 4.0, 1.96753633 },

{ 0.0, 0.0849463812, 1.4988451041, 2.0, 2.307902221 },

{ 0.0, 0.7932621511, 1.2098399368, 1.0, 0.886761862 },

{ 0.0, 0.1336030525, 0.2794256401, 2.0, 2.672175208 },

{ 0.0, 0.4758480834, 0.0441179522, 1.0, 0.399722717 },

{ 1.0, 0.1137434335, 0.922533263, 3.0, 1.927635631 }

};

 

    int i, j;
    int n_classes = 5;
    int response_col_idx = 3;
    int var_type[] = { 0, 2, 2, 0, 2 };
    float *predicted_values = NULL;
    float *fitted_values = NULL;
    float *probabilities = NULL;
    float *fitted_probabilities = NULL;
    float loss_value;
    float test_loss_value;
    /* min_n_node, min_split, max_x_cats, max_size, max_depth */
    int tree_control_params[] = { 10, 21, 10, 4, 10 };

    predicted_values = imsls_f_gradient_boosting(TROW, TCOL,
        &training_data[0][0], response_col_idx, var_type,
        IMSLS_SHRINKAGE, 0.05,
        IMSLS_RANDOM_SEED, 123457,
        IMSLS_LOSS_VALUE, &loss_value,
        IMSLS_CONTROL, tree_control_params,
        IMSLS_TEST_LOSS_VALUE, &test_loss_value,
        IMSLS_TEST_DATA, TSTROW, &test_data[0][0],
        IMSLS_FITTED_VALUES, &fitted_values,
        IMSLS_PROBABILITIES, &probabilities,
        IMSLS_FITTED_PROBABILITIES, &fitted_probabilities,
        0);

    printf("Training data fitted probabilities and actuals:\n\n");
    printf("Class: ");
    for (j = 0; j < n_classes; j++){
        printf("\t %d ", j);
    }
    printf("\tActual\n");
    for (i = 0; i < TROW; i++){
        for (j = 0; j < n_classes; j++){
            printf("\t%3.2f ", fitted_probabilities[i*n_classes + j]);
        }
        printf(" %3.0f\n", training_data[i][response_col_idx]);
    }
    printf("\nTraining data loss value=%f\n\n", loss_value);

    printf("Test data predicted probabilities and actuals:\n\n");
    printf("Class: ");
    for (j = 0; j < n_classes; j++){
        printf("\t %d ", j);
    }
    printf("\tActual\n");
    for (i = 0; i < TSTROW; i++){
        for (j = 0; j < n_classes; j++){
            printf("\t%3.2f ", probabilities[i*n_classes + j]);
        }
        printf(" %3.0f\n", test_data[i][response_col_idx]);
    }
    printf("\nTest data loss value=%f\n\n", test_loss_value);

    imsls_free(predicted_values);
    imsls_free(fitted_values);
    imsls_free(probabilities);
    imsls_free(fitted_probabilities);
}

 

Output

 

*** WARNING Error IMSLS_EMPTY_CLASS_LEVEL from imsls_f_gradient_boosting.

*** The count of class level 0 in the training data is zero.

 

 

Training data fitted probabilities and actuals:

 

Class: 0 1 2 3 4 Actual

0.02 0.39 0.35 0.17 0.06 3

0.02 0.45 0.33 0.14 0.06 2

0.02 0.39 0.40 0.13 0.06 1

0.02 0.06 0.16 0.36 0.40 4

0.02 0.44 0.34 0.15 0.05 1

0.02 0.10 0.17 0.40 0.32 3

0.02 0.07 0.19 0.32 0.39 3

0.02 0.45 0.29 0.17 0.06 1

0.02 0.08 0.19 0.32 0.39 3

0.02 0.46 0.34 0.12 0.05 1

0.02 0.09 0.16 0.43 0.29 3

0.02 0.42 0.30 0.20 0.06 3

0.02 0.11 0.19 0.37 0.31 3

0.02 0.43 0.33 0.16 0.05 2

0.02 0.43 0.36 0.13 0.05 2

0.02 0.06 0.13 0.46 0.34 4

0.02 0.44 0.35 0.13 0.06 2

0.02 0.37 0.39 0.15 0.07 2

0.02 0.46 0.32 0.15 0.05 2

0.02 0.08 0.19 0.31 0.39 4

0.02 0.55 0.28 0.12 0.04 1

0.02 0.46 0.38 0.10 0.05 1

0.02 0.45 0.33 0.14 0.06 1

0.02 0.10 0.20 0.37 0.31 2

0.02 0.52 0.28 0.14 0.04 1

0.02 0.05 0.15 0.43 0.36 3

0.02 0.07 0.16 0.40 0.35 4

0.02 0.44 0.35 0.14 0.05 3

0.02 0.45 0.37 0.11 0.05 2

0.02 0.08 0.18 0.32 0.40 4

0.02 0.09 0.20 0.35 0.34 3

0.02 0.09 0.16 0.43 0.30 3

0.02 0.07 0.16 0.37 0.38 3

0.02 0.08 0.19 0.37 0.34 4

0.02 0.55 0.30 0.10 0.03 1

0.02 0.09 0.19 0.39 0.31 2

0.02 0.44 0.31 0.18 0.06 2

0.02 0.52 0.29 0.13 0.04 1

0.02 0.47 0.32 0.15 0.05 1

0.02 0.07 0.16 0.41 0.35 3

0.02 0.55 0.28 0.12 0.04 1

0.02 0.07 0.17 0.34 0.40 4

0.02 0.53 0.31 0.10 0.04 1

0.02 0.41 0.41 0.10 0.06 2

0.02 0.10 0.19 0.37 0.31 2

0.02 0.39 0.40 0.13 0.06 2

0.02 0.43 0.31 0.18 0.06 1

0.02 0.41 0.39 0.12 0.06 2

0.02 0.41 0.38 0.12 0.06 1

0.02 0.50 0.35 0.10 0.03 2

0.02 0.07 0.16 0.41 0.35 4

0.02 0.43 0.31 0.18 0.06 2

0.02 0.44 0.35 0.13 0.06 2

0.02 0.09 0.22 0.36 0.31 3

0.02 0.07 0.16 0.34 0.40 4

0.02 0.48 0.35 0.11 0.05 1

0.02 0.08 0.19 0.37 0.34 4

0.02 0.08 0.19 0.37 0.34 3

0.02 0.44 0.34 0.14 0.06 2

0.02 0.09 0.16 0.43 0.30 3

0.02 0.54 0.30 0.10 0.03 1

0.02 0.47 0.32 0.14 0.05 1

0.02 0.37 0.36 0.19 0.06 2

0.02 0.41 0.38 0.13 0.05 2

0.02 0.06 0.14 0.43 0.36 3

0.02 0.06 0.18 0.34 0.40 4

0.02 0.06 0.13 0.43 0.36 3

0.02 0.55 0.28 0.12 0.04 1

0.02 0.52 0.32 0.11 0.04 2

0.02 0.44 0.34 0.13 0.06 2

0.02 0.48 0.33 0.12 0.04 1

0.02 0.45 0.34 0.13 0.06 1

0.02 0.07 0.21 0.29 0.41 4

0.02 0.07 0.17 0.34 0.40 4

0.02 0.07 0.17 0.37 0.37 4

0.02 0.42 0.31 0.18 0.06 3

0.02 0.46 0.29 0.17 0.06 1

0.02 0.46 0.38 0.09 0.05 2

0.02 0.48 0.34 0.11 0.05 1

0.02 0.47 0.34 0.12 0.05 1

0.02 0.38 0.35 0.19 0.06 1

0.02 0.52 0.29 0.14 0.04 1

0.02 0.43 0.31 0.18 0.06 3

0.02 0.51 0.30 0.12 0.05 1

0.02 0.55 0.27 0.12 0.04 1

0.02 0.07 0.19 0.32 0.40 2

0.02 0.41 0.38 0.14 0.06 1

0.02 0.11 0.19 0.37 0.31 2

0.02 0.09 0.17 0.43 0.29 3

0.02 0.41 0.37 0.14 0.06 2

0.02 0.52 0.30 0.12 0.05 1

0.02 0.07 0.16 0.35 0.40 4

0.02 0.09 0.16 0.40 0.33 4

0.02 0.40 0.35 0.17 0.06 3

0.02 0.42 0.39 0.11 0.05 2

0.02 0.05 0.17 0.40 0.36 3

0.02 0.42 0.30 0.20 0.06 1

0.02 0.43 0.33 0.16 0.06 2

0.02 0.10 0.19 0.37 0.31 4

0.02 0.10 0.16 0.39 0.32 3

 

Training data loss value=0.992967

 

Test data predicted probabilities and actuals:

 

Class: 0 1 2 3 4 Actual

0.02 0.52 0.28 0.14 0.04 1

0.02 0.50 0.31 0.14 0.04 2

0.02 0.07 0.22 0.28 0.41 4

0.02 0.10 0.19 0.37 0.32 4

0.02 0.09 0.16 0.39 0.34 4

0.02 0.43 0.30 0.19 0.06 2

0.02 0.39 0.41 0.12 0.05 1

0.02 0.51 0.29 0.14 0.04 2

0.02 0.48 0.33 0.12 0.04 1

0.02 0.07 0.16 0.37 0.38 3

 

Test data loss value=1.006980

 

Warning Errors

IMSLS_NO_PREDICTORS        The model has no predictors.

IMSLS_INVALID_LOSS_FCN     The loss function type # is invalid for a response variable of type #. Resetting to loss function type # = "#".

IMSLS_EMPTY_CLASS_LEVEL    The count of class level # in the training data is zero.