Selects the best multiple linear regression models.
For a list of all members of this type, see SelectionRegression Members.
System.Object
Imsl.Stat.SelectionRegression
Public static (Shared in Visual Basic) members of this type are safe for multithreaded operations. Instance members are not guaranteed to be thread-safe.
Class SelectionRegression
finds the best subset regressions for a regression problem with three or more independent variables. Typically, the intercept is forced into all models and is not a candidate variable. In this case, a sum-of-squares and crossproducts matrix for the independent and dependent variables corrected for the mean is computed internally. Optionally, SelectionRegression
supports user-calculated sum-of-squares and crossproducts matrices; see the description of the Compute method.
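A minimal usage sketch for the typical case, in which the corrected sum-of-squares and crossproducts matrix is formed internally from the data, is shown below. The constructor argument nCandidate and the Compute method are taken from this page; the particular Compute(x, y) overload and the data values are assumptions for illustration, so consult the SelectionRegression Members page for the exact signatures.

using System;
using Imsl.Stat;

public class SelectionRegressionExample
{
    public static void Main(string[] args)
    {
        // Hypothetical data: 8 observations of 3 candidate variables.
        // Each row of x is one observation; y holds the dependent variable.
        double[,] x =
        {
            {  7.0, 26.0,  6.0 },
            {  1.0, 29.0, 15.0 },
            { 11.0, 56.0,  8.0 },
            { 11.0, 31.0,  8.0 },
            {  7.0, 52.0,  6.0 },
            { 11.0, 55.0,  9.0 },
            {  3.0, 71.0, 17.0 },
            {  1.0, 31.0, 22.0 }
        };
        double[] y = { 78.5, 74.3, 104.3, 87.6, 95.9, 109.2, 102.7, 72.5 };

        // nCandidate is the number of candidate (independent) variables.
        SelectionRegression selection = new SelectionRegression(3);

        // Find the best subset regressions; the intercept is forced into
        // all models and the corrected sum-of-squares and crossproducts
        // matrix is formed internally. (The exact overload used here is an
        // assumption; see the Compute method documentation.)
        selection.Compute(x, y);
    }
}

Retrieving the selected subsets and criterion values is done through the members listed on the SelectionRegression Members page.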
"Best" is defined by using one of the following three criteria:
Here, n is equal to the sum of the frequencies (or the number of rows in x if frequencies are not specified in the Compute
method), and is the total sum-of-squares. k is the number of candidate or independent variables, represented as the nCandidate argument in the SelectionRegression
constructor. is the error sum-of-squares in a model containing p regression parameters including (or p - 1 of the k candidate variables). Variable
Class SelectionRegression
is based on the algorithm of Furnival and Wilson (1974). This algorithm finds the maximum number of good saved candidate regressions for each possible subset size. For more details, see method MaximumGoodSaved. These regressions are used to identify a set of best regressions. In large problems, many regressions are not computed. They may be rejected without computation based on results for other subsets; this yields an efficient technique for considering all possible regressions.
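Purely to illustrate the bookkeeping of saving a fixed number of good candidate subsets for each subset size, here is a small standalone sketch. It is not the Furnival and Wilson algorithm itself, which prunes most subsets without evaluating them; the scoring delegate and all names are hypothetical.

using System;
using System.Collections.Generic;
using System.Linq;

public static class BestSubsetsBySize
{
    // For each subset size 1..k, keep the maxGoodSaved highest-scoring
    // subsets of candidate variable indices. The score delegate stands in
    // for a criterion such as R-squared.
    public static Dictionary<int, List<int[]>> Find(
        int k, int maxGoodSaved, Func<int[], double> score)
    {
        var best = new Dictionary<int, List<int[]>>();
        for (int mask = 1; mask < (1 << k); mask++)
        {
            // Decode the bit mask into the indices of the variables in this subset.
            int[] subset = Enumerable.Range(0, k)
                                     .Where(j => (mask & (1 << j)) != 0)
                                     .ToArray();
            int size = subset.Length;
            if (!best.ContainsKey(size))
            {
                best[size] = new List<int[]>();
            }
            best[size].Add(subset);

            // Retain only the maxGoodSaved best subsets of this size.
            best[size] = best[size].OrderByDescending(score)
                                   .Take(maxGoodSaved)
                                   .ToList();
        }
        return best;
    }
}

In the class itself, the MaximumGoodSaved method mentioned above governs how many regressions of each subset size are retained.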
There are cases when the user may want to input the variance-covariance matrix rather than allow it to be calculated. This can be accomplished using the appropriate Compute
method. Three situations in which the user may want to do this are as follows:

1. The intercept is not in the model. A raw (uncorrected) sum-of-squares and crossproducts matrix for the independent and dependent variables is required for cov. Argument nObservations must be set to 1 greater than the number of observations.

2. An intercept is a candidate variable. A raw (uncorrected) sum-of-squares and crossproducts matrix for the constant regressor (equal to 1.0), the independent variables, and the dependent variable is required for cov. In this case, cov contains one additional row and column corresponding to the constant regressor (see the sketch after this list). This row and column contain the sum-of-squares and crossproducts of the constant regressor with the independent and dependent variables. The remaining elements in cov are the same as in the previous case. Argument nObservations must be set to 1 greater than the number of observations.

3. There are m variables to be forced into the models. A sum-of-squares and crossproducts matrix, adjusted for the m variables, is required (computed by regressing the candidate variables on the variables to be forced into the model). Argument nObservations must be set to m less than the number of observations.
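For the second situation above, the following standalone sketch shows one way to form the raw sum-of-squares and crossproducts matrix by hand. The column ordering used here (constant regressor, then independent variables, then the dependent variable) and all names are assumptions for illustration only; check the relevant Compute overload for the layout it expects.

using System;

public static class RawSscp
{
    // Assemble B = [1, X, y] column-wise and return cov = B^T B, the raw
    // (uncorrected) sum-of-squares and crossproducts matrix for the
    // constant regressor, the independent variables, and the dependent
    // variable.
    public static double[,] WithConstantRegressor(double[,] x, double[] y)
    {
        int n = x.GetLength(0);       // number of observations
        int k = x.GetLength(1);       // number of candidate variables
        int m = k + 2;                // constant + k candidates + dependent

        // Build the augmented data matrix B.
        double[,] b = new double[n, m];
        for (int i = 0; i < n; i++)
        {
            b[i, 0] = 1.0;                       // constant regressor
            for (int j = 0; j < k; j++)
            {
                b[i, 1 + j] = x[i, j];           // independent variables
            }
            b[i, m - 1] = y[i];                  // dependent variable
        }

        // cov = B^T B.
        double[,] cov = new double[m, m];
        for (int r = 0; r < m; r++)
        {
            for (int c = 0; c < m; c++)
            {
                double sum = 0.0;
                for (int i = 0; i < n; i++)
                {
                    sum += b[i, r] * b[i, c];
                }
                cov[r, c] = sum;
            }
        }
        return cov;
    }
}

With a user-supplied matrix like this, argument nObservations must be set to 1 greater than the number of observations, as noted in the second situation above.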
SelectionRegression can save considerable CPU time over explicitly computing all possible regressions. However, the class has some limitations that can cause unexpected results for users who are unaware of them.
1. Because the candidate models are enumerated and their model numbers are stored as floating-point values, SelectionRegression can produce incorrect results when the number of candidate variables is sufficiently large.

2. SelectionRegression eliminates some subsets of candidate variables by obtaining lower bounds on the error sum-of-squares from fitting larger models. First, the full model containing all independent variables is fit sequentially using a forward stepwise procedure in which one variable enters the model at a time, and criterion values and model numbers for all the candidate variables that can enter at each step are stored. If linearly dependent variables are removed from the full model, a "VariablesDeleted" warning is issued. In this case, some submodels that contain variables removed from the full model because of linear dependency can be overlooked if they have not already been identified during the initial forward stepwise procedure. If this warning is issued and you want the variables that were removed from the full model to be considered in smaller models, rerun the program with a set of linearly independent variables.

Namespace: Imsl.Stat
Assembly: ImslCS (in ImslCS.dll)
SelectionRegression Members | Imsl.Stat Namespace | Example 1 | Example 2