Sets the backcasting option.
Parameters

maxBackcast
    int scalar containing the maximum length of backcasting; must be greater than or equal to 0. By default, maxBackcast = 10.

tolerance
    double scalar containing the tolerance level used to determine convergence of the backcast algorithm. Typically, tolerance is set to a fraction of an estimate of the standard deviation of the time series. By default, tolerance = 0.01 * standard deviation of z.
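Example

The following sketch (not part of the original reference) illustrates one way SetBackcasting might be called when fitting an ARMA model. Only SetBackcasting(maxBackcast, tolerance) is taken from this page; the ARMA(p, q, z) constructor, the Compute call, and the sample data are assumptions based on the class's typical usage and should be checked against the ARMA class reference. The tolerance argument is set to 0.01 times the sample standard deviation of z, matching the documented default.

    using System;
    using Imsl.Stat;

    public class ARMABackcastExample
    {
        public static void Main(string[] args)
        {
            // Hypothetical time series; replace with real observations.
            double[] z = { 1.0, 1.2, 0.8, 1.5, 1.1, 0.9, 1.3, 1.4, 1.0, 1.2,
                           0.7, 1.6, 1.1, 0.95, 1.25, 1.05, 1.35, 0.85, 1.45, 1.15 };

            // Construct an ARMA(2,1) model for the series z
            // (constructor signature assumed from the class's usual pattern).
            ARMA arma = new ARMA(2, 1, z);

            // Enable backcasting: at most 10 backcast values (the default length),
            // with convergence tolerance equal to 1% of the series' standard deviation.
            double stdDev = SampleStdDev(z);
            arma.SetBackcasting(10, 0.01 * stdDev);

            // Fit the model; the estimates can then be read from the ARMA object
            // (Compute is assumed here; see the ARMA class reference for details).
            arma.Compute();
        }

        // Sample standard deviation, used only to derive the tolerance argument.
        private static double SampleStdDev(double[] x)
        {
            double mean = 0.0;
            foreach (double v in x) mean += v;
            mean /= x.Length;

            double ss = 0.0;
            foreach (double v in x) ss += (v - mean) * (v - mean);
            return Math.Sqrt(ss / (x.Length - 1));
        }
    }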
See Also
ARMA Class | Imsl.Stat Namespace