IMSL C# Numerical Library

CrossCorrelation.GetStandardErrors Method 

Returns the standard errors of the cross-correlations between the time series x and y.

public double[] GetStandardErrors(
   StdErr stderrMethod
);

Parameters

stderrMethod
A StdErr value specifying the method used to compute the standard errors of the cross-correlations between the time series x and y.

Return Value

A double array of length 2 * maximumLag + 1 containing the standard errors of the cross-correlations between the time series x and y.

Remarks

The standard error of the cross-correlation between x and y at lag k, where k = -maximumLag, ..., 0, ..., maximumLag, is stored at output array index k + maximumLag; that is, indices 0, 1, ..., 2*maximumLag hold the standard errors for lags -maximumLag through maximumLag (see the sketch below).

The stderrMethod parameter determines how the standard errors of the cross-correlations are computed. If stderrMethod is set to StdErr.Bartletts, Bartlett's formula is used. If stderrMethod is set to StdErr.BartlettsNoCC, Bartlett's formula is used under the assumption that all cross-correlations are zero.
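The following is a minimal sketch of calling this method. The input series x and y, the value of maximumLag, the program scaffolding, and the constructor signature CrossCorrelation(x, y, maximumLag) are assumptions made for illustration; the enumeration values are referenced through CrossCorrelation.StdErr, consistent with the signature above.

using System;
using Imsl.Stat;

public class CrossCorrelationStandardErrors
{
    public static void Main(string[] args)
    {
        // Example inputs (assumed); any two series of equal length work.
        double[] x = { 1.0, 2.0, 3.0, 4.0, 5.0, 4.0, 3.0, 2.0, 1.0, 2.0 };
        double[] y = { 2.0, 3.0, 4.0, 5.0, 4.0, 3.0, 2.0, 1.0, 2.0, 3.0 };
        int maximumLag = 3;

        // Assumed constructor: series x, series y, and the lag window.
        CrossCorrelation cc = new CrossCorrelation(x, y, maximumLag);

        // Standard errors from Bartlett's formula.
        double[] se = cc.GetStandardErrors(CrossCorrelation.StdErr.Bartletts);

        // Same formula, assuming the cross-correlations are zero.
        double[] seNoCC = cc.GetStandardErrors(CrossCorrelation.StdErr.BartlettsNoCC);

        // Both arrays have length 2 * maximumLag + 1, and lag k maps to
        // array index k + maximumLag.
        for (int k = -maximumLag; k <= maximumLag; k++)
        {
            Console.WriteLine("lag {0,2}: {1,10:F6} (Bartletts) {2,10:F6} (BartlettsNoCC)",
                k, se[k + maximumLag], seNoCC[k + maximumLag]);
        }
    }
}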

Exceptions

Exception Type            Condition
NonPosVarianceException   Thrown if the problem is ill-conditioned.
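As a hedged sketch of guarding against this condition (reusing the assumed inputs x, y, and maximumLag from the example above):

try
{
    CrossCorrelation cc = new CrossCorrelation(x, y, maximumLag);
    double[] se = cc.GetStandardErrors(CrossCorrelation.StdErr.Bartletts);
}
catch (NonPosVarianceException e)
{
    // Reached when the problem is ill-conditioned, per the table above.
    Console.Error.WriteLine("Standard errors unavailable: " + e.Message);
}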

See Also

CrossCorrelation Class | Imsl.Stat Namespace