Returns the standard errors of the cross-correlations between the time series x and y.
stderrMethod — an int specifying the method used to compute the standard errors of the cross-correlations between the time series x and y.

Return Value — a double array of length 2 * maximumLag + 1 containing the standard errors of the cross-correlations between the time series x and y.
The standard error of the cross-correlation between x and y at lag k, where k = -maximumLag, ..., 0, ..., maximumLag, is stored at output array index k + maximumLag; lags -maximumLag through maximumLag therefore occupy indices 0 through 2 * maximumLag.
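As a concrete illustration of this layout, the sketch below maps a lag k to its position in the returned array. It is plain index arithmetic and assumes nothing about the library beyond the correspondence stated above.

```csharp
using System;

class LagIndexSketch
{
    // Lag k in {-maximumLag, ..., maximumLag} maps to index k + maximumLag
    // in the returned array of length 2 * maximumLag + 1.
    static int IndexForLag(int k, int maximumLag) => k + maximumLag;

    static void Main()
    {
        int maximumLag = 3; // illustrative value
        for (int k = -maximumLag; k <= maximumLag; k++)
        {
            Console.WriteLine("lag {0,2} -> index {1}", k, IndexForLag(k, maximumLag));
        }
    }
}
```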
The stderrMethod parameter determines how the standard errors are computed. If stderrMethod is set to Bartletts, the standard errors are computed using Bartlett's formula. If stderrMethod is set to BartlettsNoCC, Bartlett's formula is used under the assumption that the two series have no cross-correlation.
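For orientation only: in standard references (for example, Box and Jenkins), Bartlett's approximation under the no-cross-correlation hypothesis takes the form Var(r_xy(k)) ≈ (1/(n - |k|)) Σ_i ρ_xx(i) ρ_yy(i). The sketch below evaluates that textbook form directly from sample autocorrelations; the truncation point of the infinite sum (maxTerms) and all helper names are assumptions made for illustration, and the library's exact computation may differ.

```csharp
using System;

class BartlettNoCcSketch
{
    // Sample autocorrelation of series z at lag k (biased estimator,
    // as is conventional for correlogram-based formulas).
    static double Autocorr(double[] z, int k)
    {
        int n = z.Length;
        double mean = 0.0;
        foreach (double v in z) mean += v;
        mean /= n;

        double c0 = 0.0, ck = 0.0;
        for (int t = 0; t < n; t++) c0 += (z[t] - mean) * (z[t] - mean);
        for (int t = 0; t < n - k; t++) ck += (z[t] - mean) * (z[t + k] - mean);
        return ck / c0;
    }

    // Bartlett's approximate standard error of the cross-correlation at
    // lag k under the hypothesis of no cross-correlation:
    //   Var(r_xy(k)) ~ (1/(n - |k|)) * sum_i rho_xx(i) * rho_yy(i),
    // with the sum truncated at +/- maxTerms here (an assumption; the
    // library's exact truncation rule is not stated on this page).
    static double StdErrNoCc(double[] x, double[] y, int k, int maxTerms)
    {
        int n = x.Length;
        double sum = 0.0;
        for (int i = -maxTerms; i <= maxTerms; i++)
        {
            int a = Math.Abs(i); // rho(-i) = rho(i) for autocorrelations
            sum += Autocorr(x, a) * Autocorr(y, a);
        }
        return Math.Sqrt(sum / (n - Math.Abs(k)));
    }

    static void Main()
    {
        double[] x = { 1.2, 0.8, 1.5, 2.1, 1.9, 2.5, 2.2, 3.0, 2.8, 3.4 };
        double[] y = { 0.4, 0.9, 0.7, 1.3, 1.1, 1.6, 1.8, 1.5, 2.2, 2.0 };
        for (int k = -2; k <= 2; k++)
            Console.WriteLine("lag {0,2}: se ~ {1:F4}", k, StdErrNoCc(x, y, k, 3));
    }
}
```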
| Exception Type | Condition |
|---|---|
| NonPosVarianceException | Thrown if the problem is ill-conditioned. |
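Since NonPosVarianceException is the one documented failure mode, a caller would typically guard the call as in the hedged sketch below. The constructor shape (two series plus a maximum lag), the method name GetStandardErrors, and the StdErr qualification of the Bartletts value are assumptions based on the surrounding class documentation, not confirmed by this page; consult the CrossCorrelation class reference for the exact signatures.

```csharp
using System;
using Imsl.Stat;

class StandardErrorsGuard
{
    static void Main()
    {
        double[] x = { 1.2, 0.8, 1.5, 2.1, 1.9, 2.5, 2.2, 3.0 };
        double[] y = { 0.4, 0.9, 0.7, 1.3, 1.1, 1.6, 1.8, 1.5 };

        try
        {
            // Assumed constructor and method name; the named Bartletts
            // value may live elsewhere than CrossCorrelation.StdErr.
            CrossCorrelation cc = new CrossCorrelation(x, y, 2);
            double[] se = cc.GetStandardErrors(CrossCorrelation.StdErr.Bartletts);
            Console.WriteLine("standard errors returned: {0}", se.Length);
        }
        catch (NonPosVarianceException e)
        {
            // Documented condition: the problem is ill-conditioned.
            Console.Error.WriteLine("ill-conditioned problem: {0}", e.Message);
        }
    }
}
```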