## A Conceptual Vocabulary of Time Series Analysis


## A

## autocorrelation function (ACF)

The autocorrelation function (ACF) is the normalized autocovariance function, defined as $$ \rho(\tau)=\frac{\gamma(\tau)}{\sqrt{\gamma(0)\cdot\gamma(0)}}=\frac{\gamma(\tau)}{\gamma(0)}, $$ where \(\tau = |s-t|\) is the lag, or the amount of time by which the signal has been shifted.
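As an illustration (not part of the original glossary), the ACF can be estimated from data with the biased (divide-by-\(n\)) autocovariance estimator; a minimal NumPy sketch, where `sample_acf` is a hypothetical helper name:

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample ACF: sample autocovariance at each lag divided by gamma(0)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    mu = x.mean()
    gamma0 = np.sum((x - mu) ** 2) / n  # sample variance (divide-by-n)
    acf = []
    for tau in range(max_lag + 1):
        # Biased autocovariance estimator at lag tau
        gamma_tau = np.sum((x[: n - tau] - mu) * (x[tau:] - mu)) / n
        acf.append(gamma_tau / gamma0)
    return np.array(acf)

rho = sample_acf(np.random.default_rng(0).standard_normal(500), 10)
# rho[0] is exactly 1 by construction; for white noise the other lags are near 0.
```

The divide-by-\(n\) (rather than \(n-\tau\)) normalization guarantees \(|\rho(\tau)|\le 1\) for every lag.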

## autocovariance function

The autocovariance function is defined as the second moment product $$ \gamma(s,t)=cov(x_s,x_t)=E[(x_s-\mu_s)(x_t-\mu_t)] $$ for all time points \(s\) and \(t\). The autocovariance measures the linear dependence between two points on the same series observed at different times. Recall from classical statistics that if \(\gamma(s,t)=0\), then \(x_t\) and \(x_s\) are not linearly related, but there may still be some dependence structure between them. If, however, \(x_t\) and \(x_s\) are bivariate normal, \(\gamma(s,t)=0\) ensures their independence. It is clear that, for \(s = t\), the autocovariance reduces to the (assumed finite) variance, $$ \gamma(t,t)=E[(x_t-\mu_t)^2]=var(x_t). $$ Example: the white noise series \(w_t\) has \(E(w_t)=0\) and $$ \gamma_w(s,t)=cov(w_s,w_t)=\begin{cases} \sigma^2_w, & s = t, \\ 0, & s \ne t. \end{cases} $$
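The white noise example can be checked by simulation (an illustration added here, not part of the original entry): averaging across many realizations, the sample autocovariance matrix should be approximately \(\sigma_w^2\) on the diagonal and zero elsewhere.

```python
import numpy as np

rng = np.random.default_rng(42)
sigma_w = 2.0
# Many independent realizations of a short white noise series:
# rows are realizations, columns are time points s = 0..4.
w = rng.normal(0.0, sigma_w, size=(200_000, 5))

# Sample autocovariance gamma(s, t), estimated across realizations.
gamma = np.cov(w, rowvar=False)  # 5 x 5 matrix
# Diagonal entries approximate sigma_w^2 = 4; off-diagonals approximate 0.
```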

## C

## Cauchy–Schwarz inequality

The Cauchy–Schwarz inequality states that, for all vectors \(u\) and \(v\) of an inner product space, \( |\langle u,v\rangle|^2 \le \langle u,u\rangle\,\langle v,v\rangle \). Applied to autocovariances it gives \( |\gamma(s,t)|^2 \le \gamma(s,s)\,\gamma(t,t) \), which is what bounds correlation functions in \([-1,1]\).
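A quick numerical check of the inequality for ordinary Euclidean vectors (an illustration added here, not from the original entry):

```python
import numpy as np

rng = np.random.default_rng(5)
u = rng.standard_normal(10)
v = rng.standard_normal(10)

# Cauchy–Schwarz: <u, v>^2 <= <u, u> * <v, v>
lhs = np.dot(u, v) ** 2
rhs = np.dot(u, u) * np.dot(v, v)
# lhs never exceeds rhs, with equality only when u and v are parallel.
```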

## cross-correlation function

The cross-covariance function scaled to lie in \([-1, 1]\) is called the cross-correlation function (CCF), $$ \rho_{xy}(s,t)=\frac{\gamma_{xy}(s,t)}{\sqrt{\gamma_x(s,s)\,\gamma_y(t,t)}}. $$

## cross-covariance function

The cross-covariance function between two series \(x_t\) and \(y_t\) is defined as $$ \gamma_{xy}(s,t)=cov(x_s,y_t)=E[(x_s-\mu_{xs})(y_t-\mu_{yt})]. $$
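A sample version of the scaled cross-covariance can be sketched as follows (an illustration added here; the helper name `sample_ccf` is my own). The CCF peaks at the lag by which one series leads the other:

```python
import numpy as np

def sample_ccf(x, y, max_lag):
    """Sample cross-correlation at lags h = 0..max_lag (y lagging x)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    y = np.asarray(y, dtype=float) - np.mean(y)
    n = len(x)
    denom = np.sqrt(np.sum(x ** 2) * np.sum(y ** 2))
    return np.array(
        [np.sum(x[: n - h] * y[h:]) / denom for h in range(max_lag + 1)]
    )

rng = np.random.default_rng(1)
x = rng.standard_normal(1000)
y = np.roll(x, 3)           # y_t = x_{t-3}: y lags x by three steps
ccf = sample_ccf(x, y, 5)   # strong peak at lag h = 3
```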

## D

## difference operator

\( \nabla^d=(1-B)^d \), where \(B\) is the backshift operator, \(Bx_t=x_{t-1}\), and we may expand the operator \( (1-B)^d \) algebraically to evaluate it for higher integer values of \(d\). When \(d = 1\), we drop it from the notation: \( \nabla x_t = x_t - x_{t-1} \).
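As a quick illustration (added here, not part of the original entry), `numpy.diff` applies \((1-B)^d\) for integer \(d\); differencing a quadratic trend twice leaves a constant:

```python
import numpy as np

x = np.array([1.0, 4.0, 9.0, 16.0, 25.0])  # x_t = t^2, t = 1..5

# First difference: (1 - B) x_t = x_t - x_{t-1}
d1 = np.diff(x, n=1)  # [3, 5, 7, 9]
# Second difference: (1 - B)^2 x_t = x_t - 2 x_{t-1} + x_{t-2}
d2 = np.diff(x, n=2)  # [2, 2, 2] -- a quadratic trend is removed
```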

## G

## Gaussian white noise

A white noise series in which the \(w_t\) are independent normal random variables, \(w_t \sim \mathrm{iid}\; N(0,\sigma_w^2)\). See *white noise*.

## M

## mean function

The mean function of a time series process is defined as $$ \mu_{xt}=E[x_t], $$ provided that it exists. Example (random walk with drift): for \( x_t=\delta t+\sum_{j=1}^{t} w_j \), because \( E(w_t)=0 \) for all \( t \) and \( \delta \) is a constant, we have $$ \mu_{xt}=E[x_t]=\delta t+\sum\limits_{j = 1}^{t} E(w_j)=\delta t, $$ which is a straight line with slope \( \delta \).
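The mean function can be approximated by averaging many independent realizations of the process; a NumPy sketch added here for illustration (the drift value \(\delta = 0.5\) is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(7)
delta, n_steps, n_reps = 0.5, 50, 20_000

# Each row is one realization of x_t = delta * t + sum_{j<=t} w_j, t = 1..n_steps
w = rng.standard_normal((n_reps, n_steps))
t = np.arange(1, n_steps + 1)
x = delta * t + np.cumsum(w, axis=1)

# Averaging over realizations approximates mu_xt = delta * t, a line of slope delta.
mu_hat = x.mean(axis=0)
```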

## R

## random walk

If the value of the time series at time \( t \) is the value of the series at time \( t-1 \) plus a completely random movement determined by a white noise process \( w_t \), i.e. if $$ x_t=x_{t-1}+w_t, $$ or equivalently (with \( x_0=0 \)) $$ x_t=\sum\limits_{j = 1}^{t} w_j, \quad t=1,2,\ldots, $$ then the generating process is called a random walk.
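The two forms of the definition can be checked against each other numerically; a minimal NumPy sketch added here for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
w = rng.standard_normal(n)  # white noise increments

# Cumulative-sum form: x_t = w_1 + ... + w_t (with x_0 = 0)
x = np.cumsum(w)

# Recursive form: x_t = x_{t-1} + w_t
x_rec = np.zeros(n)
x_rec[0] = w[0]
for t in range(1, n):
    x_rec[t] = x_rec[t - 1] + w[t]
# Both constructions produce the same series.
```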

## S

## stationarity

In the time series analysis context, a time series is said to be **strictly stationary** if the probabilistic behavior of every collection of values $$ \{x_{t_1},x_{t_2},\ldots,x_{t_k}\} $$ is identical to that of the time-shifted series $$ \{x_{t_1+h},x_{t_2+h},\ldots,x_{t_k+h}\} $$ for all \( k=1,2,\ldots \), all time points \( t_1,t_2,\ldots,t_k \), and all time shifts \( h=0,\pm 1,\pm 2,\ldots \). It is difficult to assess strict stationarity from data. Rather than imposing conditions on all possible distributions of a time series, a milder version, called **(weak) stationarity**, requires regularity in the mean and autocorrelation functions so that these quantities (at least) may be estimated by averaging. It should be clear that a strictly stationary, finite variance, time series is also weakly stationary. The converse is not true in general. One important case where weak stationarity implies strict stationarity is when the time series is Gaussian [meaning all finite collections of the series are Gaussian].
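The distinction can be seen empirically: across many simulated realizations, white noise has a time-constant variance (consistent with weak stationarity), while the variance of a random walk grows with \( t \), so a random walk cannot be stationary. A NumPy sketch added here for illustration:

```python
import numpy as np

rng = np.random.default_rng(11)
n_reps, n_steps = 50_000, 100

w = rng.standard_normal((n_reps, n_steps))  # white noise realizations
rw = np.cumsum(w, axis=1)                   # random walk realizations

# Variance at each time point, estimated across realizations.
var_w = w.var(axis=0)    # approximately 1 at every t: time-invariant
var_rw = rw.var(axis=0)  # approximately t: grows with time, so not stationary
```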