A Conceptual Vocabulary of Time Series Analysis





A

autocorrelation function (ACF)

The autocorrelation function (ACF) is the normalized autocovariance function defined as $$ \rho (\tau)=\frac{\gamma(\tau)}{\sqrt{\gamma(0)\,\gamma(0)}}=\frac{\gamma(\tau)}{\gamma(0)}, $$where \(\tau = |s-t|\) is the lag, or the amount of time by which the signal has been shifted.
The ACF measures the linear predictability of the series at time \( t \), say \( x_t \), using only the value \( x_s \). By the Cauchy–Schwarz inequality, which implies$$ {|\gamma(\tau)|}^2 \leq \gamma(0)^2, $$ it follows that$$ -1 \leq \rho(\tau) \leq 1. $$
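
As a rough illustration (not part of the definition), the short Python sketch below estimates the sample ACF of a simulated white noise series; the helper name `sample_acf` and the choice of 500 observations are illustrative assumptions, and NumPy is assumed to be available.

```python
import numpy as np

def sample_acf(x, max_lag=20):
    """Sample autocorrelations rho(0), ..., rho(max_lag) of a series x."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    x = x - x.mean()
    # gamma(tau) = (1/n) * sum over t of (x_{t+tau} - mean)(x_t - mean)
    gamma = np.array([np.sum(x[tau:] * x[:n - tau]) / n
                      for tau in range(max_lag + 1)])
    return gamma / gamma[0]                 # normalize by gamma(0)

rng = np.random.default_rng(0)
white_noise = rng.normal(size=500)
print(sample_acf(white_noise, max_lag=5))   # rho(0) = 1; near zero for tau >= 1
```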

autocovariance function

The autocovariance function is defined as the second moment product$$  \gamma (s,t)=cov(x_s,x_t)=E[(x_s- \mu_s)(x_t- \mu_t)]$$ for all \(s\) and \(t\). Note that \(\gamma(s,t)=\gamma(t,s)\) for all time points \(s\) and \(t\). The autocovariance measures the linear dependence between two points on the same series observed at different times. Recall from classical statistics that if \(\gamma (s,t)=0\), then \(x_t\) and \(x_s\) are not linearly related, but there still may be some dependence structure between them. If, however, \(x_t\) and \(x_s\) are bivariate normal, \(\gamma (s,t)=0\) ensures their independence. It is clear that, for \(s = t\), the autocovariance reduces to the (assumed finite) variance,$$\gamma (t,t)=E[(x_t- \mu_t)^2]=var(x_t).$$
Example: The white noise series \(w_t\) has \(E(w_t)=0\) and $$ \gamma_w (s,t)=cov(w_s,w_t)= \left\{ \begin{array}{ll} \sigma^2_w, & s = t,\\ 0, & s \ne t. \end{array} \right. $$
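
A small empirical check of this example, under the illustrative assumptions \(\sigma_w = 2\) and 10,000 simulated Gaussian paths of length 5 (NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma_w = 2.0
paths = rng.normal(scale=sigma_w, size=(10_000, 5))  # 10,000 paths, 5 time points

# Sample covariances across realizations: roughly sigma_w^2 on the diagonal
# (s = t) and roughly 0 off the diagonal (s != t).
print(np.round(np.cov(paths, rowvar=False), 2))
```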

C

Cauchy–Schwarz inequality

The Cauchy–Schwarz inequality states that for all vectors x and y of an inner product space it is true that $$ |\langle x,y\rangle| ^2 \leq \langle x,x\rangle \cdot \langle y,y\rangle$$  where \( \langle\cdot,\cdot\rangle \) is the inner product, also known as dot product. Equivalently, by taking the square root of both sides, and referring to the norms of the vectors, the inequality is written as$$ |\langle x,y\rangle| \leq \|x\| \cdot \|y\|.$$
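
A quick numeric illustration of the inequality, with two arbitrarily chosen vectors (NumPy assumed):

```python
import numpy as np

x = np.array([1.0, -2.0, 3.0])
y = np.array([4.0, 0.5, 1.0])

# |<x, y>| on the left, ||x|| * ||y|| on the right; the left never exceeds the right.
print(abs(np.dot(x, y)), np.linalg.norm(x) * np.linalg.norm(y))
```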

cross-correlation function

The cross-covariance function, scaled so that it takes values in \([-1, 1]\), is called the cross-correlation function and is given by$$ \rho_{xy} (s,t)=\frac{ \gamma_{xy}(s,t)}{\sqrt{\gamma_x(s,s)\gamma_y(t,t)}}. $$

cross-covariance function

The cross-covariance function between two series, \( x_t \) and \( y_t \) , is$$  \gamma_{xy} (s,t)=cov(x_s,y_t)=E[(x_s- \mu_{xs})(y_t- \mu_{yt})].$$
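
As a rough sketch tying the two previous entries together (not from the original text), the function below estimates the sample cross-correlation at non-negative lags; the name `sample_ccf`, the lag convention (correlating \(x_{t+\mathrm{lag}}\) with \(y_t\)), and the simulated input are illustrative assumptions.

```python
import numpy as np

def sample_ccf(x, y, max_lag=10):
    """Sample cross-correlations of x_{t+lag} with y_t for lag = 0, ..., max_lag."""
    x = np.asarray(x, float) - np.mean(x)
    y = np.asarray(y, float) - np.mean(y)
    n = len(x)
    gxx = np.sum(x * x) / n                 # sample autocovariance of x at lag 0
    gyy = np.sum(y * y) / n                 # sample autocovariance of y at lag 0
    return np.array([np.sum(x[lag:] * y[:n - lag]) / n / np.sqrt(gxx * gyy)
                     for lag in range(max_lag + 1)])

rng = np.random.default_rng(2)
w = rng.normal(size=500)
x, y = w[:-2], w[2:]                        # x_{t+2} = y_t by construction
print(np.round(sample_ccf(x, y, max_lag=4), 2))   # roughly [0, 0, 1, 0, 0]
```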

D

difference operator

Differences of order \(d\) are defined as

\( \nabla^d=(1-B)^d, \)

where \(B\) is the backshift operator, \(Bx_t = x_{t-1}\), and we may expand the operator \( (1-B)^d \) algebraically to evaluate it for higher integer values of \(d\). When \(d = 1\), we drop it from the notation and write \( \nabla x_t = x_t - x_{t-1} \).
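
As a small sketch of the expansion (an illustration, with an arbitrarily chosen series), first and second differences computed with NumPy agree with the algebraic expansion \( (1-B)^2 = 1 - 2B + B^2 \):

```python
import numpy as np

x = np.array([1.0, 4.0, 9.0, 16.0, 25.0])   # x_t = t^2

first = np.diff(x)                          # (1 - B) x_t = x_t - x_{t-1}
second = np.diff(x, n=2)                    # (1 - B)^2 x_t
manual = x[2:] - 2 * x[1:-1] + x[:-2]       # x_t - 2 x_{t-1} + x_{t-2}

print(first)                                # [3. 5. 7. 9.]
print(second, manual)                       # both [2. 2. 2.], as expected
```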


G

Gaussian white noise

See white noise.

M

mean function

The mean function of a time series process is defined as  $$\mu_{xt}=E[x_t] $$ provided that it exists.

Example 1: If \( w_t \) denotes a white noise series, then \( \mu_{wt}=E(w_t)=0 \) for all \( t \).

Example 2: Consider the random walk with drift model $$x_t= \delta t+ \sum\limits_{j = 1}^t {{w_j}}, \qquad t=1,2,\ldots$$

Because \( E(w_t)=0 \) for all \( t \) and \( \delta \) is a constant, we have $$ \mu_{xt}=E[x_t]=\delta t+\sum\limits_{j = 1}^t {E(w_j)}=\delta t, $$which is a straight line with slope \( \delta \).
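
A rough simulation of Example 2 (illustrative assumptions: \(\delta = 0.2\), \(\sigma_w = 1\), 5,000 paths of length 100, NumPy available) shows the average of many such paths tracking the line \(\delta t\):

```python
import numpy as np

rng = np.random.default_rng(3)
delta, n_paths, n_time = 0.2, 5_000, 100
w = rng.normal(size=(n_paths, n_time))

# x_t = delta * t + sum_{j <= t} w_j for each simulated path
x = delta * np.arange(1, n_time + 1) + np.cumsum(w, axis=1)

# Empirical mean at t = 1, 50, 100: roughly 0.2, 10.0, 20.0
print(np.round(x.mean(axis=0)[[0, 49, 99]], 2))
```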

R

random walk

If the value of the time series at time \( t \) is the value of the series at time \( t-1 \) plus a completely random movement determined by a white noise process \(w_t\), i.e. if$$x_t=x_{t-1}+w_t,$$ or equivalently $$x_t=\sum\limits_{j = 1}^t {{w_j}}, \qquad t=1,2,\ldots,$$then the generating process is called a random walk.
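
A tiny sketch (purely illustrative) confirming that the recursive form \(x_t = x_{t-1} + w_t\) and the cumulative-sum form produce the same path:

```python
import numpy as np

rng = np.random.default_rng(4)
w = rng.normal(size=200)

x_recursive = np.zeros_like(w)
x_recursive[0] = w[0]
for t in range(1, len(w)):
    x_recursive[t] = x_recursive[t - 1] + w[t]   # x_t = x_{t-1} + w_t

x_cumsum = np.cumsum(w)                          # x_t = w_1 + ... + w_t
print(np.allclose(x_recursive, x_cumsum))        # True
```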

S

stationarity

In the time series analysis context, a time series is said to be strictly stationary if the probabilistic behavior of every collection of values $$ \{x_{t_1},x_{t_2},\ldots,x_{t_k}\} $$is identical to that of the time-shifted series$$\{x_{t_1+h},x_{t_2+h},\ldots,x_{t_k+h}\}$$for all \( k=1,2,\ldots \), all time points \(t_1,t_2,\ldots,t_k\), and all time shifts \( h=0, \pm1, \pm2,\ldots \).

It is difficult to assess strict stationarity from data. Rather than imposing conditions on all possible distributions of a time series, a milder version, called weak stationarity, is used that imposes conditions only on the first two moments of the series: (a) the mean function \( \mu_t \) is constant and does not depend on time \(t\), and (b) the autocovariance function depends on \(s\) and \(t\) only through their difference \(|s-t|\).


Stationarity requires regularity in the mean and autocorrelation functions so that these quantities (at least) may be estimated by averaging. It should be clear that a strictly stationary time series with finite variance is also weakly stationary. The converse is not true in general. One important case where weak stationarity implies strict stationarity is when the time series is Gaussian [meaning all finite collections of the series are Gaussian].
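
An informal numerical contrast (illustrative simulation with NumPy; the path counts and lengths are arbitrary): Gaussian white noise has a variance that stays constant over time, consistent with weak stationarity, while the random walk's variance grows with \(t\), so the random walk is not stationary:

```python
import numpy as np

rng = np.random.default_rng(5)
noise = rng.normal(size=(5_000, 200))    # 5,000 white noise paths of length 200
walks = np.cumsum(noise, axis=1)         # corresponding random walk paths

print(np.round(noise.var(axis=0)[[9, 99, 199]], 2))  # roughly [1, 1, 1]
print(np.round(walks.var(axis=0)[[9, 99, 199]], 1))  # roughly [10, 100, 200]
```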

