Instructor: Prof. Dr. Mustafa Y. Ata   Office hours: By appointment

Timetable: Thursday 13:00-14:10, Lab 245; Friday 13:55-15:30, Faculty of Science, Class #205

This course is given at Gazi University in the Spring term to final-year statistics undergraduates who have completed at least three courses covering basic probability and statistical inference.

Prerequisites: Mathematical Statistics,  Simulation Techniques, Regression Analysis

Aims: The objective of this course is to learn and apply statistical methods for the analysis of data observed over time. Our central challenge is to account for the correlation between measurements that are close in time. The emphasis will be on practical techniques for data analysis, though appropriate stochastic models for time series will be introduced as necessary to give a firm basis for practical modelling. The programming language R will be used to implement the methods.

Objectives: At the end of the course, the student should be able to

  • Compute and interpret a correlogram and a sample spectrum
  • Derive the properties of ARIMA models
  • Choose an appropriate ARIMA model for a given set of data and fit the model using an appropriate package
  • Compute forecasts for a variety of linear methods and models.
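As a small illustration of the first objective, a correlogram is simply the sample autocorrelation r_k plotted against lag k. The sketch below computes the r_k values in plain Python; the course itself uses R, where acf() produces the correlogram directly, and the function name here is illustrative rather than part of the course materials.

```python
# Sample autocorrelation values (the heights of a correlogram).
# Minimal illustrative sketch; in R this is acf(x).

def sample_acf(x, max_lag):
    """Return [r_0, r_1, ..., r_max_lag] for the series x."""
    n = len(x)
    mean = sum(x) / n
    # Lag-0 autocovariance times n: the common denominator for every r_k.
    c0 = sum((v - mean) ** 2 for v in x)
    acf = []
    for k in range(max_lag + 1):
        # Lag-k sample autocovariance (unnormalized).
        ck = sum((x[t] - mean) * (x[t + k] - mean) for t in range(n - k))
        acf.append(ck / c0)
    return acf

if __name__ == "__main__":
    # A short trending series: neighbouring values are similar,
    # so r_1 is large and positive; r_0 is always 1.
    series = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
    print([round(r, 3) for r in sample_acf(series, 3)])
```

For a trending series like this one, the autocorrelations die out only slowly with the lag, which is exactly the pattern a correlogram is used to detect.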

Topics covered in this course include methods for:

  • Modeling univariate time series data with autoregressive integrated moving average models (denoted ARIMA models, sometimes called Box-Jenkins models).
  • Tools for model identification, model estimation, and assessment of the suitability of the model.
  • Using a model for forecasting and determining prediction intervals for forecasts.
  • Smoothing methods and trend/seasonal decomposition methods. Smoothing methods include moving averages, exponential smoothing, and Lowess smoothers.
  • Relationships between time series variables, cross correlation, lagged regression models
  • Intervention Analysis (basically before/after analysis of a time series to assess effect of a new policy, treatment, etc.)
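Two of the smoothing methods named above can be sketched in a few lines. The plain-Python versions below are illustrative only (the function names are assumptions); in R, filter() and HoltWinters() provide these tools.

```python
# Illustrative sketches of two smoothers from the topic list above.

def moving_average(x, window):
    """Trailing moving average of width `window`."""
    return [sum(x[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(x))]

def exp_smooth(x, alpha):
    """Simple exponential smoothing: s_t = alpha*x_t + (1 - alpha)*s_{t-1}.

    Larger alpha tracks the data more closely; smaller alpha smooths more.
    """
    s = [x[0]]  # initialize with the first observation
    for v in x[1:]:
        s.append(alpha * v + (1 - alpha) * s[-1])
    return s
```

Note the trade-off both smoothers share: a wider window (or smaller alpha) removes more irregular variation but reacts more slowly to genuine changes in level.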

Learning outcomes

  • Obtain a technical understanding and appreciation of time series methods.
  • Use the programming language R to apply appropriate models to real data.
  • Carry out forecasting calculations with several models, including state-space models.
  • Obtain an appreciation of Bayesian methods in time series.
  • Perform complete statistical analyses of real data and interpret the results.

Teaching methods: 14 lectures with hands-on exercises.

Assessment: One formal 2-hour written examination and one term project.

The course materials presented in the course pages consist of selected topics, reorganized mainly from

with some supplements.

Software: The course assignments and notes will include R code to analyze our data. Those who are unfamiliar with R (or need to brush up) should take some time to work through Introduction to R.

Data sets and R code for the examples in Cryer and Chan (2008) are readily available at

Outline syllabus

  • Examples of time series. Purposes of analysis. Components (trend, cycle, seasonal, irregular). Stationarity and autocorrelation.
  • Approaches to time series analysis. Simple descriptive methods: smoothing, decomposition.
  • Differencing. Autocorrelation. Probability models for stationary series. Autoregressive models.
  • Moving average models. Partial autocorrelation. Invertibility. ARMA processes.
  • ARIMA models for non-stationary series. Identification and fitting. Diagnostics. Ljung-Box statistic.
  • Review and practical examples of model fitting. Introduction to forecasting. Updating and errors.
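The Ljung-Box statistic mentioned in the diagnostics item has a simple closed form: Q = n(n+2) * sum over k of r_k^2 / (n-k), computed from the residual autocorrelations r_1, ..., r_h. A minimal sketch in plain Python (illustrative; in R, Box.test(..., type = "Ljung-Box") computes this):

```python
def ljung_box(acf_vals, n):
    """Ljung-Box Q statistic from residual autocorrelations r_1..r_h.

    acf_vals: [r_1, ..., r_h] (lag 0 excluded); n: series length.
    Under a well-fitting ARMA(p, q) model, Q is compared against a
    chi-square distribution with h - p - q degrees of freedom.
    """
    return n * (n + 2) * sum(r * r / (n - k)
                             for k, r in enumerate(acf_vals, start=1))
```

A large Q (small p-value) signals leftover autocorrelation in the residuals, i.e. the fitted model has not captured all the serial dependence.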

Full syllabus

  • Week 1: Examples of time series, definition of time series, aims of the module and overview of the methods.
  • Week 2: Descriptive methods for time series, using R in time series, time series plots, sample autocorrelation function, moving averages, the classical decomposition, lag plots.
  • Week 3: Probability models for stationary time series, definition of stationarity (strong and weak stationarity), autoregressive models (AR), infinite representation of AR models, stationarity and causality of AR models, moving average models (MA), invertibility of MA models, ARMA models, autocorrelation function and causality.
  • Week 4: Non-stationary ARMA models (ARIMA).
  • Week 5: Estimation and fitting of ARIMA models, Box-Jenkins approach, identification, fitting, maximum likelihood, least squares estimation, diagnostics and residual analysis, model selection, examples.
  • Week 6: Forecasting, forecasting causal ARMA processes, 1-step ahead and k-step ahead forecasting, prediction intervals, forecasting non-stationary ARIMA and SARIMA processes, linear predictors, exponential smoothing.
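For the Week 6 material, the k-step-ahead forecast of a causal AR(1) process X_t = phi*X_{t-1} + e_t illustrates the general pattern: point forecasts decay geometrically toward the process mean as the horizon grows. A stdlib-Python sketch (an illustration under that AR(1) assumption, not the course's R code):

```python
def ar1_forecast(x_last, phi, k, mean=0.0):
    """k-step-ahead point forecast for a causal AR(1) with |phi| < 1:
    X-hat_{n+k} = mean + phi**k * (x_last - mean)."""
    return mean + phi ** k * (x_last - mean)

if __name__ == "__main__":
    # With phi = 0.5, mean 0 and last observation 4, the forecasts
    # halve at each step and tend to the mean as k grows.
    print([ar1_forecast(4.0, 0.5, k) for k in range(1, 4)])
```

The same geometric decay is why long-horizon forecasts from stationary ARMA models are uninformative beyond a few multiples of the dependence range, while the forecast error variance grows toward the process variance.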

Last modified: Sunday, 14 January 2018, 4:21 PM