Kellogg School of Management

Finance 520-1



Basic Time Series Analysis


SPRING 2011


The purpose of the course is to introduce, primarily at an intuitive level, the basic concepts of time series analysis, as applied in finance and economics. Topics will include:


·      Basic concepts and models: ergodicity, stationarity, ARMA models, Vector Autoregressions, ARCH/GARCH

·      Foundational results for time series models: Wold and Spectral Decomposition Theorems

·      Filtering tools: Kalman Filter and other filters such as the band pass filter and the Hodrick-Prescott filter

·      Statistical Inference:

o  Classical asymptotic theory: laws of large numbers; central limit theorems; sampling results for Generalized Method of Moments and Maximum Likelihood.

o  Bayesian inference: priors and posteriors; the marginal likelihood as a way to assess model fit; Markov Chain Monte Carlo and Gibbs sampling; use of Bayesian methods to achieve parameter parsimony in forecasting models, and for estimating structural models.

·      Nonstationarity in economic time series: unit roots, determination of cointegration rank, deterministic trends.


The textbook is Hamilton, Time Series Analysis. Chapter 11 in Sargent’s Macroeconomic Theory is also very useful.


Past exams:

·      Midterm and final from 2007

·      From 2008

·      From 2009

·      From 2010 (take-home final)



The course meets 5:00-6:30 Tuesday and Wednesday, in room 4214, Leverone.

There will be one homework each week, due on Wednesday. The first homework will be due on April 6.

Review sessions will occur on Fridays. The first session will meet April 1, to review the material in Chapter 3 of the text. Subsequent review sessions will focus on the homework due on the previous Wednesday.

Grades will be based on a midterm (30%), homeworks (30%) and a final (40%).

The midterm will be Wednesday, May 4.

TA for the course: Jingling Guan. Office hours: Monday 4-5pm, Leverone 406.

My office hours: by appointment.

Course Outline


The first half of the course studies time series models and their properties. The second half discusses statistical inference for those models. We will probably not be able to cover all the topics below; the exams will cover only material treated in lectures, homeworks, and review sessions.


1.   Stochastic Processes (chapter 3)

·      Stationarity, ergodicity

·      White noise, AR, MA, ARMA models

·      Yule-Walker equations
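To make the Yule-Walker relations concrete, here is a minimal numpy sketch (illustrative only, not part of the course materials): for an AR(1), x_t = phi*x_{t-1} + eps_t, the Yule-Walker equations imply autocorrelations rho(k) = phi^k, and the sample autocorrelations of a long simulated path should agree.

```python
import numpy as np

# For an AR(1), x_t = phi x_{t-1} + eps_t, the Yule-Walker equations imply
# autocorrelations rho(k) = phi**k; a long simulated path should agree.
rng = np.random.default_rng(0)
phi, T = 0.8, 200_000
eps = rng.standard_normal(T)
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t - 1] + eps[t]

def sample_autocorr(z, k):
    zc = z - z.mean()
    return float((zc[k:] * zc[:-k]).sum() / (zc * zc).sum())

for k in (1, 2, 3):
    print(k, phi ** k, sample_autocorr(x, k))
```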

2.   Linear Projections (sections 4.1, 4.2)

·      Necessity and sufficiency of orthogonality conditions

·      Recursive property of projections

·      Law of Iterated Projections

·      Projections versus regressions

3.   Wold decomposition theorem (page 109; Sargent)

4.   Vector autoregressions (sections 10.1-10.3)

5.   Spectral analysis (Chapter 6; Sargent; Christiano-Fitzgerald, Technical Appendix 1, page 71)

·      Lag operator notation

·      Autocovariance generating function (section 3.6)

·      Complex numbers, key integral property of complex numbers, inverse Fourier transform

·      Filters and band pass filter

·      Spectral density

·      Spectral decomposition theorem motivated according to band-pass filter approach in Sargent (Hamilton’s alternative approach is pursued in homework)

·      Sims’ approximation error formula and an application to Fuster, Hebert and Laibson.

·      Spectral analysis and the solution to the projection problem.
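One fact from this section can be checked numerically (a sketch under assumed parameters, not from the text): the spectral density of an AR(1) with unit innovation variance, f(w) = (1/2pi)/|1 - phi e^{-iw}|^2, integrates over (-pi, pi] to the unconditional variance 1/(1 - phi^2).

```python
import numpy as np

# The spectral density of an AR(1) with coefficient phi and unit innovation
# variance is f(w) = (1/2pi) / |1 - phi exp(-iw)|**2; integrating it over
# (-pi, pi] recovers the unconditional variance 1/(1 - phi**2).
phi = 0.8
w = np.linspace(-np.pi, np.pi, 200_001)
f = (1 / (2 * np.pi)) / np.abs(1 - phi * np.exp(-1j * w)) ** 2
variance = np.sum(f) * (w[1] - w[0])         # simple Riemann sum
print(variance, 1 / (1 - phi ** 2))
```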

6.   Kalman filter (sections 13.1, 13.2, 13.6)

·      State Space, Observer System

·      Derivation of the Kalman filter: ‘forward pass’, ‘backward pass’
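A minimal sketch of the Kalman 'forward pass' for the scalar local-level model (the model, parameter values, and function name below are illustrative assumptions, not the course's notation):

```python
import numpy as np

# A sketch of the Kalman filter 'forward pass' for the local-level model
#   state:       a_t = a_{t-1} + w_t,   w_t ~ N(0, Q)
#   observation: y_t = a_t + v_t,       v_t ~ N(0, R)
# Parameter values and the diffuse initial variance P0 are illustrative.
def kalman_filter(y, Q=1e-2, R=1.0, a0=0.0, P0=1e6):
    a, P = a0, P0
    filtered = []
    for obs in y:
        P = P + Q                    # predict: state variance grows by Q
        K = P / (P + R)              # Kalman gain
        a = a + K * (obs - a)        # update state estimate with new data
        P = (1.0 - K) * P            # update state variance
        filtered.append(a)
    return np.array(filtered)

rng = np.random.default_rng(1)
level = np.cumsum(0.1 * rng.standard_normal(300))   # true latent level
y = level + rng.standard_normal(300)                # noisy observations
est = kalman_filter(y)
print(np.mean((y - level) ** 2), np.mean((est - level) ** 2))
```

The filtered estimate should track the latent level far better than the raw observations do, since the gain K weights new data by its precision.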

1.   Maximum likelihood estimation (chapter 5)

·      Gaussian maximum likelihood

·      Conditional versus unconditional maximum likelihood

·      Identification of moving averages

·      Using the Kalman filter to build the Gaussian likelihood (section 13.4)

·      Maximum likelihood in the frequency domain: the asymptotic diagonalization result.

2.   Introduction to generalized method of moments (section 14.1)

·      Basic ideas from asymptotic sampling theory: convergence in probability, in mean square, and in distribution; law of large numbers; mean value theorem (chapter 7)

·      Basic setup of GMM: the ‘GMM orthogonality conditions’

·      Asymptotic sampling distribution of a sample mean and spectral density at frequency zero of GMM orthogonality condition

·      Optimal weighting matrix

·      Sampling properties of GMM estimator, hypothesis testing
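As a just-identified illustration of the GMM orthogonality-condition idea (a sketch, not the course's example): for an AR(1), the single condition E[x_{t-1}(x_t - phi x_{t-1})] = 0 pins down phi, and setting the sample moment to zero yields the OLS estimator.

```python
import numpy as np

# GMM with the single orthogonality condition
#   E[ x_{t-1} * (x_t - phi * x_{t-1}) ] = 0
# for an AR(1); just-identified, so the weighting matrix is irrelevant
# and the GMM estimator coincides with OLS. Values are illustrative.
rng = np.random.default_rng(2)
phi_true, T = 0.5, 100_000
eps = rng.standard_normal(T)
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi_true * x[t - 1] + eps[t]

# Set the sample moment (1/T) * sum x_{t-1}(x_t - phi x_{t-1}) to zero:
phi_hat = float((x[1:] * x[:-1]).sum() / (x[:-1] ** 2).sum())
print(phi_hat)
```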

3.   GMM and Maximum Likelihood (chapter 14)

·      Preliminary: martingale difference sequence

     i.   Definition

     ii.  ARCH processes (chapter 21)

     iii. Some asymptotic results for m.d.s.

·      Demonstration that ML is a special case of GMM

     i.   Using GMM and structure of Gaussian density to derive asymptotic sampling distribution of Gaussian ML

     ii.  Using GMM without applying structure of the likelihood function to derive asymptotic sampling distribution of quasi-ML

·      Asymptotic efficiency of Gaussian ML (Cramér-Rao lower bound) (this discussion is not in the book, but good background can be found in an econometrics text such as, e.g., Theil)

·      Robustness of GMM versus efficiency of ML: example of estimating ARCH processes (chapter 21)
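The ARCH example can be previewed with a short simulation (parameter values are illustrative assumptions): an ARCH(1) process has serially uncorrelated levels but autocorrelated squares, the volatility clustering that makes the GMM-versus-ML comparison interesting.

```python
import numpy as np

# A sketch of an ARCH(1) process, r_t = sigma_t * e_t,
# sigma_t**2 = a0 + a1 * r_{t-1}**2 (parameter values are assumptions).
rng = np.random.default_rng(4)
a0, a1, T = 0.2, 0.3, 200_000
e = rng.standard_normal(T)
r = np.zeros(T)
for t in range(1, T):
    sigma2 = a0 + a1 * r[t - 1] ** 2
    r[t] = np.sqrt(sigma2) * e[t]

def ac1(z):
    zc = z - z.mean()
    return float((zc[1:] * zc[:-1]).mean() / zc.var())

# Levels are serially uncorrelated; squares are not (volatility clustering).
print(ac1(r), ac1(r ** 2))
```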

4.   Bayesian inference (chapter 12)

·      Bayes rule, priors, posteriors

·      Estimating a mean using iid observations with known variance, priors and ‘padding data’

·      Markov Chain Monte Carlo methods

     i.   Approximating the posterior distribution

     ii.  Approximating the marginal likelihood and using it to evaluate model fit
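The 'estimating a mean with known variance' example has a closed-form conjugate posterior, sketched below (prior parameters and data are illustrative assumptions): the prior precision adds to the data precision, which is exactly the sense in which the prior acts like 'padding data'.

```python
import numpy as np

# Conjugate posterior for a mean with known variance. Prior mu ~ N(m0, tau02);
# data x_i ~ N(mu, sigma2) iid. Posterior precision adds prior precision and
# data precision, so the prior acts like extra 'padding data' located at m0.
# All parameter values below are illustrative assumptions.
def posterior(x, sigma2, m0, tau02):
    n = len(x)
    prec = 1.0 / tau02 + n / sigma2                  # posterior precision
    mean = (m0 / tau02 + x.sum() / sigma2) / prec    # precision-weighted mean
    return float(mean), float(1.0 / prec)

x = np.array([1.2, 0.8, 1.1, 0.9])
mean, var = posterior(x, sigma2=1.0, m0=0.0, tau02=1.0)
print(mean, var)   # posterior mean shrinks the sample mean toward the prior mean
```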

5.   Nonstationarity (chapters 18, 19)

·      Trend stationarity

     i.   Experiment: impact on long-term forecast of a shock

     ii.  Breakdown of standard covariance-stationary sampling theory

          1. Some preliminary results concerning sums

          2. Superconsistency of the OLS slope estimator

·      Unit roots

     i.   Experiment: impact on long-term forecast of a shock

·      Distinction between unit roots and trend stationarity based on spectral density at frequency zero

·      Cointegration.

     i.   Defined

     ii.  The proper handling of cointegration in VARs
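A short simulation makes the definition concrete (a sketch under assumed parameters, not from the text): two series sharing a common random-walk trend are each I(1), yet the spread y - x is stationary, so (1, -1) is a cointegrating vector.

```python
import numpy as np

# Two I(1) series with a common stochastic trend: each has a unit root, but
# the spread y - x is stationary, i.e. (1, -1) is a cointegrating vector.
# Sample size and noise variances are illustrative assumptions.
rng = np.random.default_rng(3)
T = 10_000
trend = np.cumsum(rng.standard_normal(T))    # common random walk
x = trend + rng.standard_normal(T)
y = trend + rng.standard_normal(T)
spread = y - x                               # the unit root cancels
print(spread.var())                          # stays bounded as T grows
```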

6.   Very brief review of the chapter 22 discussion of time series analysis with changes in regime.

·      One approach to distinguishing whether the recent change in the volatility of macroeconomic and financial data reflects a change in the transmission mechanism or heteroscedasticity in the fundamental disturbances

·      Much data (especially for emerging market economies) ‘looks’ like it reflects periodic, large regime changes, suggesting the importance of chapter 22 methods