Kellogg School of Management

Finance 520-1

GENERAL SEMINAR IN FINANCE

 

Basic Time Series Analysis

 

SPRING 2009

 

The purpose of the course is to introduce, primarily at an intuitive level, the basic concepts of time series analysis, as applied in finance and economics. Topics will include:

 

·      Basic concepts and models: ergodicity, stationarity, ARMA models, Vector Autoregressions, ARCH/GARCH

·      Foundational results for time series models: Wold and Spectral Decomposition Theorems

·      Filtering tools: Kalman Filter and other filters such as the band pass filter and the Hodrick-Prescott filter

·      Statistical Inference:

o  Classical asymptotic theory: laws of large numbers; central limit theorems; sampling results for Generalized Method of Moments and Maximum Likelihood.

o  Bayesian inference: priors and posteriors; the marginal likelihood as a way to assess model fit; Markov Chain Monte Carlo and Gibbs sampling; use of Bayesian methods to achieve parameter parsimony in forecasting models and to estimate structural models.

·      Nonstationarity in economic time series: unit roots, determination of cointegration rank, deterministic trends.

·      Identification of impulse response functions.

 

 

The textbook is Hamilton, Time Series Analysis. Chapter 11 in Sargent’s Macroeconomic Theory is also very useful.

 

Midterm and final from 2007:

Midterm

Final

From 2008:

Midterm

Final

 

Homework #1

Homework #2

Homework #3

Homework #4

Homework #5 (code, data)

Homework #6 (data)

Homework #7

Homework #8

 

The course meets 2:00-3:15 on Mondays and 4:00-5:45 on Wednesdays, in Leverone 4214.

One exception is Monday, March 30, when we meet in Leverone 165.

There will be one homework each week, due on Wednesday. The first homework will be due on April 8.

Review sessions will occur on Fridays. The first session will meet on April 3 to review the material in Chapter 3 of the text. Subsequent review sessions will focus on the homework due the previous Wednesday.

Grades will be based on a midterm (30%), homeworks (30%) and a final (40%).

The midterm will be Wednesday, May 6.

The May 4 lecture will be given on Friday, May 1, during the time normally allocated to the review session, and the May 1 review session will be held during the lecture time on Monday, May 4.

TA for the course: Jingling Guan; office hours Tuesday 4-5pm, Leverone 445.

My office hours: Tuesday, 10-11am.

Course Outline

 

The first half of the course studies time series models and their properties. The second half discusses statistical inference about those models.

 

1.   Stochastic Processes (chapter 3)

·      Stationarity, ergodicity

·      White noise, AR, MA, ARMA models

·      Yule-Walker equations
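
For intuition only (not part of the assigned material), here is a short Python/NumPy sketch with invented parameter values: it simulates an AR(2) process and recovers the autoregressive coefficients from sample autocovariances via the Yule-Walker equations.

    import numpy as np

    # Hypothetical AR(2): y_t = 0.5*y_{t-1} + 0.3*y_{t-2} + eps_t, eps_t ~ N(0,1)
    phi1, phi2, T = 0.5, 0.3, 100_000
    rng = np.random.default_rng(0)
    y = np.zeros(T)
    for t in range(2, T):
        y[t] = phi1 * y[t-1] + phi2 * y[t-2] + rng.standard_normal()

    # Sample autocovariances gamma_0, gamma_1, gamma_2
    def acov(x, k):
        x = x - x.mean()
        return np.mean(x[k:] * x[:len(x) - k]) if k > 0 else np.mean(x * x)

    g0, g1, g2 = acov(y, 0), acov(y, 1), acov(y, 2)

    # Yule-Walker equations: [g0 g1; g1 g0] [phi1, phi2]' = [g1, g2]'
    phi_hat = np.linalg.solve(np.array([[g0, g1], [g1, g0]]), np.array([g1, g2]))
    print("Yule-Walker estimates:", phi_hat)   # close to (0.5, 0.3)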

2.   Linear Projections (section 4.1, 4.2)

·      Necessity and sufficiency of orthogonality conditions

·      Recursive property of projections

·      Law of Iterated Projections

·      Projections versus regressions
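
A quick numerical check of the projection ideas in this topic (illustrative only; the data-generating process is invented): project y on a larger and a smaller information set, verify the orthogonality condition, and confirm the law of iterated projections.

    import numpy as np

    rng = np.random.default_rng(1)
    T = 50_000
    x1 = rng.standard_normal(T)
    x2 = 0.6 * x1 + rng.standard_normal(T)                  # correlated regressors
    y = 1.0 * x1 + 2.0 * x2 + rng.standard_normal(T)

    def project(target, X):
        # Solve the sample analogue of E[x x'] b = E[x y]; return coefficients and fitted values
        b = np.linalg.solve(X.T @ X, X.T @ target)
        return b, X @ b

    X_big = np.column_stack([x1, x2])      # larger information set
    X_small = x1[:, None]                  # smaller information set

    _, yhat_big = project(y, X_big)
    _, yhat_small = project(y, X_small)

    # Orthogonality: the projection error is uncorrelated with the conditioning variables
    print("E[x * error]:", X_big.T @ (y - yhat_big) / T)

    # Law of iterated projections: P[ P(y | x1, x2) | x1 ] = P(y | x1)
    _, proj_of_proj = project(yhat_big, X_small)
    print("max discrepancy:", np.max(np.abs(proj_of_proj - yhat_small)))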

3.   Wold decomposition theorem (page 109; Sargent)

4.   Vector autoregressions (sections 10.1-10.3)
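
As a throwaway illustration (coefficient values invented), a bivariate VAR(1) can be estimated equation by equation with OLS:

    import numpy as np

    rng = np.random.default_rng(2)
    T = 20_000
    A = np.array([[0.5, 0.1],
                  [0.2, 0.4]])             # invented VAR(1) coefficient matrix
    y = np.zeros((T, 2))
    for t in range(1, T):
        y[t] = A @ y[t-1] + rng.standard_normal(2)

    # OLS equation by equation: regress y_t on y_{t-1}
    Y, X = y[1:], y[:-1]
    A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
    print("estimated VAR(1) coefficient matrix:\n", A_hat)   # close to A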

5.   Spectral analysis (Chapter 6; Sargent; Christiano-Fitzgerald, Technical Appendix 1, page 71)

·      Lag operator notation

·      Autocovariance generating function (section 3.6)

·      Complex numbers, key integral property of complex numbers, inverse Fourier transform

·      Filters and band pass filter

·      Spectral density

·      Spectral decomposition theorem, motivated via the band-pass filter approach in Sargent (Hamilton's alternative approach is pursued in the homework)

·      Spectral analysis and the solution to the projection problem.
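
For concreteness, a minimal Python/NumPy sketch (parameter values invented) that evaluates the spectral density of an ARMA(1,1) from its lag polynomials and checks that integrating the spectrum over (-pi, pi) recovers the unconditional variance:

    import numpy as np

    phi, theta, sigma2 = 0.8, 0.4, 1.0        # invented ARMA(1,1) parameters
    omega = np.linspace(-np.pi, np.pi, 200_001)
    z = np.exp(-1j * omega)

    # Spectral density f(w) = (sigma^2 / 2pi) * |1 + theta*z|^2 / |1 - phi*z|^2, with z = e^{-iw}
    f = sigma2 / (2 * np.pi) * np.abs(1 + theta * z) ** 2 / np.abs(1 - phi * z) ** 2

    # Integrating the spectral density over (-pi, pi) recovers the unconditional variance
    var_from_spectrum = np.sum(f) * (omega[1] - omega[0])
    var_analytic = sigma2 * (1 + 2 * phi * theta + theta ** 2) / (1 - phi ** 2)
    print(var_from_spectrum, var_analytic)    # both approximately 5.0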

6.   Kalman filter (sections 13.1, 13.2, 13.6)

·      State Space, Observer System

·      Derivation of the Kalman filter: ‘forward pass’, ‘backward pass’
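
A minimal forward-pass sketch in Python/NumPy for a scalar state-space model (all parameter values invented for illustration):

    import numpy as np

    # Hypothetical scalar state-space model:
    #   state:       x_t = a * x_{t-1} + w_t,   w_t ~ N(0, Q)
    #   observation: y_t = x_t + v_t,           v_t ~ N(0, R)
    a, Q, R, T = 0.9, 1.0, 2.0, 500
    rng = np.random.default_rng(3)
    x, y = np.zeros(T), np.zeros(T)
    for t in range(1, T):
        x[t] = a * x[t-1] + rng.normal(scale=np.sqrt(Q))
        y[t] = x[t] + rng.normal(scale=np.sqrt(R))

    # Kalman filter forward pass: recursively update with y_t, then predict x_{t+1}
    x_pred, P_pred = 0.0, Q / (1 - a ** 2)    # prior for x_0: the stationary distribution
    x_filt = np.zeros(T)
    for t in range(T):
        K = P_pred / (P_pred + R)             # Kalman gain
        x_upd = x_pred + K * (y[t] - x_pred)  # update with observation y_t
        P_upd = (1 - K) * P_pred
        x_filt[t] = x_upd
        x_pred, P_pred = a * x_upd, a ** 2 * P_upd + Q   # predict x_{t+1}
    print("RMSE of filtered state:", np.sqrt(np.mean((x_filt - x) ** 2)))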

7.   Maximum likelihood estimation (chapter 5)

·      Gaussian maximum likelihood

·      Conditional versus unconditional maximum likelihood

·      Identification of moving averages

·      Using the Kalman filter to build the Gaussian likelihood (section 13.4)
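
A small sketch of the conditional-versus-exact distinction (invented AR(1) with known unit innovation variance; a crude grid search stands in for a proper optimizer): the exact Gaussian likelihood adds the stationary density of the first observation to the conditional likelihood.

    import numpy as np

    # Simulate an AR(1) with unit innovation variance (invented value phi = 0.7)
    phi_true, T = 0.7, 200
    rng = np.random.default_rng(4)
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = phi_true * y[t-1] + rng.standard_normal()

    def loglik(phi, exact):
        # Gaussian log-likelihood with sigma^2 = 1 known; 'exact' adds the density of y_0
        e = y[1:] - phi * y[:-1]
        ll = -0.5 * np.sum(np.log(2 * np.pi) + e ** 2)
        if exact:
            v0 = 1.0 / (1 - phi ** 2)          # stationary variance of y_0
            ll += -0.5 * (np.log(2 * np.pi * v0) + y[0] ** 2 / v0)
        return ll

    grid = np.linspace(-0.99, 0.99, 1_000)
    phi_cond = grid[np.argmax([loglik(p, exact=False) for p in grid])]
    phi_exact = grid[np.argmax([loglik(p, exact=True) for p in grid])]
    print("conditional ML:", phi_cond, " exact ML:", phi_exact)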

8.   Introduction to generalized method of moments (section 14.1)

·      Basic ideas from asymptotic sampling theory: convergence in probability, in mean square, and in distribution; law of large numbers; mean value theorem (chapter 7)

·      Basic setup of GMM: the ‘GMM orthogonality conditions’

·      Asymptotic sampling distribution of a sample mean; the spectral density at frequency zero of the GMM orthogonality conditions

·      Optimal weighting matrix

·      Sampling properties of GMM estimator, hypothesis testing
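
The simplest GMM example is a single moment condition E[y_t - mu] = 0, where the estimator is the sample mean and its standard error involves the spectral density at frequency zero. The following Python/NumPy sketch (data-generating process and truncation lag invented) compares a naive standard error with a Newey-West (Bartlett-weighted) one.

    import numpy as np

    # Moment condition E[y_t - mu] = 0: the GMM estimator of mu is the sample mean, and its
    # asymptotic variance involves the spectral density at frequency zero of (y_t - mu)
    rng = np.random.default_rng(5)
    T, rho = 5_000, 0.6
    y = np.zeros(T)
    for t in range(1, T):                     # serially correlated data (invented AR(1))
        y[t] = 1.0 + rho * (y[t-1] - 1.0) + rng.standard_normal()

    mu_hat = y.mean()
    u = y - mu_hat

    # Newey-West (Bartlett) estimate of the long-run variance
    L = 20                                    # truncation lag, chosen arbitrarily here
    S = np.mean(u * u)
    for k in range(1, L + 1):
        S += 2 * (1 - k / (L + 1)) * np.mean(u[k:] * u[:-k])

    print("mean:", mu_hat)
    print("naive s.e.:", np.sqrt(np.var(u) / T))   # ignores serial correlation
    print("HAC s.e.:  ", np.sqrt(S / T))           # accounts for it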

9.   GMM and Maximum Likelihood (chapter 14)

·      Preliminary: martingale difference sequence

      i. Definition

      ii. ARCH processes (chapter 21)

      iii. Some asymptotic results for m.d.s.

·      Demonstration that ML is a special case of GMM

      i. Using GMM and structure of Gaussian density to derive asymptotic sampling distribution of Gaussian ML

      ii. Using GMM without applying structure of the likelihood function to derive asymptotic sampling distribution of quasi-ML

·      Asymptotic efficiency of Gaussian ML (Cramer-Rao lower bound) (this discussion is not in the book, but good background can be found in an econometrics text such as Theil)

·      Robustness of GMM versus efficiency of ML: example of estimating ARCH processes (chapter 21)
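
To fix ideas on the ARCH example, here is a small Python/NumPy sketch (parameter values invented) that simulates an ARCH(1) process and maximizes the Gaussian (quasi-)likelihood by a crude grid search; a serious application would use a numerical optimizer.

    import numpy as np

    # Invented ARCH(1): e_t = sigma_t * z_t with sigma_t^2 = w + alpha * e_{t-1}^2, z_t ~ N(0,1)
    w_true, alpha_true, T = 1.0, 0.5, 5_000
    rng = np.random.default_rng(6)
    e = np.zeros(T)
    for t in range(1, T):
        e[t] = np.sqrt(w_true + alpha_true * e[t-1] ** 2) * rng.standard_normal()

    def gaussian_loglik(w, alpha):
        # Gaussian (quasi-)log-likelihood, conditioning on e_0
        sig2 = w + alpha * e[:-1] ** 2
        return -0.5 * np.sum(np.log(2 * np.pi * sig2) + e[1:] ** 2 / sig2)

    # Crude grid search over (w, alpha)
    ws = np.linspace(0.5, 1.5, 51)
    alphas = np.linspace(0.1, 0.9, 81)
    ll = np.array([[gaussian_loglik(w, a) for a in alphas] for w in ws])
    i, j = np.unravel_index(ll.argmax(), ll.shape)
    print("quasi-ML estimates:", ws[i], alphas[j])    # close to (1.0, 0.5)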

10.  Bayesian inference (chapter 12)

·      Bayes rule, priors, posteriors

·      Estimating a mean using i.i.d. observations with known variance; priors and 'padding data'

·      Markov Chain Monte Carlo methods

      i. Approximating the posterior distribution

      ii. Approximating the marginal likelihood and using it to evaluate model fit
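
A minimal sketch of the 'mean with known variance' example above (prior and data values invented): with a conjugate normal prior, the posterior mean is a weighted average of the prior mean and the sample mean, as if the prior contributed n0 'padding' observations.

    import numpy as np

    # Estimating a mean from i.i.d. N(mu, sigma^2) data with sigma^2 known.
    # A prior mu ~ N(m0, sigma^2/n0) acts like n0 artificial 'padding' observations equal to m0.
    sigma2, mu_true = 4.0, 2.0
    m0, n0 = 0.0, 10                          # invented prior mean and prior 'sample size'
    rng = np.random.default_rng(7)
    y = rng.normal(mu_true, np.sqrt(sigma2), size=100)

    n, ybar = len(y), y.mean()
    post_mean = (n0 * m0 + n * ybar) / (n0 + n)    # precision-weighted average of prior and data
    post_var = sigma2 / (n0 + n)
    print("posterior mean:", post_mean, " posterior s.d.:", np.sqrt(post_var))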

11.  Nonstationarity (chapters 18 and 19)

·      Trend stationarity

      i. Experiment: impact on long-term forecast of a shock

      ii. Breakdown of standard covariance-stationary sampling theory

            1. Some preliminary results concerning sums

            2. Superconsistency of the OLS slope estimator

·      Unit roots

      i. Experiment: impact on long-term forecast of a shock

·      Distinction between unit roots and trend stationarity based on spectral density at frequency zero

·      Cointegration

      i. Defined

      ii. The proper handling of cointegration in VARs
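
Two throwaway illustrations of this topic in Python/NumPy (all numbers invented): the long-horizon forecast effect of a shock under trend stationarity versus a unit root, and the superconsistency of OLS in a cointegrating regression.

    import numpy as np

    # (1) Impact of a one-unit shock on the h-step-ahead forecast (invented phi = 0.9):
    #     under trend stationarity the effect dies out; under a unit root it is permanent.
    phi, horizons = 0.9, np.array([1, 5, 20, 100])
    print("trend stationary:", phi ** horizons)
    print("unit root:       ", np.ones(horizons.shape))

    # (2) Superconsistency: OLS on a cointegrating relation converges at rate T, not sqrt(T)
    rng = np.random.default_rng(8)
    for T in (100, 1_000, 10_000):
        x = np.cumsum(rng.standard_normal(T))     # random-walk regressor
        y = 2.0 * x + rng.standard_normal(T)      # cointegrated with slope 2
        print(T, "OLS slope:", (x @ y) / (x @ x))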

12.  Very brief review of the chapter 22 discussion of time series analysis with changes in regime

·      One approach to distinguishing whether the recent change in the volatility of macroeconomic and financial data reflects a change in the transmission mechanism or heteroscedasticity in the fundamental disturbances

·      Much data (especially for emerging market economies) ‘looks’ like it reflects periodic, large regime changes, suggesting the importance of chapter 22 methods
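
A toy two-regime volatility simulation in Python/NumPy (all numbers invented) conveys the flavor of such data: the disturbance variance switches between a 'calm' and a 'turbulent' regime according to a two-state Markov chain.

    import numpy as np

    p_stay = np.array([0.98, 0.90])           # probability of staying in regime 0 / regime 1
    sigma = np.array([1.0, 5.0])              # regime standard deviations
    T = 1_000
    rng = np.random.default_rng(9)
    s = np.zeros(T, dtype=int)
    for t in range(1, T):
        s[t] = s[t-1] if rng.random() < p_stay[s[t-1]] else 1 - s[t-1]
    y = sigma[s] * rng.standard_normal(T)
    print("fraction of time in the turbulent regime:", s.mean())
    print("sample std by regime:", y[s == 0].std(), y[s == 1].std())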