GENERAL SEMINAR IN FINANCE
Basic Time Series Analysis
The purpose of the course is to introduce, primarily at an intuitive level, the basic concepts of time series analysis, as applied in finance and economics. Topics will include:
· Basic concepts and models: ergodicity, stationarity, ARMA models, Vector Autoregressions, ARCH/GARCH
· Foundational results for time series models: Wold and Spectral Decomposition Theorems
· Filtering tools: Kalman Filter and other filters such as the band pass filter and the Hodrick-Prescott filter
· Statistical Inference:
o Classical asymptotic theory: laws of large numbers; central limit theorems; sampling results for Generalized Method of Moments and Maximum Likelihood.
o Bayesian inference: priors and posteriors; the marginal likelihood as a way to assess model fit; Markov Chain Monte Carlo and Gibbs sampling; use of Bayesian methods to achieve parameter parsimony in forecasting models, and for estimating structural models.
· Nonstationarity in economic time series: unit roots, determination of cointegration rank, deterministic trends.
· Identification of impulse response functions.
The textbook is Hamilton, Time Series Analysis. Chapter 11 in Sargent’s Macroeconomic Theory is also very useful.
Last year's midterm and final exams are available.
The course meets 2:00-3:15 Mondays, and 4:00-5:45 on Wednesdays, in room 4214, Leverone.
Grades will be based on a midterm (30%), homeworks (30%) and a final (40%).
The midterm will be Wednesday, April 30.
There was no class on March 31 and there will also be no class on April 28. These will be made up by holding two Friday homework sessions.
TA for the course: Jingling Guan; office hours Tuesday 4-5pm, Leverone 448.
My office hours: Tuesday, 10-11am.
Homework #1, due April 16.
Homework #3, due April 30.
Homework #4, due May 12 (the due date for this homework has been pushed back).
Homework #5, due May 16.
Homework #6, due May 22.
Homework #7, due June 1.
The first half of the course studies time series models and their properties. The second half discusses statistical inference about those models.
1. Stochastic Processes (chapter 3)
· Stationarity, ergodicity
· White noise, AR, MA, ARMA models
· Yule-Walker equations
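As a small numerical sketch of the Yule-Walker implication for an AR(1) (in Python/NumPy; the coefficient and sample size below are purely illustrative):

```python
import numpy as np

# Simulate a stationary AR(1): y_t = phi * y_{t-1} + eps_t, with |phi| < 1.
rng = np.random.default_rng(0)
phi, T = 0.8, 200_000
eps = rng.standard_normal(T)
y = np.empty(T)
y[0] = eps[0] / np.sqrt(1 - phi**2)   # draw from the stationary distribution
for t in range(1, T):
    y[t] = phi * y[t - 1] + eps[t]

# The Yule-Walker equations for an AR(1) imply autocorrelations rho_k = phi**k.
def sample_autocorr(x, k):
    x = x - x.mean()
    return (x[k:] * x[:-k]).mean() / (x * x).mean()

print(sample_autocorr(y, 1), phi)      # close to 0.8
print(sample_autocorr(y, 2), phi**2)   # close to 0.64
```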
2. Linear Projections (section 4.1, 4.2)
· Necessity and sufficiency of orthogonality conditions
· Recursive property of projections
· Law of Iterated Projections
· Projections versus regressions
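The defining orthogonality conditions of a linear projection can be checked numerically; a minimal sketch (the regressors and coefficients are made up for the example):

```python
import numpy as np

rng = np.random.default_rng(1)
T = 10_000
x = rng.standard_normal((T, 2))
y = 1.5 * x[:, 0] - 0.5 * x[:, 1] + rng.standard_normal(T)

# Linear projection coefficients b = E[xx']^{-1} E[xy], computed here by
# least squares on the sample analogs.
b, *_ = np.linalg.lstsq(x, y, rcond=None)
resid = y - x @ b

# The orthogonality conditions that define the projection: E[x_t * resid_t] = 0.
print(x.T @ resid / T)   # both entries numerically zero
```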
3. Wold decomposition theorem (page 109; Sargent)
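A small sketch of the Wold/MA(∞) representation: the recursion below inverts an ARMA lag polynomial into its moving-average weights (the function name and parameterization are mine, not the textbook's):

```python
import numpy as np

# For an ARMA model (1 - a1*L - ... - ap*L^p) y_t = (1 + b1*L + ... + bq*L^q) eps_t,
# the MA(inf) weights psi_j satisfy psi_0 = 1 and, matching powers of L,
#   psi_j = b_j + sum_{i=1}^{min(j,p)} a_i * psi_{j-i}   (b_j = 0 for j > q).
def ma_inf_weights(ar, ma, n):
    ar, ma = np.asarray(ar, float), np.asarray(ma, float)
    psi = np.zeros(n)
    psi[0] = 1.0
    for j in range(1, n):
        psi[j] = ma[j - 1] if j - 1 < len(ma) else 0.0
        for i in range(1, min(j, len(ar)) + 1):
            psi[j] += ar[i - 1] * psi[j - i]
    return psi

# AR(1) with phi = 0.8: psi_j = 0.8**j.
print(ma_inf_weights([0.8], [], 5))   # [1, 0.8, 0.64, 0.512, 0.4096]
```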
4. Vector autoregressions (sections 10.1-10.3)
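As an illustration, one can simulate a bivariate VAR(1) and recover its coefficient matrix by equation-by-equation OLS (all numbers below are illustrative):

```python
import numpy as np

# Simulate y_t = A y_{t-1} + eps_t and estimate A by OLS, which for a VAR with
# the same regressors in every equation is also Gaussian maximum likelihood.
rng = np.random.default_rng(2)
A = np.array([[0.5, 0.1],
              [0.2, 0.3]])
T = 50_000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.standard_normal(2)

X, Y = y[:-1], y[1:]                      # regress y_t on y_{t-1}
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
print(A_hat)   # close to A
```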
5. Spectral analysis (chapter 6; Sargent; Christiano-Fitzgerald, Technical Appendix 1, page 71)
· Lag operator notation
· Autocovariance generating function (section 3.6)
· Complex numbers, key integral property of complex numbers, inverse Fourier transform
· Filters and band pass filter
· Spectral density
· Spectral decomposition theorem, motivated via the band-pass filter approach
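A quick numerical sketch connecting the spectral density to the autocovariances: for an AR(1), integrating the spectrum over [-π, π] recovers the variance (the inverse Fourier transform at lag zero). Parameter values are illustrative:

```python
import numpy as np

# Spectral density of an AR(1) y_t = phi*y_{t-1} + eps_t with var(eps) = sig2:
#   f(omega) = (sig2 / (2*pi)) / |1 - phi*exp(-i*omega)|**2
def ar1_spectrum(omega, phi, sig2=1.0):
    return sig2 / (2 * np.pi) / np.abs(1 - phi * np.exp(-1j * omega)) ** 2

phi = 0.8
grid = np.linspace(-np.pi, np.pi, 200_001)
domega = grid[1] - grid[0]

# Inverse Fourier transform at lag zero: the integral of f over [-pi, pi]
# equals gamma_0 = sig2 / (1 - phi**2), here 1/0.36 ~ 2.78.
var_from_spectrum = np.sum(ar1_spectrum(grid, phi)) * domega
print(var_from_spectrum, 1 / (1 - phi**2))
```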
1. Kalman filter (sections 13.1, 13.2, 13.6)
· State Space, Observer System
· Derivation of the Kalman filter: ‘forward pass’, ‘backward pass’
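A minimal scalar Kalman filter for a local-level model, as a sketch of the forward pass (the state-space notation here is generic, not necessarily the course's):

```python
import numpy as np

# Local-level state-space model:
#   state:       x_t = x_{t-1} + w_t,  w_t ~ N(0, q)
#   observation: y_t = x_t + v_t,      v_t ~ N(0, r)
def kalman_filter(y, q, r, x0=0.0, p0=1e6):
    x, p = x0, p0                     # diffuse initial state
    xs = []
    for obs in y:
        p = p + q                     # predict: state variance grows by q
        k = p / (p + r)               # Kalman gain
        x = x + k * (obs - x)         # update with the forecast error
        p = (1 - k) * p
        xs.append(x)
    return np.array(xs)

rng = np.random.default_rng(3)
T, q, r = 500, 0.1, 1.0
x_true = np.cumsum(np.sqrt(q) * rng.standard_normal(T))
y = x_true + np.sqrt(r) * rng.standard_normal(T)
x_filt = kalman_filter(y, q, r)

# The filtered estimate tracks the state more closely than the raw data.
print(np.mean((x_filt - x_true) ** 2), np.mean((y - x_true) ** 2))
```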
2. Maximum likelihood estimation (chapter 5)
· Gaussian maximum likelihood
· Conditional versus unconditional maximum likelihood
· Identification of moving averages
· Using the Kalman filter to build the Gaussian likelihood (section 13.4)
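A sketch of conditional Gaussian ML for an AR(1), conditioning on the first observation (the exact likelihood would add the stationary density of y_0). For fixed innovation variance, the conditional ML estimate of the AR coefficient coincides with OLS:

```python
import numpy as np

# Conditional Gaussian log-likelihood of an AR(1), conditioning on y_0.
def cond_loglik(phi, sig2, y):
    e = y[1:] - phi * y[:-1]
    T = len(e)
    return -0.5 * T * np.log(2 * np.pi * sig2) - 0.5 * np.sum(e**2) / sig2

rng = np.random.default_rng(4)
phi0, T = 0.6, 5000
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi0 * y[t - 1] + rng.standard_normal()

# OLS of y_t on y_{t-1} versus a grid search over the conditional likelihood.
phi_ols = np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)
grid = np.linspace(0.4, 0.8, 401)
phi_ml = grid[np.argmax([cond_loglik(p, 1.0, y) for p in grid])]
print(phi_ols, phi_ml)   # agree up to the grid resolution
```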
3. Introduction to generalized method of moments (section 14.1)
· Basic ideas from asymptotic sampling theory: convergence in probability, in mean square, and in distribution; law of large numbers; mean value theorem (chapter 7)
· Basic setup of GMM: the ‘GMM orthogonality conditions’
· Asymptotic sampling distribution of a sample mean, and the spectral density at frequency zero of the GMM orthogonality conditions
· Optimal weighting matrix
· Sampling properties of GMM estimator, hypothesis testing
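As a sketch of the GMM machinery in the simplest exactly identified case, consider an AR(1) estimated from a single orthogonality condition (all numbers illustrative):

```python
import numpy as np

# Orthogonality condition: E[(y_t - phi*y_{t-1}) * y_{t-1}] = 0; its sample
# analog gives phi_hat in closed form (here it coincides with OLS).
rng = np.random.default_rng(5)
phi0, T = 0.7, 20_000
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi0 * y[t - 1] + rng.standard_normal()

def g(phi):                                # the GMM moment function
    return (y[1:] - phi * y[:-1]) * y[:-1]

phi_hat = np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)

# Asymptotics: sqrt(T)*(phi_hat - phi0) -> N(0, S / D**2), with S the spectral
# density of g at frequency zero (no lagged terms needed here, since g is a
# martingale difference at the true phi) and D = E[dg/dphi] = -E[y_{t-1}**2].
m = g(phi_hat)
S = np.mean(m ** 2)
D = -np.mean(y[:-1] ** 2)
se = np.sqrt(S / D ** 2 / len(m))
print(phi_hat, se)
```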
4. GMM and Maximum Likelihood (chapter 14)
· Preliminary: martingale difference sequence
i. ARCH processes (chapter 21)
ii. Some asymptotic results for m.d.s.
· Demonstration that ML is a special case of GMM
i. Using GMM and structure of Gaussian density to derive asymptotic sampling distribution of Gaussian ML
ii. Using GMM without applying structure of the likelihood function to derive asymptotic sampling distribution of quasi-ML
· Asymptotic efficiency of Gaussian ML (Cramer-Rao lower bound) (this discussion is not in the book, but a good background discussion can be found in an econometric text such as, e.g., Theil)
· Robustness of GMM versus efficiency of ML: example of estimating ARCH processes (chapter 21)
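A short simulation of an ARCH(1) process, illustrating the features that make it a useful example here: the level is serially uncorrelated while the squares are autocorrelated (parameter values made up):

```python
import numpy as np

# ARCH(1): eps_t = sqrt(h_t) * z_t, with h_t = omega + alpha * eps_{t-1}**2
# and z_t iid standard normal.
rng = np.random.default_rng(6)
omega, alpha, T = 0.5, 0.4, 200_000
eps = np.zeros(T)
h = np.zeros(T)
h[0] = omega / (1 - alpha)                 # the unconditional variance
eps[0] = np.sqrt(h[0]) * rng.standard_normal()
for t in range(1, T):
    h[t] = omega + alpha * eps[t - 1] ** 2
    eps[t] = np.sqrt(h[t]) * rng.standard_normal()

# eps_t is white noise, but eps_t**2 inherits AR(1)-like persistence, and the
# unconditional variance is omega / (1 - alpha).
print(np.var(eps), omega / (1 - alpha))
```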
5. Bayesian inference (chapter 12)
· Bayes rule, priors, posteriors
· Estimating a mean using iid observations with known variance, priors and ‘padding data’
· Markov Chain Monte Carlo (MCMC) methods
i. Approximating the posterior distribution
ii. Approximating the marginal likelihood and using it to evaluate model fit
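A sketch of the conjugate normal update behind the 'padding data' interpretation (prior and data values are illustrative):

```python
import numpy as np

# Conjugate update for a mean mu with known variance sig2 and prior
# mu ~ N(m0, v0). The posterior mean is a precision-weighted average of the
# prior mean and the sample, which is why the prior can be read as 'padding'
# the data set with artificial observations.
def posterior(m0, v0, y, sig2):
    T = len(y)
    v_post = 1.0 / (1.0 / v0 + T / sig2)
    m_post = v_post * (m0 / v0 + np.sum(y) / sig2)
    return m_post, v_post

rng = np.random.default_rng(7)
y = 2.0 + rng.standard_normal(50)          # true mean 2, sig2 = 1
m_post, v_post = posterior(m0=0.0, v0=1.0, y=y, sig2=1.0)
print(m_post, v_post)   # mean pulled from 0 toward ybar; variance 1/51
```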
6. Nonstationarity (chapter 18, 19)
· Trend stationarity
i. Experiment: impact on long-term forecast of a shock
ii. Breakdown of standard covariance stationary sampling theory
1. some preliminary results concerning sums
2. superconsistency of OLS slope estimator
· Unit roots
i. Experiment: impact on long-term forecast of a shock
ii. The proper handling of cointegration in VARs
· Distinction between unit roots and trend stationarity based on the spectral density at frequency zero
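A Monte Carlo sketch of the superconsistency result mentioned above: under a unit root, the OLS slope error shrinks at rate T rather than the usual sqrt(T) (sample sizes and replication count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(8)

# Average absolute error of the OLS slope in a regression of y_t on y_{t-1}
# when the data are a pure random walk (true slope = 1).
def ols_slope_error(T, reps=500):
    errs = []
    for _ in range(reps):
        y = np.cumsum(rng.standard_normal(T))          # random walk
        phi_hat = np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)
        errs.append(abs(phi_hat - 1.0))
    return np.mean(errs)

e_small, e_large = ols_slope_error(100), ols_slope_error(1000)
# A tenfold increase in T shrinks the error roughly tenfold (rate T),
# not by sqrt(10) as in the stationary case.
print(e_small / e_large)
```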
7. Very brief review of the chapter 22 discussion of time series analysis with changes in regime
· One approach to distinguishing whether the recent change in the volatility of macroeconomic and financial data reflects a change in the transmission mechanism or heteroscedasticity in the fundamental disturbances
· Much data (especially for emerging market economies) ‘looks’ like it reflects periodic, large regime changes, suggesting the importance of chapter 22 methods
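As a stylized illustration of what such data can look like, one can simulate a two-regime Markov-switching volatility process (transition probabilities and volatilities below are made up, and this is only a stand-in for the chapter 22 models):

```python
import numpy as np

# Regime s_t in {0, 1} follows a persistent Markov chain; the observed series
# is y_t = sigma[s_t] * z_t with z_t iid standard normal.
rng = np.random.default_rng(9)
P = np.array([[0.98, 0.02],        # P[i, j] = Prob(s_t = j | s_{t-1} = i)
              [0.05, 0.95]])
sigma = np.array([1.0, 4.0])       # low- and high-volatility regimes
T = 5000
s = np.zeros(T, dtype=int)
for t in range(1, T):
    s[t] = rng.choice(2, p=P[s[t - 1]])
y = sigma[s] * rng.standard_normal(T)

# Sample volatility is markedly higher while the chain is in regime 1.
print(y[s == 0].std(), y[s == 1].std())
```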