Send me an e-mail with the subject line: ECO 523 Mailing List. There is no need to write anything in the body. I will just use the subject to create a filter and add your address to the course mailing list.
Class plan, material and important dates
09/13 (Friday): Kernel based techniques
Class material: introduction, density estimation, kernel regression and how to program it, asymptotic theorems. Download the handout here. Download homework 1.
09/20 (Friday): Kernel based techniques
Class material: bandwidth selection, bootstrap, local polynomial and how to program it, derivative estimation and asymptotic theorems. Download the handout here.
09/27 (Friday): Series based techniques
Problem set 1 is due in this class. Class material: series regression and how to program it, choice of basis, power, Fourier and B-spline bases. Download the handout here. Download homework 2.
10/04 (Friday): Series based techniques
Class material: wavelet bases, choice of basis size, asymptotic theorems and variance estimation. Download the handout here.
10/11 (Friday): Semi-parametric models
Problem set 2 is due in this class. Class material: partially linear models, additive models, varying-coefficient models, and single-index models. Download the handout here. Download homework 3.
10/18 (Friday): Topics
Class material: inverse probability weighted treatment effects, regression discontinuity design and nonparametric IV. Download the handout here.
10/25 (Friday): Final exam
Problem set 3 is due on the day of the Final exam.
About this course
This course covers the basics of nonparametric estimation. It aims for breadth rather than depth. The goal is to present the two main strategies in nonparametric estimation (kernel based and series based), which are rarely covered in the same course. To achieve this, we will not go into the details of the proofs that derive the asymptotic behavior of these estimators. Rather, we will focus on understanding how each method works, its strengths and weaknesses, and how to apply it to real data.
Nonparametric methods are sometimes compared to parametric approaches and found to be harder and less precise. This is misleading. The comparison is unfair, because parametric methods assume that the parametric structure is known. Nonparametric methods are simply realistic about our lack of knowledge of the data generating process, and if they seem less precise it is because they are trying to estimate the right function. Adopting a misspecified parametric model yields wrong estimates, and no amount of data can correct this. Such an approach will fail to unearth the true effects of interest, and that is often a worse form of imprecision.
In this course, you will also see that the basic nonparametric estimators belong to the least-squares family. They can be understood within a parametric framework, and are real contenders in that class. Conversely, common parametric methods such as OLS can be seen as nonparametric estimators (poorly performing ones, by the way). In general, typical parametric estimators perform worse than estimators designed to capture the local behavior of unknown structures.
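To make the least-squares connection concrete, here is a minimal sketch (illustrative, not course code) of the Nadaraya-Watson kernel regression estimator written as the solution of a locally weighted least-squares problem. The Gaussian kernel and the bandwidth argument h are my illustrative choices; bandwidth selection itself is a class topic.

```python
import numpy as np

def nadaraya_watson(x0, x, y, h):
    """Local-constant (Nadaraya-Watson) kernel regression at a point x0.

    Solves the weighted least-squares problem
        min_c  sum_i K((x_i - x0) / h) * (y_i - c)^2,
    whose closed-form solution is the kernel-weighted average of y.
    """
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)  # Gaussian kernel weights
    return np.sum(w * y) / np.sum(w)
```

As the bandwidth h grows, every observation receives (nearly) equal weight and the estimator collapses to the global sample mean, i.e. OLS on a constant: the "parametric" limit of this local estimator.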
We will cover a wide array of estimators that are better suited to capturing the nuances of an unknown function. These methods are more local in nature than OLS, GMM, 2SLS, and the like. Kernel methods are the most local of all, particularly the local polynomial, which performs well even at boundary points. Series methods have great computational advantages, and some bases can capture local behavior with flair: spline bases have wonderful local properties, and wavelets go as far as estimating discontinuous functions, all in a single regression.
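As a preview of the series approach, here is a minimal sketch (illustrative, not course code) of a series regression with the power basis: regress y on 1, x, x², ..., x^(K-1) by OLS. The basis size K is the tuning parameter, playing for series methods the role the bandwidth plays for kernels.

```python
import numpy as np

def series_fit(x, y, K):
    """Series regression with a power basis of size K.

    Builds the design matrix [1, x, x^2, ..., x^(K-1)] and runs OLS;
    choosing K trades off bias (small K) against variance (large K).
    """
    X = np.vander(x, N=K, increasing=True)  # power basis columns
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def series_predict(x0, beta):
    """Evaluate the fitted series approximation at the points x0."""
    X0 = np.vander(np.atleast_1d(x0), N=len(beta), increasing=True)
    return X0 @ beta
```

Note that with K fixed this is just a parametric polynomial regression; the series estimator becomes nonparametric by letting K grow with the sample size, which is exactly the "choice of basis size" topic in the class plan.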
We will cover all of these methods, as well as very flexible structures that combine the strengths of the parametric and nonparametric approaches, such as partially linear models and separably additive models. Finally, we will cover some of the most interesting recent developments in nonparametric estimation, such as the regression discontinuity design, inverse probability weighted treatment effects, and nonparametric instrumental variable estimation.
Your grade will mostly reflect your performance on the Final exam. However, I will make adjustments based on your problem sets: I will take into consideration whether you handed them in on time and attempted all the questions. In such cases I will bump your grade up in recognition of your effort, and I will be particularly generous with students who work seriously on the applied part. Irrespective of the grade, you should work on the problem sets. There is no way to learn this material without doing the applied exercises, and what is the point of taking this course if you don’t try to learn?
About the field exam
The material in this course is mandatory for the Econometrics field exam, and is worth 20% of the exam grade.
About this website
The classes and the problem sets are already published, so you can use them as a reference. However, I may make changes as the course progresses. For example, if I find a mistake in the notes, I will correct it, and I will add new questions to the problem sets. Hence, I suggest that you download an updated version of the notes closer to the Final exam, and always use the problem set version published on the date marked with the “Download homework ...” phrase in the class plan.