Identification and Estimation of Causal Effects in Economics and other Social Sciences
Syllabus
Stanislao Maldonado
Department of Agricultural and Resource Economics
University of California at Berkeley
I. Overview
This course discusses conceptual and technical issues related to the identification and
estimation of causal effects in economics and other social sciences. The course is designed to
introduce modern econometric techniques for applied researchers and includes an important
empirical portion devoted to the implementation of the estimators discussed in class using real
data. In addition, empirical papers will be discussed in order to gain a better understanding of
the econometric techniques introduced in class. Consequently, the emphasis is more on research
design and applications than on theoretical proofs, although some of the latter will be discussed.
Familiarity with basic econometrics is assumed. The readings for the course are available on the
following webpage:
http://are.berkeley.edu/~stanislao/
II. Program
There is no single text for the course, but parts of the following texts will be used
extensively. The reading list is organized using the abbreviations that follow each text:
Introductory level:
Stock, James and Mark Watson (2007). Introduction to Econometrics. Pearson/Addison
Wesley. (SW)
Intermediate/Advanced:
Cameron, A. Colin and Pravin Trivedi (2005). Microeconometrics. Cambridge University
Press. (CT)
Wooldridge, Jeffrey (2001). Econometric Analysis of Cross Section and Panel Data. MIT Press.
(JW)
Lee, Myoung-Jae (2005). Micro-Econometrics for Policy, Program, and Treatment Effects. Oxford
University Press. (ML)
Millimet, Daniel; Jeffrey Smith and Edward Vytlacil (2008). Modelling and Evaluating
Treatment Effects in Econometrics. Advances in Econometrics Vol. 21. Elsevier Ltd. (MSV)
Basic readings:
Holland, P. W. (1986). “Statistics and Causal Inference”, Journal of the American Statistical
Association, 81, 945-970. Read also the published comments on the article.
Heckman, James (2000). “Causal Parameters and Policy Analysis in Economics: A Twentieth
Century Perspective”, Quarterly Journal of Economics, 115, 45-97.
Hoover, Kevin (2006). “Causality in Economics and Econometrics”. Paper prepared for the
New Palgrave Dictionary of Economics.
CT, Chapter 2.
MW, Chapter 2.
Supplementary readings:
Heckman, James (2005). “A Scientific Model of Causality”, Sociological Methodology, 35, 1-
150.
JP, Chapter 5.
Holmes, Thomas (2009). “Structural, Experimentalist and Descriptive Approaches to
Empirical Work in Regional Economics”. Mimeo.
Basic readings:
Duflo, Esther; Rachel Glennerster and Michael Kremer (2008). “Using Randomization in
Economic Development Research: A Toolkit”. Handbook of Development Economics, Vol. 4,
Elsevier Science.
Banerjee, Abhijit and Esther Duflo. “The Experimental Approach to Development
Economics”. NBER Working Paper 14467.
SW, Chapter 13.
AP, Chapter 2.
Supplementary readings:
LaLonde, Robert (1986). “Evaluating the Econometric Evaluations of Training Programs
with Experimental Data”, American Economic Review, 76, 604-620.
Heckman, James (1995). “Assessing the Case for Social Experiments”, Journal of Economic
Perspectives, 9, 85-110.
Bruhn, Miriam and David McKenzie (2008). "In Pursuit of Balance." World Bank Policy
Research Working Paper 4752.
SCC, Chapter 1.
Examples:
Schultz, T. Paul (2004). “School Subsidies for the Poor: Evaluating the Mexican Progresa
Poverty Program”. Journal of Development Economics, 74, 199-250.
Olken, Ben (2007). “Monitoring Corruption: Evidence from a Field Experiment in
Indonesia.” Journal of Political Economy, 115, 200-249.
Chattopadhyay, Raghabendra and Esther Duflo (2004). “Women as Policy Makers: Evidence
from a Randomized Policy Experiment in India.” Econometrica, 72, 1409-1443.
Gerber, Alan S., and Donald P. Green (2000). “The Effects of Canvassing, Direct Mail, and
Telephone Contact on Voter Turnout: A Field Experiment”. American Political Science Review,
94, 653-63.
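The course's empirical portion implements estimators like these in practice. As a minimal illustration of the randomization logic in the readings above, the following sketch (my own simulated example, not from the course materials; numpy assumed) computes a difference-in-means estimate and its Neyman standard error under random assignment:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
treat = rng.integers(0, 2, n)                    # coin-flip assignment
y = 2.0 + 1.5 * treat + rng.normal(0, 1, n)      # simulated outcomes, true effect = 1.5

n1, n0 = (treat == 1).sum(), (treat == 0).sum()
# randomization makes the simple difference in means unbiased for the ATE
ate_hat = y[treat == 1].mean() - y[treat == 0].mean()
# conservative (Neyman) standard error for the difference in means
se = np.sqrt(y[treat == 1].var(ddof=1) / n1 + y[treat == 0].var(ddof=1) / n0)
```

The point of the sketch is that no model of selection is needed: assignment is independent of potential outcomes by design.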
3. Non-experimental Designs.
Basic readings:
AP, Chapter 3.1-3.2.
MW, Chapter 5.
CT, Sections 4.1-4.4.
JW, Chapter 4.
Examples:
Krueger, Alan (1993). “How Computers Have Changed the Wage Structure: Evidence
from Microdata, 1984-1989”, Quarterly Journal of Economics, 108, 33-60.
DiNardo, John and Jorn-Steffen Pischke (1997). "The Returns to Computer Use
Revisited: Have Pencils Changed the Wage Structure Too?", Quarterly Journal of
Economics, 112, 291-303.
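The Krueger and DiNardo-Pischke exchange above turns on selection on unobservables in a wage regression. A small simulated sketch of that omitted-variable problem (my own illustration; numpy assumed, all numbers invented) compares a short regression with one that controls for the confounder:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 3000
ability = rng.normal(0, 1, n)                          # unobserved in the short regression
computer = (ability + rng.normal(0, 1, n) > 0).astype(float)  # higher-ability workers select in
wage = 1.0 + 0.5 * computer + 0.8 * ability + rng.normal(0, 1, n)  # true return = 0.5

# short regression: coefficient on computer absorbs part of ability's effect
X_short = np.column_stack([np.ones(n), computer])
b_short = np.linalg.lstsq(X_short, wage, rcond=None)[0]

# long regression: controlling for the confounder recovers the true return
X_long = np.column_stack([np.ones(n), computer, ability])
b_long = np.linalg.lstsq(X_long, wage, rcond=None)[0]
```

The upward bias in the short regression is exactly the "pencils" critique: the computer coefficient reflects who uses computers, not what computers do.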
Basic readings:
AP, Chapter 3.1-3.2.
MW, Chapter 4.
DR, Chapters 10 and 12.
ML, Chapter 4.
Imbens, Guido (2004). “Nonparametric Estimation of Average Treatment Effects Under
Exogeneity: A Review”, Review of Economics and Statistics, 86, 4-29.
Supplementary readings:
Dehejia, Rajeev and Sadek Wahba (1999). “Causal Effects in Non-Experimental Studies:
Reevaluating the Evaluation of Training Programs”, Journal of the American Statistical
Association, 94, 1053-1062.
Smith, Jeffrey and Petra Todd (2005). “Does Matching Overcome LaLonde’s Critique of
Non-experimental Methods?”, Journal of Econometrics, 125, 305-353.
Examples:
Jalan, Jyotsna and Martin Ravallion (2003). “Does Piped Water Reduce Diarrhea for
Children in Rural India?”. Journal of Econometrics, 112, 153-173.
Gilligan, Michael J. and Ernest J. Sergenti (2008) "Do UN Interventions Cause Peace?
Using Matching to Improve Causal Inference", Quarterly Journal of Political Science, 3, 89-
122.
Persson, Torsten and Guido Tabellini (2007). “The Growth Effect of Democracy: Is It
Heterogenous and How Can it Be Estimated?”. NBER Working Paper 14723.
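The matching estimators surveyed by Imbens and stress-tested by Dehejia-Wahba and Smith-Todd can be sketched in a few lines. This simulated example (my own, not from the readings; numpy assumed) shows selection on an observed covariate biasing the naive comparison, and nearest-neighbor matching recovering the effect:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
x = rng.normal(0, 1, n)                         # observed confounder
treat = rng.random(n) < 1 / (1 + np.exp(-x))    # selection into treatment depends on x
y = x + 2.0 * treat + rng.normal(0, 1, n)       # true effect = 2

# naive difference is biased upward: treated units have systematically higher x
naive = y[treat].mean() - y[~treat].mean()

# 1-nearest-neighbor matching on x (ATT): pair each treated unit
# with the control whose x is closest, then average the differences
xc, yc = x[~treat], y[~treat]
nn = np.abs(xc[None, :] - x[treat][:, None]).argmin(axis=1)
att_hat = (y[treat] - yc[nn]).mean()
```

Matching only fixes bias from observables; the LaLonde and Smith-Todd readings ask whether that is enough in real program-evaluation data.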
Basic readings:
AP, Chapter 5.
CT, Chapter 21.
JW, Chapter 10.
Meyer, Bruce (1995). “Natural and Quasi-Experiments in Economics”, Journal of
Business and Economic Statistics, 13, 151-161.
Supplementary readings:
ML, Sections 4.5-4.6.
Rosenzweig, Mark and Kenneth Wolpin (2000). “Natural "Natural Experiments" in
Economics”. Journal of Economic Literature, 38, 827-874.
Bertrand, Marianne; Esther Duflo and Sendhil Mullainathan (2004). “How Much Should
We Trust Differences-in-Differences Estimates?” Quarterly Journal of Economics, 119, 249-
275.
Examples:
Card, D. and A. Krueger (1994). “Minimum Wages and Employment: A Case Study of
the Fast Food Industry”, American Economic Review, 84, 772-793.
Galiani, Sebastian, Paul J. Gertler and Ernesto Schargrodsky (2005). “Water for Life:
The Impact of the Privatization of Water Services on Child Mortality”. Journal of Political
Economy, 113, 83-120.
Levitt, Steven (1994). "Using Repeat Challengers to Estimate the Effect of Campaign
Spending on Election Outcomes in the U.S. House." Journal of Political Economy, 102, 777-
98.
Di Tella, Rafael, and Ernesto Schargrodsky (2004). "Do Police Reduce Crime? Estimates
Using the Allocation of Police Forces after a Terrorist Attack." American Economic Review,
94, 115–133.
Di Tella, Rafael, Sebastian Galiani, and Ernesto Schargrodsky (2007). “The Formation of
Beliefs: Evidence from the Allocation of Land Titles to Squatters”. Quarterly Journal of
Economics, 122, 209–41.
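The difference-in-differences logic behind Card-Krueger and the examples above reduces, in the two-group two-period case, to four means. A minimal simulated sketch (my own illustration; numpy assumed) under the parallel-trends assumption:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
g = np.repeat([0.0, 1.0], n)             # 0 = comparison group, 1 = policy group
# pre-period: groups differ in levels (0.5) but would share trends
y_pre = 1.0 + 0.5 * g + rng.normal(0, 1, 2 * n)
# post-period: common trend 0.3 plus a treatment effect of 1.2 for the policy group
y_post = 1.0 + 0.5 * g + 0.3 + 1.2 * g + rng.normal(0, 1, 2 * n)

# DiD differences out both the group gap and the common trend
did = ((y_post[g == 1].mean() - y_pre[g == 1].mean())
       - (y_post[g == 0].mean() - y_pre[g == 0].mean()))
```

With many periods and serially correlated errors, the Bertrand-Duflo-Mullainathan reading shows why naive standard errors for this estimator can be badly overstated.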
Basic readings:
JW, Chapter 5.
AP, Chapter 4.
MW, Chapter 7.
Angrist, Joshua; Guido Imbens and Donald Rubin (1996). “Identification of Causal
Effects Using Instrumental Variables”, Journal of the American Statistical Association, 91,
444-455.
Heckman, James (1997). “Instrumental Variables: A Study of Implicit Behavioral
Assumptions in One Widely Used Estimator”. Journal of Human Resources, 32, 441-462.
Supplementary readings:
Imbens, Guido W. and Joshua Angrist (1994). “Identification and Estimation of Local
Average Treatment Effects”, Econometrica, 62, 467-475.
Angrist, Joshua (2004). “Treatment Effect Heterogeneity in Theory and Practice”,
Economic Journal, 114, C52-C83.
Bound, J., D. Jaeger, and R. Baker (1995). “Problems with Instrumental Variables
Estimation when the Correlation between the Instruments and the Endogenous
Explanatory Variables is Weak”, Journal of the American Statistical Association, 90, 443–450.
Examples:
Angrist, Joshua (1990). "Lifetime Earnings and the Vietnam Era Draft Lottery: Evidence
from Social Security Administrative Records," American Economic Review, 80, 313-336.
Miguel, Edward; S. Satyanath and E. Sergenti (2004). “Economic Shocks and Civil
Conflict: An Instrumental Variables Approach”, Journal of Political Economy, 112, 725-753.
Acemoglu, Daron, Simon Johnson and James Robinson (2001). “The Colonial Origins of
Comparative Development: An Empirical Investigation”, American Economic Review, 91,
1369-1401.
Chay, Kenneth and Michael Greenstone (2005). “Does Air Quality Matter? Evidence
from the Housing Market”, Journal of Political Economy, 113, 376-424.
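With a binary instrument, the IV estimator in the readings above collapses to the Wald ratio of reduced form to first stage. This simulated sketch (my own example, loosely inspired by the draft-lottery setting in Angrist 1990; numpy assumed) shows OLS biased by an unobserved confounder while the instrument recovers the effect:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
z = rng.integers(0, 2, n).astype(float)      # binary instrument (e.g., a lottery)
u = rng.normal(0, 1, n)                      # unobserved confounder
# endogenous treatment: responds to the instrument and to u
d = (1.5 * z + u + rng.normal(0, 1, n) > 0.75).astype(float)
y = 1.0 + 2.0 * d + u + rng.normal(0, 1, n)  # true effect = 2; u biases OLS upward

ols = np.cov(y, d)[0, 1] / np.var(d, ddof=1)           # biased
first_stage = d[z == 1].mean() - d[z == 0].mean()      # effect of z on d
reduced_form = y[z == 1].mean() - y[z == 0].mean()     # effect of z on y
wald = reduced_form / first_stage                      # IV (Wald) estimate
```

The validity of the ratio rests on the exclusion restriction (z affects y only through d); with heterogeneous effects, the Imbens-Angrist reading shows it identifies a LATE for compliers, and Bound-Jaeger-Baker warn about small first stages.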
Basic readings:
CT, Section 25.6
AP, Chapter 6.
Hahn, J., P. Todd, and W. van der Klaauw (2001). “Estimation of Treatment Effects with
a Regression-Discontinuity Design”, Econometrica, 69, 201-209.
Imbens, Guido and Thomas Lemieux (2008). “Regression Discontinuity Designs: A
Guide to Practice”, Journal of Econometrics, 142, 615-635.
Supplementary readings:
Lee, David and Thomas Lemieux (2009). “Regression Discontinuity Designs in
Economics”. NBER Working Paper 14723.
Examples:
Manacorda, Marco, Edward Miguel, and Andrea Vigorito (2009). “Government
Transfers and Political Support”, unpublished working paper.
Angrist, Joshua and Victor Lavy (1999). “Using Maimonides' Rule to Estimate the Effect
of Class Size on Scholastic Achievement”, Quarterly Journal of Economics, 114, 533-575.
Lee, David; Enrico Moretti and Matthew Butler (2004). "Do Voters Affect or Elect
Policies? Evidence from the U.S. House", Quarterly Journal of Economics, 119, 807–859.
Dell, Melissa (2008). “The Persistent Effects of Peru's Mining Mita”, Mimeo.
Pettersson-Lidbom, Per and Björn Tyrefors (2008). “The Policy Consequences of Direct
versus Representative Democracy: A Regression Discontinuity Approach”, Mimeo.
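The local linear estimator recommended in the Hahn-Todd-van der Klaauw and Imbens-Lemieux readings fits separate lines on each side of the cutoff and takes the gap in intercepts. A simulated sketch (my own; numpy assumed, bandwidth picked by hand rather than by a data-driven rule):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4000
x = rng.uniform(-1, 1, n)                       # running variable, cutoff at 0
d = (x >= 0).astype(float)                      # sharp assignment rule
y = 1.0 + 0.8 * x + 2.0 * d + rng.normal(0, 0.5, n)   # discontinuity of 2 at the cutoff

h = 0.25                                        # bandwidth (illustrative choice)
left = (x >= -h) & (x < 0)
right = (x >= 0) & (x <= h)
# local linear fit on each side, evaluated at the cutoff
b_left = np.polyfit(x[left], y[left], 1)
b_right = np.polyfit(x[right], y[right], 1)
rd_hat = np.polyval(b_right, 0.0) - np.polyval(b_left, 0.0)
```

Identification rests on units just below and just above the cutoff being comparable; the bandwidth trades off that comparability against the number of observations used.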
4. Other issues.
Basic readings:
SCC, Chapters 2 and 3.
Roe, Brian and David Just (2009). “Internal and External Validity in Economics
Research: Tradeoffs between Experiments, Field Experiments, Natural Experiments and
Field Data”. Forthcoming in the American Journal of Agricultural Economics.
Examples:
Miguel, Edward and Michael Kremer (2004). “Worms: Identifying Impacts on Education
and Health in the Presence of Treatment Externalities”, Econometrica, 72, 159-217.