
The Empirical Revolution in Economics

Abstract

This article proposes an explanation for the empirical revolution in economics. It argues that the search for natural and quasi-natural experiments, wherever they were available, in pursuit of more credible identification strategies, led to the creation and application of new econometric tools and to their propagation across a growing number of fields in economics. The application of the new tools and research designs to the evaluation of policy interventions around the world activated a powerful feedback system, running from interventions to their evaluation with the new econometric tools, to the publication of academic articles, and back to the generation of new interventions. Using co-citation networks and semantic networks of the articles that introduced the new tools, we find traces of their impact on the practice of economists and on the emergence of three groupings of researchers following the arrival of the synthetic control method in 2003.
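
As a rough illustration of the co-citation side of this method, the sketch below builds a small weighted co-citation network in the sense of Small (1973) and then detects groupings of references with the Louvain algorithm of Blondel et al. (2008). It is a minimal, hypothetical example, not the authors' actual pipeline: the bibliography, the reference labels, and the choice of the networkx and python-louvain packages are assumptions made only for illustration.

# Minimal sketch: co-citation network plus Louvain community detection.
# Hypothetical data; not the pipeline used in the article.
from itertools import combinations
from collections import Counter

import networkx as nx
import community as community_louvain  # provided by the python-louvain package

# Hypothetical input: each citing article is represented by the list of
# references it cites.
citing_articles = [
    ["AbadieGardeazabal2003", "Card1990", "AngristKrueger2001"],
    ["AbadieGardeazabal2003", "AbadieDiamondHainmueller2010", "Card1990"],
    ["ImbensAngrist1994", "AngristKrueger2001", "Card1990"],
]

# Two references are co-cited whenever the same article cites both;
# the edge weight counts how often that happens.
cocitations = Counter()
for refs in citing_articles:
    for a, b in combinations(sorted(set(refs)), 2):
        cocitations[(a, b)] += 1

# Build the weighted co-citation graph.
G = nx.Graph()
for (a, b), w in cocitations.items():
    G.add_edge(a, b, weight=w)

# Partition the graph into communities (the "groupings" of the abstract).
partition = community_louvain.best_partition(G, weight="weight")
for ref, group in sorted(partition.items(), key=lambda kv: kv[1]):
    print(group, ref)

The semantic-network side of the analysis would require a further topic-modeling step along the lines of Griffiths and Steyvers (2004), which is omitted from this sketch.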

Keywords

Empirical Revolution, credibility, natural experiments, instrumental variables, econometrics


References

Abadie, A., Diamond, A. & Hainmueller, J. (2010). Synthetic Control Methods for Comparative Case Studies: Estimating the Effect of California’s Tobacco Control Program. Journal of the American Statistical Association, 105(490), 493–505. https://doi.org/10.1198/jasa.2009.ap08746

Abadie, A. & Gardeazabal, J. (2003). The Economic Costs of Conflict: A Case Study of the Basque Country. American Economic Review, 93(1), 113–32. https://doi.org/10.1257/000282803321455188

Angrist, J. D. (1990). Lifetime Earnings and the Vietnam Era Draft Lottery: Evidence from Social Security Administrative Records. American Economic Review, 80(3), 313-336.

Angrist, J. D. (2004). American Education Research Changes Tack. Oxford Review of Economic Policy, 20(2), 198-212. https://doi.org/10.1093/oxrep/grh011

Angrist, J., Azoulay, P., Ellison, G., Hill, R. & Feng Lu, S. (2017). Economic Research Evolves: Fields and Styles. American Economic Review: Papers and Proceedings, 107(5), 293-297. https://doi.org/10.1257/aer.p20171117

Angrist, J. D., Imbens, G. W. & Rubin, D. B. (1996). Identification of Causal Effects Using Instrumental Variables. Journal of the American Statistical Association, 91(434), 444–55. https://doi.org/10.2307/2291629

Angrist, J. D. & Krueger, A. B. (2001). Instrumental Variables and the Search for Identification: From Supply and Demand to Natural Experiments. Journal of Economic Perspectives, 15(4), 69-85.

Angrist, J. D. & Lavy, V. (1999). Using Maimonides’ Rule to Estimate Effects of Class Size on Student Achievement. Quarterly Journal of Economics, 114(2), 533-575. https://doi.org/10.3386/w5888

Angrist, J. D. & Pischke, J. S. (2010). The Credibility Revolution in Empirical Economics: How Better Research Design is Taking the Con out of Econometrics. Journal of Economic Perspectives, 24(2), 3-30. https://doi.org/10.1257/jep.24.2.3

Arendt, H. (1960). On Revolution. New York: Penguin.

Ashenfelter, O. (1974). The Effect of Manpower Training on Earnings: Preliminary Results. Princeton, NJ: Princeton University Industrial Relations Section, Working Paper No 60.

Ashenfelter, O. (1987). The Case for Evaluating Training Programs with Randomized Trials. Economics of Education Review, 6(4), 333-338. https://doi.org/10.1016/0272-7757(87)90016-1

Ashenfelter, O. (2014). The Early History of Program Evaluation and the Department of Labor. Industrial and Labor Relations Review, 87(Suppl.), 374-377.


Ashenfelter, O. & Card, D. (2017). Introduction to “Essays in Honor of Robert J. LaLonde”. Princeton, NJ: Princeton University Industrial Relations Section, Working Paper No 610.

Ashenfelter, O. & Heckman, J. (1974). Measuring the Effect of an Antidiscrimination Program. Stanford, CA: NBER, Center for Economic Analysis of Human Behavior and Social Institutions.

Athey, S. & Imbens, G. W. (2017). The State of Applied Econometrics: Causality and Policy Evaluation. Journal of Economic Perspectives, 31(2), 3-32. https://doi.org/10.1257/jep.31.2.3

Backhouse, R. E. & Cherrier, B. (2017). The Age of the Applied Economist: The Transformation of Economics since 1970. History of Political Economy, 49(Supplement), 1-33. https://doi.org/10.1215/00182702-4166239

Biagioli, M. (1993). Galileo, Courtier: The Practice of Science in the Culture of Absolutism. Chicago: University of Chicago Press. https://doi.org/10.1086/ahr/99.2.505

Blondel, V., Guillaume, J. L., Lambiotte, R. & Lefebvre, E. (2008). Fast Unfolding of Communities in Large Networks. Journal of Statistical Mechanics: Theory and Experiment, 10(P10008). https://doi.org/10.1088/1742-5468/2008/10/p10008

Card, D. (1990). The Impact of the Mariel Boatlift on the Miami Labor Market. Industrial and Labor Relations Review, 43(2), 245–57. https://doi.org/10.2307/2523702

Card, D. & Krueger, A. B. (1992). Does School Quality Matter? Returns to Education and the Characteristics of Public Schools in the United States. Journal of Political Economy, 100 (1), 1-40. https://doi.org/10.1086/261805

Deaton, A. (2009). Instruments of Development: Randomization in the Tropics, and the Search for the Elusive Keys to Economic Development. Cambridge, MA: National Bureau of Economic Research Working Paper 14690. https://doi.org/10.3386/w14690

Deaton, A. & Cartwright, N. (2016). Understanding and Misunderstanding Randomized Controlled Trials. Durham, England: Durham University CHESS Working Paper 2016-05.

Dizikes, P. (2013, January 2). The Natural Experimenter. MIT Technology Review. Retrieved from http://www.technologyreview.com/article/508381/the-natural-experimenter/

Duflo, E. (2017). Richard T. Ely Lecture: The Economist as Plumber. American Economic Review Papers and Proceedings, 107(5), 1-26.

Erickson, P. (2015). The World the Game Theorists Made. Chicago: University of Chicago Press.

Griffiths, T. L. & Steyvers, M. (2002). A Probabilistic Approach to Semantic Representation. In Proceedings of the Twenty-Fourth Annual Conference of the Cognitive Science Society (pp. 381-386). Hillsdale, NJ: Lawrence Erlbaum. https://doi.org/10.4324/9781315782379-102

Griffiths, T. L. & Steyvers, M. (2004). Finding Scientific Topics. PNAS, 101(Suppl. 1), 5228-5235.

Hamermesh, D. S. (2013). Six Decades of Top Economics Publishing: Who and How? Journal of Economic Literature, 51(1), 162-172. https://doi.org/10.1257/jel.51.1.162

Heckman, J. J. & Urzúa, S. (2010). Comparing IV with Structural Methods: What Simple IV Can and Cannot Identify. Journal of Econometrics, 156(1), 27-37. https://doi.org/10.1016/j.jeconom.2009.09.006

Hendry, D. F. (1980). Econometrics: Alchemy or Science? Economica, 47(188), 387-406. https://doi.org/10.2307/2553385

Imbens, G.W. (2010). Better LATE than Nothing: Some Comments on Heckman and Urzua (2009). Journal of Economic Literature, 48, 399-423. https://doi.org/10.1257/jel.48.2.399

Imbens, G. W. & Angrist, J. D. (1994). Identification and Estimation of Local Average Treatment Effects. Econometrica, 62(2), 467–75. https://doi.org/10.2307/2951620

Imbens, G. W., Rubin, D. B. & Sacerdote, B. I. (2001). Estimating the Effect of Unearned Income on Labor Earnings, Savings and Consumption: Evidence from a Survey of Lottery Players. American Economic Review, 91(4), 778-794. https://doi.org/10.3386/w7001

Imbens, G. W. & Lemieux, T. (2008). Regression Discontinuity Designs: A Guide to Practice. Journal of Econometrics, 142(2), 615-635. https://doi.org/10.1016/j.jeconom.2007.05.001

Jacob, B. A. & Lefgren, L. (2004). Remedial Education and Student Achievement: A Regression-Discontinuity Analysis. Review of Economics and Statistics, 86(1), 226–44. https://doi.org/10.1162/003465304323023778

Keane, M. P. (2010). A Structural Perspective on the Experimentalist School. Journal of Economic Perspectives, 24(2), 47-58. https://doi.org/10.1257/jep.24.2.47

Kleinberg, J. (1999). Authoritative Sources in a Hyperlinked Environment. Journal of the ACM, 46(5), 604-632. https://doi.org/10.1145/324133.324140

Kuhn, T. S. (2000). The Road since Structure. Chicago: University of Chicago Press.

LaLonde, R. J. (1986). Evaluating the Econometric Evaluations of Training Programs with Experimental Data. American Economic Review, 76(4), 604-620.

Leamer, E. E. (1983). Let’s Take the Con Out of Econometrics. American Economic Review, 73(1), 31-43.

Leamer, E. (2010). Tantalus on the Road to Asymptopia. Journal of Economic Perspectives, 24(2), 31-46. https://doi.org/10.1257/jep.24.2.31

Meyer, B. D. (1995). Natural and Quasi-Experiments in Economics. Journal of Business and Economic Statistics, 13(2), 151-161. https://doi.org/10.2307/1392369

Salazar, B. & Otero, D. (2019). A Tale of Tool: The Impact of Sims’s Vector Autoregressions on Econometrics. History of Political Economy, 51(3), 557-578.

Small, H. (1973). Co-citation in the Scientific Literature: A New Measure of the Relationship Between Two Documents. Journal of the American Society for Information Science, 24(4), 265-269. https://doi.org/10.1002/asi.4630240406

Small, H. (1977). A Co-Citation Model of a Scientific Specialty: A Longitudinal Study of Collagen Research. Social Studies of Science, 7(2), 139-166.

Small, H. (1980). Co-citation Context Analysis and the Structure of Paradigms. Journal of Documentation, 36(3), 183-196. https://doi.org/10.1108/eb026695

Small, H. (1999, December/January). On the Shoulders of Giants. Bulletin of the American Society for Information Science, 23-25.

Sims, C. A. (1980). Macroeconomics and Reality. Econometrica, 48(1), 1-48.
