Time series econometrics emerged from early 20th-century efforts to analyze economic data ordered through time, initially grappling with the "Measurement without Theory" critique leveled by Koopmans against the empirical business-cycle work of Burns and Mitchell. This catalyzed a drive for greater probabilistic formalization. Haavelmo's foundational monograph, "The Probability Approach in Econometrics," provided the critical framework, insisting that economic models must be conceived as probability distributions to enable proper statistical inference, thereby setting the stage for all subsequent time series methodology.
The mid-20th century was defined by the rivalry between the "Structural Econometrics" program, associated with the Cowles Commission, and the more pragmatic, data-driven "Box-Jenkins" methodology. Structural Econometrics aimed to estimate deep parameters from dynamic simultaneous-equations models derived from economic theory, facing formidable identification and estimation challenges. Concurrently, the Box-Jenkins approach, built around an iterative cycle of ARIMA model identification, estimation, and forecasting, offered a powerful, theory-light toolkit that dominated applied forecasting for decades, a school focused on pattern extraction rather than structural interpretation.
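The Box-Jenkins cycle can be illustrated on simulated data. The sketch below, a deliberately minimal stand-in for a full ARIMA implementation, uses an AR(1) process and conditional least squares: the sample autocorrelation function guides identification, OLS on the lagged series performs estimation, and the fitted coefficient generates point forecasts. All names and the choice of an AR(1) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) process: y_t = 0.7 * y_{t-1} + e_t
T, phi = 500, 0.7
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + rng.standard_normal()

# Identification: for an AR(1), the sample autocorrelation function
# should decay geometrically (rho_k ~ phi**k).
def acf(x, lag):
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

# Estimation: conditional least squares = OLS of y_t on y_{t-1}.
phi_hat = np.dot(y[:-1], y[1:]) / np.dot(y[:-1], y[:-1])

# Forecasting: the h-step-ahead point forecast is phi_hat**h * y_T.
h = 3
forecast = phi_hat ** h * y[-1]
```

In practice the identification step would compare the sample ACF and partial ACF against the theoretical signatures of candidate ARMA orders before committing to a specification.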
A paradigm shift occurred with the rise of the "Macroeconomic Vector Autoregression (VAR) School," championed by Sims as a critique of the "incredible" identifying restrictions in large-scale structural models. This school advocated for loosely restricted VARs to capture dynamic comovements and employed innovation accounting (impulse responses and variance decompositions) for interpretation. While initially atheoretic, it evolved into a platform for Structural VAR (SVAR) analysis, which imposes theory-informed restrictions to recover economic shocks, creating a synthesis between time-series technique and structural inquiry.
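The mechanics of the reduced-form VAR workflow are simple enough to sketch with numpy alone: fit a bivariate VAR(1) by equation-by-equation OLS, then trace impulse responses by iterating the estimated coefficient matrix on a unit shock. The data-generating process and all names here are illustrative assumptions, and the example stops short of the structural identification step an SVAR would add.

```python
import numpy as np

rng = np.random.default_rng(1)

# True bivariate VAR(1): x_t = A x_{t-1} + e_t
A = np.array([[0.5, 0.1],
              [0.2, 0.4]])
T = 1000
x = np.zeros((T, 2))
for t in range(1, T):
    x[t] = A @ x[t - 1] + rng.standard_normal(2)

# Equation-by-equation OLS: regress x_t on x_{t-1}.
# lstsq solves Y ~ X @ B, so B is the transpose of the VAR matrix.
X, Y = x[:-1], x[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

# Innovation accounting: the response of both variables h periods after
# a unit shock to the first variable is A_hat**h applied to e1.
e1 = np.array([1.0, 0.0])
irf = [np.linalg.matrix_power(A_hat, h) @ e1 for h in range(5)]
```

A structural analysis would additionally orthogonalize the reduced-form innovations (e.g. via a Cholesky factor of their covariance) before computing responses; the sketch above uses raw unit shocks for brevity.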
The late 20th and early 21st centuries expanded the field along two major fronts. First, the development of formal frameworks for modeling non-stationary data, most notably cointegration and error-correction models associated with Engle, Granger, and others, became a central paradigm for analyzing long-run economic equilibria amid short-run dynamics. Second, "Bayesian Econometrics" gained substantial traction in time series, providing a coherent framework for incorporating prior information and handling complex dynamic models, such as those in state-space form, through computational methods like MCMC.
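The cointegration idea can be made concrete with the Engle-Granger two-step procedure on simulated data: two series share a common stochastic trend, the long-run relation is estimated by OLS, and the lagged disequilibrium then enters an error-correction regression. This is a bare sketch under assumed parameter values; a serious application would test the residual for stationarity with proper unit-root critical values rather than taking it on faith.

```python
import numpy as np

rng = np.random.default_rng(2)

# Common stochastic trend: both series are I(1) random-walk processes
# sharing the trend w, so y - 2x is stationary (cointegration).
T = 2000
w = np.cumsum(rng.standard_normal(T))      # random walk
x = w + rng.standard_normal(T)
y = 2.0 * w + rng.standard_normal(T)       # long-run relation: y ~ 2x

# Step 1: OLS estimate of the cointegrating regression y_t = beta x_t + u_t.
beta_hat = np.dot(x, y) / np.dot(x, x)
u = y - beta_hat * x                       # equilibrium error, stationary

# Step 2: error-correction model. Regress dy_t on the lagged
# disequilibrium u_{t-1}; a negative coefficient means y adjusts back
# toward the long-run equilibrium.
dy = np.diff(y)
Z = np.column_stack([np.ones(T - 1), u[:-1]])
gamma = np.linalg.lstsq(Z, dy, rcond=None)[0]
alpha_hat = gamma[1]                       # error-correction speed
```

The superconsistency of the first-step OLS estimator is what makes the two-step procedure work: beta_hat converges to the true cointegrating coefficient fast enough that the estimated residual can be treated as the true equilibrium error in step two.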
Contemporary time series econometrics is characterized by the active coexistence of these canonical schools. The Structural Econometrics tradition persists in modern dynamic stochastic general equilibrium (DSGE) estimation, while the VAR paradigm remains a workhorse for empirical macroeconomics. The field continues to integrate new challenges, such as modeling high-frequency data, large datasets, and structural breaks, within and across these established methodological traditions of probability, structure, dynamic specification, and inference.