Bayesian econometrics emerged as a distinct methodological school during the mid-20th century, fundamentally challenging the dominant frequentist paradigm. Its foundational framework was established by pioneers such as Arnold Zellner, who articulated a comprehensive Bayesian approach to econometric inference, emphasizing the coherent updating of prior beliefs with data through Bayes' theorem. This early Bayesian econometrics school positioned itself against the frequentist tradition associated with Haavelmo's "The Probability Approach in Econometrics," advocating for the integration of economic theory via informative priors and offering a unified framework for estimation, hypothesis testing, and prediction. Key theoretical contributions included Bayesian treatments of simultaneous-equations models and linear regression, setting the stage for a rival philosophy of inference.
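The coherent updating of prior beliefs via Bayes' theorem can be illustrated with the simplest conjugate case: a normal prior on the mean of normal data with known variance. This is a minimal sketch for illustration only; the function name and the numbers are hypothetical, not drawn from the text above.

```python
import numpy as np

def update_normal_mean(prior_mean, prior_var, data, data_var):
    """Posterior for mu given y_i ~ N(mu, data_var) and mu ~ N(prior_mean, prior_var).

    The posterior precision is the sum of the prior precision and the data
    precision; the posterior mean is the precision-weighted average of the
    prior mean and the data.
    """
    n = len(data)
    post_var = 1.0 / (1.0 / prior_var + n / data_var)
    post_mean = post_var * (prior_mean / prior_var + np.sum(data) / data_var)
    return post_mean, post_var

# Illustrative example: an informative prior centered at 0 is pulled
# toward ten observations generated near 2.
rng = np.random.default_rng(0)
y = rng.normal(2.0, 1.0, size=10)
m, v = update_normal_mean(0.0, 1.0, y, 1.0)
```

As more data arrive, the data precision term dominates and the posterior mean converges to the sample mean, which is the "learning from data" that the paragraph describes.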
The development of Bayesian structural econometrics represented a major evolution, applying Bayesian methods to estimate and evaluate structural economic models derived from microeconomic or macroeconomic theory. This framework enabled researchers to incorporate prior information on deep parameters and manage identification challenges in causal models, bridging the gap between economic theory and empirical analysis. It flourished in areas like industrial organization and labor economics, where models often involve complex latent variables and equilibrium conditions. Concurrently, the Bayesian time series econometrics school gained prominence, particularly through the Bayesian vector autoregression (VAR) approach for macroeconomic forecasting and policy analysis, championed by economists such as Christopher Sims as a tool for handling model uncertainty and incorporating prior beliefs about dynamics.
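The Bayesian VAR approach typically shrinks coefficients toward a simple benchmark such as a random walk (the idea behind the Minnesota prior). The sketch below is a deliberately simplified illustration under a conjugate normal prior with unit error variance; the function name and the shrinkage parameter `lam` are hypothetical choices, not from the text.

```python
import numpy as np

def bvar_posterior_mean(Y, lam=0.2):
    """Posterior mean of one-lag VAR coefficients, B, in X_t = B' Z_{t-1} + e_t,
    under a normal prior centered at the identity (random walk) with
    prior standard deviation lam on each coefficient.

    Illustrative simplification: errors are treated as N(0, I).
    """
    X, Z = Y[1:], Y[:-1]               # Z holds lagged values predicting X
    k = Z.shape[1]
    prior_mean = np.eye(k)             # shrink toward a random walk
    prior_prec = np.eye(k) / lam**2    # smaller lam => tighter shrinkage
    # Conjugate update: posterior precision = prior precision + Z'Z
    post_prec = prior_prec + Z.T @ Z
    B = np.linalg.solve(post_prec, prior_prec @ prior_mean + Z.T @ X)
    return B

# Illustrative data: a 3-variable random walk observed for 200 periods.
rng = np.random.default_rng(0)
Y = np.cumsum(rng.standard_normal((200, 3)), axis=0)
B = bvar_posterior_mean(Y)
```

Tightening `lam` pulls the posterior mean toward the identity regardless of the data, which is how such priors discipline densely parameterized VARs used in forecasting.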
A transformative shift occurred with the advent of computational Bayesian econometrics, driven by the adoption of Markov chain Monte Carlo (MCMC) simulation methods such as Gibbs sampling and the Metropolis-Hastings algorithm. This computational school made previously intractable high-dimensional models feasible, catalyzing widespread application across economics. It facilitated the rise of Bayesian hierarchical models, state-space models, and models with non-standard distributions, effectively democratizing Bayesian methods and embedding them in standard econometric practice. This era also saw the maturation of Bayesian microeconometrics, applying these tools to discrete choice, limited dependent variable, and panel data models, often with a focus on flexible specification and robust inference.
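The Metropolis-Hastings algorithm named above can be sketched in a few lines: propose a move from the current state, then accept it with a probability that depends on the ratio of target densities. This minimal random-walk version targets a standard normal density; the step size and draw count are illustrative choices.

```python
import numpy as np

def log_target(x):
    """Log density of the target, here N(0, 1) up to an additive constant."""
    return -0.5 * x**2

def metropolis_hastings(n_draws=20000, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings sampler for a 1-D target."""
    rng = np.random.default_rng(seed)
    x = 0.0
    draws = np.empty(n_draws)
    for i in range(n_draws):
        prop = x + step * rng.standard_normal()   # symmetric proposal
        # Accept with probability min(1, target(prop) / target(x));
        # the symmetric proposal density cancels in the ratio.
        if np.log(rng.uniform()) < log_target(prop) - log_target(x):
            x = prop
        draws[i] = x                              # repeat current draw if rejected
    return draws

samples = metropolis_hastings()
```

Swapping `log_target` for an intractable posterior kernel is what made high-dimensional Bayesian models feasible: the algorithm needs the posterior only up to a normalizing constant.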
In contemporary practice, Bayesian econometrics continues to evolve through frameworks like Bayesian model averaging for addressing model uncertainty, nonparametric and semiparametric Bayesian methods for greater flexibility, and integration with machine learning techniques. Debates persist regarding prior specification, computational efficiency, and the role of subjectivity, maintaining its identity as a live paradigm alongside frequentist and design-based approaches. The subfield remains defined by its core commitment to probabilistic learning from data, with ongoing innovations ensuring its central role in modern empirical economics.
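Bayesian model averaging weights each candidate model by its posterior probability rather than committing to a single specification. A common shortcut, sketched here, approximates the log marginal likelihood with BIC under equal prior model probabilities; the function names and the simulated data are hypothetical illustrations.

```python
import numpy as np

def bic(y, X):
    """BIC of an OLS fit, used as an approximation to -2 * log marginal likelihood."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / n
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return -2 * loglik + k * np.log(n)

def bma_weights(y, designs):
    """Posterior model probabilities from BIC, assuming equal prior odds."""
    bics = np.array([bic(y, X) for X in designs])
    w = np.exp(-0.5 * (bics - bics.min()))  # subtract min for numerical stability
    return w / w.sum()

# Illustrative comparison: the true model versus one padded with noise regressors.
rng = np.random.default_rng(1)
n = 200
x = rng.standard_normal(n)
junk = rng.standard_normal((n, 5))
y = 1.0 + 2.0 * x + rng.standard_normal(n)
X_small = np.column_stack([np.ones(n), x])
X_big = np.column_stack([X_small, junk])
w = bma_weights(y, [X_small, X_big])
```

A model-averaged forecast would then combine each model's prediction using these weights, which is how BMA propagates model uncertainty into inference instead of conditioning on one chosen specification.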