Complexity theory emerged in the 1960s from the foundations of computability theory, shifting focus from what is computable to what is efficiently computable. The classical deterministic paradigm, established by Hartmanis and Stearns, defined time- and space-bounded complexity classes via resource-bounded Turing machines and proved the first hierarchy theorems, framing classes such as P and PSPACE (with NP defined via nondeterministic machines). This framework centered on classifying decision problems by their inherent difficulty, setting the stage for the field's central agendas. Early structural complexity theory began to explore relationships among these classes through basic reductions and separations, forming an initial spine for analyzing computational resources.
The 1970s solidified the NP-completeness framework, pioneered by Cook and Karp, which provided a robust method for proving the intractability of numerous practical problems and made the P versus NP question the field's defining open problem. This era saw the rise of structural complexity as a durable school, deepening the study of class relationships via polynomial-time reductions, oracle constructions, and relativization. These approaches emphasized the comparative power of deterministic, nondeterministic, and space-bounded computation, establishing a cohesive agenda for understanding the lattice of complexity classes.
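The polynomial-time (Karp) reductions mentioned above can be illustrated with the textbook case of INDEPENDENT-SET and VERTEX-COVER: an n-vertex graph has an independent set of size k exactly when it has a vertex cover of size n - k, so the reduction just maps the parameter. The sketch below (brute-force checkers are for verification only; real instances of either problem are NP-complete) demonstrates this equivalence on a small graph:

```python
from itertools import combinations

def has_independent_set(edges, n, k):
    """Brute-force check: is there a size-k set with no internal edge?"""
    return any(
        all(not (u in s and v in s) for u, v in edges)
        for s in map(set, combinations(range(n), k))
    )

def has_vertex_cover(edges, n, k):
    """Brute-force check: is there a size-k set touching every edge?"""
    return any(
        all(u in s or v in s for u, v in edges)
        for s in map(set, combinations(range(n), k))
    )

def reduce_is_to_vc(n, k):
    """Karp reduction: (G, k) for INDEPENDENT-SET maps to (G, n - k)
    for VERTEX-COVER; the graph itself is carried over unchanged."""
    return n - k

# A 4-cycle: its maximum independent set and minimum vertex cover
# both have size 2, so the answers agree under the reduction.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4
for k in range(n + 1):
    assert has_independent_set(edges, n, k) == \
        has_vertex_cover(edges, n, reduce_is_to_vc(n, k))
```

Because the mapping is computable in polynomial time and preserves yes/no answers, hardness of one problem transfers to the other, which is exactly how Karp's original list of NP-complete problems was built.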
In the 1980s and 1990s, the field expanded with new paradigms. Randomized complexity introduced classes like BPP, RP, and ZPP, systematically exploring what algorithms can gain from access to random bits. Interactive proof systems emerged as another major framework, culminating in the IP = PSPACE theorem and the PCP theorem, which revolutionized verification and the hardness of approximation. Circuit complexity gained prominence as a parallel school, aiming to prove lower bounds on Boolean circuits as a route to separating classical complexity classes. These agendas diversified the tools and perspectives for assessing computational hardness.
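A classic concrete example of the one-sided-error randomized classes above is Freivalds' algorithm, which verifies a matrix product in O(n^2) time per trial instead of recomputing it in O(n^3). It never rejects a correct product, and each trial catches an incorrect one with probability at least 1/2, the error profile characteristic of coRP. A minimal sketch (pure-Python lists rather than a numerics library, for self-containment):

```python
import random

def freivalds(A, B, C, n, trials=20):
    """Check probabilistically whether A @ B == C (all n x n matrices).
    If the product is correct the test always accepts; if not, each
    trial rejects with probability >= 1/2 over the random 0/1 vector r,
    so the overall error is at most 2 ** -trials (one-sided error)."""
    for _ in range(trials):
        r = [random.randint(0, 1) for _ in range(n)]
        # Compare A(Br) with Cr: three matrix-vector products, O(n^2) each.
        Br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(n)]
        ABr = [sum(A[i][j] * Br[j] for j in range(n)) for i in range(n)]
        Cr = [sum(C[i][j] * r[j] for j in range(n)) for i in range(n)]
        if ABr != Cr:
            return False  # witness found: definitely A @ B != C
    return True  # probably correct

random.seed(0)  # fixed seed so the demo below is reproducible
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C_good = [[19, 22], [43, 50]]
C_bad = [[19, 22], [43, 51]]
assert freivalds(A, B, C_good, 2)
assert not freivalds(A, B, C_bad, 2)
```

Amplification by repeated trials, visible in the `trials` parameter, is the same mechanism that makes the definitions of RP and BPP robust to the exact error threshold chosen.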
The advent of quantum computing in the 1990s spurred quantum complexity theory, defining classes such as BQP and QMA and investigating the potential advantage of quantum over classical computation. This paradigm introduced distinct assumptions about quantum resources and algorithms. Concurrently, descriptive complexity developed, linking complexity classes to logical definability, and parameterized complexity offered a refined framework for tackling NP-hard problems by isolating structural parameters. These schools represent enduring curricula with sustained research footprints.
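The parameterized approach can be made concrete with the standard introductory example: k-VERTEX-COVER solved by a bounded search tree. Some endpoint of any uncovered edge must join the cover, so branching on both endpoints gives a tree of depth at most k and running time O(2^k · |E|), which is fixed-parameter tractable even though the problem is NP-hard in general. A minimal sketch:

```python
def vc_fpt(edges, k):
    """Bounded search tree for k-Vertex-Cover.
    Pick any remaining edge (u, v); a valid cover must contain u or v,
    so branch on both choices, shrinking the budget k each time.
    Depth <= k and two branches per level give O(2**k * |E|) time."""
    if not edges:
        return True   # every edge covered
    if k == 0:
        return False  # edges remain but budget exhausted
    u, v = edges[0]
    # Branch 1: put u in the cover; branch 2: put v in the cover.
    without_u = [e for e in edges if u not in e]
    without_v = [e for e in edges if v not in e]
    return vc_fpt(without_u, k - 1) or vc_fpt(without_v, k - 1)

# A 4-cycle has a vertex cover of size 2 but none of size 1.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
assert vc_fpt(edges, 2)
assert not vc_fpt(edges, 1)
```

The exponential cost is confined to the parameter k rather than the input size, which is precisely the refinement of "intractable" that parameterized complexity formalizes with the class FPT.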
Modern complexity theory continues to evolve with frameworks like average-case complexity and fine-grained complexity, but the core paradigms—classical deterministic, structural, randomized, interactive, circuit-based, quantum, descriptive, and parameterized—remain the durable technical agendas that define the subfield's history and ongoing discourse.