The subfield of signals and systems in electrical engineering centers on the representation, analysis, and processing of signals—time-varying quantities conveying information—and the systems that manipulate them. Central questions include: How can signals be modeled efficiently? What mathematical frameworks best describe system behavior? How can noise be mitigated or information extracted? The historical evolution is marked by rival methodological paradigms, each with distinct assumptions, driving transitions from analog to digital, deterministic to stochastic, and time-domain to frequency-domain approaches.
Early foundations in the late 19th and early 20th centuries were dominated by the Analog Signal Processing (ASP) paradigm, rooted in continuous-time mathematics. This school relied on differential equations, Laplace transforms, and Fourier series to analyze linear time-invariant (LTI) systems, enabling the design of analog filters and communication systems. Key figures like Oliver Heaviside and Harry Nyquist contributed operational calculus and early sampling insights, but the paradigm assumed idealized continuous signals, which limited its ability to cope with noise and growing system complexity.
The mid-20th century saw a pivotal transition with the rise of the Digital Signal Processing (DSP) paradigm, spurred by Claude Shannon's information theory and the sampling theorem. DSP introduced discrete-time representations using difference equations, z-transforms, and numerical algorithms, allowing precise, programmable signal manipulation. The Cooley-Tukey Fast Fourier Transform (FFT) in the 1960s made frequency-domain analysis computationally feasible, cementing DSP as a durable rival to ASP. This shift encoded a fundamental assumption: a bandlimited signal sampled above twice its highest frequency can be represented digitally without loss, enabling advances in telecommunications, audio, and imaging.
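The sampling-plus-FFT workflow described above can be sketched in NumPy. This is a minimal illustration; the 5 Hz tone, 64 Hz sampling rate, and one-second window are arbitrary choices made here for clarity:

```python
import numpy as np

# Sample a 5 Hz sinusoid at fs = 64 Hz, well above the Nyquist rate of 10 Hz,
# so the discrete samples fully represent the bandlimited signal.
fs = 64
t = np.arange(fs) / fs               # one second of samples
x = np.sin(2 * np.pi * 5 * t)

# FFT: the O(N log N) algorithm popularized by Cooley and Tukey.
X = np.fft.fft(x)

# With fs samples spanning one second, bin k corresponds to k Hz.
peak_bin = int(np.argmax(np.abs(X[:fs // 2])))
print(peak_bin)  # 5 — the spectral peak sits at the 5 Hz bin
```

Because the window spans exactly one second, each FFT bin maps directly to an integer frequency in hertz, which is why the tone lands cleanly in a single bin.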
Concurrently, rival analytical schools emerged. The Time-Domain Analysis school, focusing on convolution, impulse response, and state-space methods, contrasted with the Frequency-Domain Analysis school, which emphasized Fourier transforms, spectral density, and filter design in the frequency realm. These approaches often competed in education and application, with time-domain methods favored for transient analysis and frequency-domain for steady-state and filter synthesis. Another major divide was between Deterministic Signal Processing, assuming known signal models, and the Statistical Signal Processing (SSP) paradigm, which incorporated probability theory to handle random processes, noise, and uncertainty. SSP introduced techniques like Wiener filtering and Kalman filtering, becoming essential for radar, control, and communications.
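The complementarity of the time-domain and frequency-domain schools is captured by the convolution theorem: filtering an input with an impulse response in the time domain equals multiplying their spectra in the frequency domain. A minimal NumPy sketch (the FIR taps and input values are arbitrary illustrative choices):

```python
import numpy as np

# An LTI system is fully characterized by its impulse response h.
h = np.array([0.25, 0.5, 0.25])      # simple FIR smoothing filter
x = np.array([1.0, 2.0, 3.0, 4.0])   # input signal

# Time-domain view: output is the convolution of input and impulse response.
y_time = np.convolve(x, h)

# Frequency-domain view: zero-pad both to the full linear-convolution
# length, multiply their spectra, and transform back.
N = len(x) + len(h) - 1
y_freq = np.fft.ifft(np.fft.fft(x, N) * np.fft.fft(h, N)).real

print(np.allclose(y_time, y_freq))   # True — the two domains agree
```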
From the 1970s onward, further paradigm diversification occurred. The Adaptive Signal Processing (AdSP) school developed algorithms that adjust system parameters in real-time, such as the least mean squares (LMS) filter, challenging static design assumptions. The Multirate Signal Processing (MSP) paradigm addressed signals at varying sampling rates, enabling efficient compression and transmission. Additionally, Nonlinear Signal Processing methods emerged to model systems beyond LTI constraints, though linear approximations often remained dominant due to tractability.
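The LMS update mentioned above can be sketched in a few lines of NumPy. This assumes a hypothetical unknown FIR system to identify; the tap count, step size, and iteration count are illustrative choices, and real deployments would add regularization and noise handling:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical unknown system the adaptive filter must identify.
w_true = np.array([0.6, -0.3, 0.1])

n_taps, mu, n_iter = 3, 0.05, 2000
w = np.zeros(n_taps)                  # adaptive weights, start at zero
x = rng.standard_normal(n_iter + n_taps)

for n in range(n_iter):
    u = x[n:n + n_taps][::-1]         # most recent inputs, newest first
    d = w_true @ u                    # desired (reference) output
    e = d - w @ u                     # estimation error
    w += mu * e * u                   # LMS update: step along the error gradient

print(np.round(w, 3))                 # converges toward w_true
```

Unlike a statically designed filter, the weights here are never computed in closed form; they drift toward the unknown system purely by reacting to the error signal, which is precisely the challenge LMS posed to static design assumptions.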
In the late 20th century, Time-Frequency Analysis (TFA) methods such as the wavelet transform offered multi-resolution insights, bridging the time and frequency domains and rivaling traditional Fourier methods. The Compressed Sensing (CS) paradigm, arising in the 2000s, challenged Nyquist sampling assumptions by enabling signal recovery from far fewer samples when the signal is sparse in some basis, representing a modern rival to conventional DSP.
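The multi-resolution idea behind wavelets can be illustrated with one level of the Haar transform, the simplest wavelet. This is a pedagogical sketch, not how production wavelet libraries are implemented; the input values are arbitrary:

```python
import numpy as np

def haar_step(x):
    """One level of the Haar wavelet transform: split a signal into a
    coarse approximation (scaled local averages) and detail coefficients
    (scaled local differences), localizing information in both time and
    scale — something a global Fourier basis cannot do."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def haar_inverse(approx, detail):
    """Invert one Haar level; the transform is orthonormal, so
    reconstruction is exact."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
a, d = haar_step(x)
print(np.allclose(haar_inverse(a, d), x))   # True — perfect reconstruction
```

Recursing `haar_step` on the approximation coefficients yields the full multi-resolution decomposition: each level halves the time resolution while doubling the scale, the trade-off that distinguishes wavelets from the fixed-resolution Fourier transform.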
The current landscape integrates these canonical schools while embracing new influences. Traditional paradigms like DSP and SSP remain foundational, but machine learning approaches are introducing data-driven models that sometimes compete with analytical methods. Core rivalries persist, e.g., model-based versus data-driven processing, or analog versus digital implementations in low-power contexts. The subfield continues to evolve through synthesis, with educational curricula emphasizing paradigm contrasts to foster innovation. Overall, the history of signals and systems is a narrative of competing assumptions, from analog continuity to digital discretization, and from deterministic certainty to statistical inference, shaping modern technology.