Cognitive HCI emerged as a foundational subfield, applying theories from cognitive science to understand and design for human interaction with computing systems. Its initial paradigm was firmly rooted in Information-Processing Psychology, treating the human user as an information processor with limited working memory and attention. This school dominated early HCI, focusing on modeling task performance, optimizing menu structures, and reducing cognitive load through principles like consistency and mapping. The dominant framework was the Model Human Processor and its derivatives, notably GOMS and the Keystroke-Level Model, which provided a computational account of perceptual, cognitive, and motor operations to predict interaction times and errors. This established a durable agenda of engineering interfaces to fit a quantifiable, internal user model.
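To make this engineering style concrete, the Keystroke-Level Model predicts expert task time by summing standardized operator durations. The sketch below is a minimal illustration in Python: the operator values are the commonly cited estimates from Card, Moran, and Newell, while the encoded task itself is a hypothetical example rather than one drawn from the literature.

```python
# Minimal Keystroke-Level Model (KLM) sketch: predicted task time is the
# sum of standardized operator durations. Values are the commonly cited
# Card, Moran & Newell estimates; the task encoding is hypothetical.

KLM_OPERATORS = {
    "K": 0.20,  # keystroke (average skilled typist)
    "P": 1.10,  # point at a target with the mouse
    "B": 0.10,  # press or release a mouse button
    "H": 0.40,  # home hands between mouse and keyboard
    "M": 1.35,  # mental preparation
}

def predict_time(sequence: str) -> float:
    """Sum operator durations for an encoding such as 'MPBBHMKKKK'."""
    return sum(KLM_OPERATORS[op] for op in sequence)

# Hypothetical task: prepare, point to a menu item, click (press and
# release), home to the keyboard, prepare again, type four characters.
print(f"{predict_time('MPBBHMKKKK'):.2f} s")  # 5.20 s
```

Because every operator is observable and its duration fixed, competing interface designs can be compared simply by comparing their operator sequences, which is precisely the quantifiable user model this paradigm pursued.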
A significant paradigm shift occurred with the rise of Situated and Distributed Cognition. Reacting against the decontextualized nature of pure information-processing models, this school argued that cognition is not confined to the head but is distributed across people, artifacts, and environments. Influenced by activity theory and ethnographic methods, it shifted focus from abstract task analysis to understanding how tools are used in real-world practice. The framework of Distributed Cognition, developed by Edwin Hutchins, became central, analyzing how information is transformed and propagated across human and technological systems. This expanded the unit of analysis from the individual user to the entire socio-technical system, prioritizing context, embodied action, and collaborative work.
The field subsequently integrated more strongly with embodied and experiential perspectives, leading to the Embodied Interaction paradigm. Drawing from phenomenology and tangible computing, this school posits that meaning and understanding arise from our physical and social engagement with the world. It moved beyond the screen-based, representational focus of earlier work to design for tangible interfaces, physically manipulable artifacts, and whole-body interaction. This paradigm treats cognition as inherently Embodied, Embedded, and Enactive, arguing that interaction is a form of meaning-making shaped by our physicality and the affordances of interactive materials. It provided a theoretical foundation for post-WIMP interfaces, ubiquitous computing, and reality-based interaction.
In recent decades, a synthesis and formalization trend has emerged under the banner of Computational Cognitive Modeling. This paradigm seeks to bridge the gap between high-level cognitive theory and predictive engineering by building executable models of user behavior. It employs architectures like ACT-R and SOAR to simulate and predict detailed interaction patterns, creating a rigorous, generative link between theory and design. This represents a return to formal modeling but informed by the richer understandings of cognition from the situated and embodied turns, aiming to produce quantitative, testable predictions for complex interactive behavior.
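As a flavor of what such executable models look like, though without the full machinery of ACT-R or Soar, the toy sketch below generates behavior by serially firing production-like rules, each costing a fixed cognitive cycle plus the perceptual or motor time of its action. All parameters and the form-filling task are illustrative assumptions, not values taken from either architecture.

```python
# Toy architecture-flavored simulation (not ACT-R or Soar): behavior
# emerges from serially fired production-like rules, each consuming one
# cognitive cycle plus its action's perceptual/motor time. All timing
# parameters below are illustrative assumptions.

from dataclasses import dataclass

CYCLE = 0.05  # assumed cognitive cycle time, in seconds

@dataclass
class Production:
    name: str
    motor_time: float  # duration of the associated perceptual/motor action

def simulate(task: list[Production]) -> float:
    """Fire each production in sequence, accumulating predicted time."""
    clock = 0.0
    for rule in task:
        clock += CYCLE + rule.motor_time
        print(f"{clock:6.3f} s  fired {rule.name}")
    return clock

# Hypothetical form-filling task.
form_task = [
    Production("attend-field", 0.085),    # shift visual attention
    Production("retrieve-value", 0.200),  # recall the value to enter
    Production("type-value", 0.900),      # key in a short entry
    Production("click-submit", 1.200),    # point at and click the button
]

print(f"predicted completion time: {simulate(form_task):.3f} s")
```

The point of such models is their generativity: once the rules and parameters are fixed, the model yields a quantitative, testable prediction for any task it can encode, rather than a post-hoc description of observed behavior.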
Today, cognitive HCI continues as a pluralistic subfield where these major paradigms—Information-Processing Psychology, Situated and Distributed Cognition, Embodied Interaction, and Computational Cognitive Modeling—coexist and intermingle. The agenda is no longer about finding a single correct model of the user but about strategically applying these different theoretical lenses to understand the multifaceted nature of human experience with technology, from routine efficiency to skilled mastery and situated meaning.