Graphics rendering emerged as a core subfield of computer graphics, driven by the quest to generate synthetic images from digital models. Its early history was dominated by the Rasterization paradigm, which projects geometric primitives onto a pixel grid through scanline algorithms, enabling efficient real-time display in interactive systems. This approach, rooted in hardware limitations and the need for speed, established a durable school focused on incremental local computations and visibility determination via depth buffering. Parallel foundational work in illumination models, such as ambient, diffuse, and specular shading, provided the basic tools for simulating light-surface interactions within rasterized pipelines, setting the stage for more ambitious realism agendas.
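The ambient, diffuse, and specular terms mentioned above can be illustrated with a minimal Phong-style shading sketch. This is an assumption-laden toy, not any particular pipeline's implementation: `phong_shade` is a hypothetical name, all vectors are assumed to be unit length and to point away from the surface, and the light is assumed white with scalar intensity.

```python
import math

def phong_shade(normal, light_dir, view_dir,
                ka=0.1, kd=0.7, ks=0.2, shininess=32):
    """Toy ambient + diffuse + specular shading for one white light.

    Hypothetical sketch: all direction vectors are assumed unit length
    and pointing away from the shaded surface point.
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    ambient = ka
    n_dot_l = dot(normal, light_dir)
    diffuse = kd * max(0.0, n_dot_l)
    # Reflect the light direction about the normal: r = 2(n.l)n - l
    reflect = tuple(2.0 * n_dot_l * n - l
                    for n, l in zip(normal, light_dir))
    specular = ks * max(0.0, dot(reflect, view_dir)) ** shininess
    return ambient + diffuse + specular
```

With the light and viewer directly above a surface facing up, all three terms contribute fully and the coefficients sum to 1.0; with the light behind the surface, only the ambient term survives, which is exactly the local-illumination behavior rasterized pipelines relied on.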
The pursuit of photorealistic imagery catalyzed the Ray Tracing paradigm, which simulates physical light transport by tracing rays from the camera through the scene. This school, emphasizing global illumination effects like shadows, reflections, and refractions, became a rival to rasterization, favored for offline rendering where its superior image quality justified its computational intensity. Concurrently, the Radiosity paradigm gained prominence for handling diffuse interreflection in static environments, solving the rendering equation through view-independent finite element methods. These approaches collectively advanced the Global Illumination framework, which seeks to model all light paths for physical accuracy, often blending ray tracing and radiosity techniques into hybrid systems.
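The primitive operation underlying camera-ray tracing is intersecting a ray with scene geometry. A minimal sketch for the classic ray-sphere case follows; `ray_sphere` is a hypothetical helper, and the ray direction is assumed to be unit length so the quadratic's leading coefficient is 1.

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance t along the ray, or None.

    Solves |o + t*d - c|^2 = r^2 for t, assuming `direction` is unit
    length (hypothetical sketch, not a production intersector).
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * e for d, e in zip(direction, oc))
    c = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # the ray misses the sphere entirely
    sqrt_disc = math.sqrt(disc)
    # Prefer the nearer root, but only accept hits in front of the origin.
    for t in ((-b - sqrt_disc) / 2.0, (-b + sqrt_disc) / 2.0):
        if t > 1e-6:
            return t
    return None
```

A whole ray tracer is, in essence, this test repeated: primary rays from the camera find visible surfaces, then secondary rays toward lights and along reflection or refraction directions produce the shadows and mirror effects the paragraph describes.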
As computational power increased, the Physically Based Rendering (PBR) paradigm solidified as a dominant agenda, rigorously applying principles from optics and radiometry to unify shading, materials, and light transport. PBR emphasized energy conservation, microfacet theory, and measured material data, becoming a standard in both film production and real-time applications. This period also saw the Real-Time Rendering school evolve rapidly, initially relying on optimized rasterization with programmable shaders but gradually incorporating ray tracing through hardware acceleration, blurring the traditional divide between interactive and offline rendering paradigms.
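One building block of the microfacet theory mentioned above is the normal distribution function. The sketch below implements the widely used GGX (Trowbridge-Reitz) distribution under common assumptions: `ggx_ndf` is a hypothetical name, and perceptual roughness is squared to obtain alpha, a remapping many PBR systems adopt but which is a convention, not a law.

```python
import math

def ggx_ndf(n_dot_h, roughness):
    """GGX/Trowbridge-Reitz microfacet normal distribution (sketch).

    n_dot_h: cosine between the surface normal and the half-vector,
             assumed clamped to [0, 1].
    roughness: perceptual roughness; alpha = roughness**2 is an
               assumed remapping, not universal.
    """
    alpha = roughness * roughness
    a2 = alpha * alpha
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)
```

The distribution peaks when the half-vector aligns with the normal and tightens as roughness falls, which is how energy-conserving PBR materials get sharp highlights on smooth surfaces and broad ones on rough surfaces from a single parameter.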
Alongside realism-driven schools, the Non-Photorealistic Rendering (NPR) paradigm emerged as a distinct agenda focused on artistic expression, simulating styles like painting, sketching, or cel animation through algorithmic abstraction. NPR challenged the centrality of photorealism, expanding rendering's scope to communication and aesthetics. Today, graphics rendering continues to integrate these canonical frameworks, with rasterization and ray tracing converging in hybrid pipelines, while global illumination and PBR principles underpin both cinematic and interactive visual synthesis, reflecting a field shaped by enduring technical agendas.
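The cel-animation style mentioned above is often achieved by quantizing a smooth diffuse term into flat bands. A minimal sketch under stated assumptions: `toon_shade` and its `bands` parameter are hypothetical, and the input is the usual normal-dot-light cosine.

```python
def toon_shade(n_dot_l, bands=3):
    """Quantize a diffuse term into flat bands for a cel-shaded look.

    Hypothetical sketch: n_dot_l is the cosine between the surface
    normal and the light direction; bands (>= 2) is the number of
    discrete shading levels.
    """
    lit = max(0.0, n_dot_l)
    # Snap the continuous shading value down to its band, then
    # normalize so the brightest band maps back to 1.0.
    level = min(bands - 1, int(lit * bands))
    return level / (bands - 1)
```

Replacing one smooth function with a staircase is a tidy example of NPR's agenda: the algorithm deliberately discards physical accuracy to communicate form in a chosen artistic style.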