This image from EA’s SEED group shows off realistic shadows, reflections, and highlights using DXR. (credit: Project PICA PICA from SEED, Electronic Arts)
At GDC, Microsoft announced a new feature for DirectX 12: DirectX Raytracing (DXR). The new API offers hardware-accelerated raytracing to DirectX applications, ushering in a new era of games with more realistic lighting, shadows, and materials. One day, this technology could enable the kinds of photorealistic imagery that we've become accustomed to in Hollywood blockbusters.
Whatever GPU you have, whether it's Nvidia's monstrous $3,000 Titan V or the little integrated chip in your $35 Raspberry Pi, the basic principles are the same. Indeed, while many aspects of GPUs have changed since 3D accelerators first emerged in the 1990s, they've all been based on a common principle: rasterization.
Here’s how things are done today
A 3D scene is made up of several elements: there are the 3D models, built from triangles with textures applied to each triangle; there are lights, illuminating the objects; and there's a viewport or camera, looking at the scene from a particular position. Essentially, in rasterization, the camera's view is treated as a grid of pixels, a raster (hence, rasterization). For each triangle in the scene, the rasterization engine determines whether the triangle overlaps each pixel. If it does, that triangle's color is applied to the pixel. Triangles can be drawn in any order; the engine keeps a depth buffer recording how close the nearest triangle drawn at each pixel is, so when one triangle obscures another, the nearer triangle's color wins regardless of which was drawn first.
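To make the idea concrete, here is a minimal software-rasterizer sketch (purely illustrative, not the DXR or Direct3D API): for each pixel, an edge-function test decides whether a triangle covers it, and a depth buffer ensures the nearest triangle wins no matter the draw order. The function names and the data layout are my own for this example.

```python
# Illustrative software rasterizer: edge-function coverage test + depth buffer.
# Not real GPU code -- GPUs do this massively in parallel in fixed-function
# hardware -- but the logic is the same.

def edge(ax, ay, bx, by, px, py):
    # Signed area of triangle (a, b, p); its sign says which side of
    # the edge a->b the point p lies on.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize(triangles, width, height):
    # triangles: list of (v0, v1, v2, color), each vertex an (x, y, z) tuple
    # in pixel coordinates, smaller z meaning closer to the camera.
    frame = [[None] * width for _ in range(height)]
    depth = [[float("inf")] * width for _ in range(height)]
    for (v0, v1, v2, color) in triangles:
        area = edge(v0[0], v0[1], v1[0], v1[1], v2[0], v2[1])
        if area == 0:
            continue  # degenerate (zero-area) triangle
        for y in range(height):
            for x in range(width):
                px, py = x + 0.5, y + 0.5  # sample at the pixel centre
                w0 = edge(v1[0], v1[1], v2[0], v2[1], px, py)
                w1 = edge(v2[0], v2[1], v0[0], v0[1], px, py)
                w2 = edge(v0[0], v0[1], v1[0], v1[1], px, py)
                # The pixel is covered when all three edge functions
                # share the triangle's winding sign.
                inside = (w0 >= 0 and w1 >= 0 and w2 >= 0) if area > 0 \
                    else (w0 <= 0 and w1 <= 0 and w2 <= 0)
                if inside:
                    # Interpolate depth with barycentric weights.
                    z = (w0 * v0[2] + w1 * v1[2] + w2 * v2[2]) / area
                    if z < depth[y][x]:  # nearer triangle wins
                        depth[y][x] = z
                        frame[y][x] = color
    return frame

# Two overlapping triangles, deliberately drawn near-first: the depth
# buffer, not the draw order, decides which color each pixel gets.
tris = [
    ((0, 0, 1), (4, 0, 1), (0, 4, 1), "blue"),  # near, covers the corner
    ((0, 0, 5), (8, 0, 5), (0, 8, 5), "red"),   # far, covers the whole grid
]
frame = rasterize(tris, 4, 4)
print(frame[0][0], frame[3][3])  # -> blue red
```

Even though the far red triangle is drawn after the near blue one, the depth test keeps the blue triangle on top where the two overlap.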