
Byte-sized Tech Review — Realtime Graphics in Architecture

 

This tech review covers realtime graphics and rasterization: what it is, what its future looks like, and what its implications are for architects and designers.  

In short, it might be at the place where it can start replacing renders soon, so we would strongly recommend keeping this technology on your radar. 

 
Image rendered in Realtime in Unreal Engine, 2019


Realtime graphics techniques are a little different from raytracing, the technique conventionally used by renderers such as VRay or Redshift. To recap how raytracing works: it simulates rays of light bouncing around a scene (tracing rays = ray tracing). The advantage of raytracing is photorealism: it is a close-to-physically-accurate simulation of light, so you can get close-to-perfect results. The main downside is that this simulation takes quite a bit of time to converge to a noise-free image.
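To make the "tracing rays" idea concrete, here is a toy sketch of the core operation every raytracer performs: firing a ray, intersecting it with geometry, and shading the hit point. This is a deliberately minimal illustration (a single sphere and a simple Lambert term), not how VRay or Redshift work internally; all function names here are our own.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for the
    # nearest positive t (direction is assumed to be unit length).
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade(origin, direction, center, radius, light_dir):
    # One step of "light simulation": find the hit point, take the
    # surface normal, and return a simple diffuse (Lambert) term.
    t = ray_sphere_hit(origin, direction, center, radius)
    if t is None:
        return 0.0  # background: no light contribution
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    normal = tuple((h - c) / radius for h, c in zip(hit, center))
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
```

A production raytracer repeats this millions of times per frame, bouncing each ray recursively off surfaces, which is exactly why converging to a noise-free image takes so long.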

Our favorite raytracing fact is that Ikea actually renders 90% of their catalogue... Cheaters. Here are some examples of what is possible with rendering. 

 
Raytraced Human Render: Illustrating complex rendering of Skin and Hair 


 
Raytraced Glass Rendering: Illustrating rendering of complex refraction and reflection 


Raytraced kitchen environment from Ikea 


Rasterization, the algorithm used in realtime graphics, is a different process. Instead of tracing rays of light, rasterization relies on approximations of how light behaves, not simulations.

The advantage of rasterization is realtime interaction with the world: you can pick things up, look at them from different angles, and their lighting will update accordingly. Rasterization technology has mostly been driven by the game industry, since games generally require fully interactive spaces with realtime lighting. The downside of rasterization is that scenes are not quite photoreal: surfaces are not simulated with the kind of complexity that raytracing provides, which occasionally makes things look plasticky, like they are from a video game. That said, while rasterization is not perfectly photoreal, it's getting pretty darn close.
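For a sense of what "approximation, not simulation" means, here is a toy sketch of the step that gives rasterization its name: deciding which pixels a triangle covers, using edge functions. GPUs do this (in highly parallel hardware form) for every triangle in a scene, then shade the covered pixels with approximate lighting formulas rather than traced light paths. The function names and the tiny grid below are illustrative, not any engine's actual API.

```python
def edge(a, b, p):
    # Signed area test: positive when point p lies to the left
    # of the directed edge a -> b (counter-clockwise winding).
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def rasterize(tri, width, height):
    # Test every pixel center against the triangle's three edges;
    # a pixel is covered when it is inside all three.
    a, b, c = tri
    covered = []
    for y in range(height):
        for x in range(width):
            p = (x + 0.5, y + 0.5)  # sample at the pixel center
            if edge(a, b, p) >= 0 and edge(b, c, p) >= 0 and edge(c, a, p) >= 0:
                covered.append((x, y))
    return covered
```

Because this coverage test is cheap and embarrassingly parallel, a GPU can rasterize millions of triangles per frame, which is what makes realtime interaction possible in the first place.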

Realtime Arch Viz using the Enscape Rhino Plugin 


Realtime Human: From ‘The Heretic’, rendered in realtime in Unity3D 


Why Realtime is exciting: 

Realtime is exciting mostly because of how fast it's evolving compared to raytracing. While rasterization continues to boom, raytracing has somewhat stagnated: its holy grail, photorealism, was essentially solved a decade ago, so research has slowed. Most raytracing research nowadays focuses on parallelizing algorithms (i.e., speeding up renders) and improving certain surface definitions (i.e., making them more photorealistic).

Shot from Toy Story 1, 1995


Shot from Toy Story 4, 2019 


The examples above illustrate that the original raytracing technology was already impressive, and that over 25 years there have undoubtedly been improvements in rendering, but the gains are incremental. Realtime graphics, on the other hand, have been improving almost exponentially year over year since the '90s. The examples below show how far they've come.

Doom: State of the Art Realtime in 1993


Compare that to now...  

Realtime Arch Viz: Unreal Engine, 2019


Realtime Horizon Zero Dawn Screenshot: 2019


Realtime Human: Unity3D ‘The Heretic’, 2019


Realtime Unity3D: Book of the Dead, 2018 


The LAB has been working with realtime for quite some time, and many of our past projects use these engines and technologies for live interactive experiences. In the past three years, there has been an enormous push by consumer engines like Unreal and Unity into realtime raytracing and HDR rendering pipelines. This boom has driven us to continue ramping up our efforts in realtime technologies, and many of our upcoming projects utilize the photoreal aspect of these tools.

We’re also actively exploring other ways of using realtime engines for more utilitarian purposes: ideas we’ve floated around are tools to allow architects to modify lighting in VR, or a networked version of Revit models that would allow architects to gather data on how people move around and interact with a space. We’ll keep you updated on those prototypes in later blog posts. 

In short, there’s a world of potential in realtime engines, and they continue to improve year over year at a rate far above raytracing technologies. We look forward to seeing what the next decade brings with these innovations, and how other experiential design studios take advantage of them. 

Hope you enjoyed the read. 

By Brian Aronowitz, LAB Technologist

