
Cloud-Baked Lightmaps for Dynamic Indirect Illumination in VR

Nathan Zabriskie
MS Thesis Proposal
Friday, November 17, 3:00 PM
3346 TMCB
Advisor: Parris Egbert

 

Indirect light---light that has bounced at least once before reaching the viewer---adds richness and realism to rendered images, but it is expensive to simulate. Because of this high cost, most real-time renderers have historically replaced true indirect light with a series of hacks and shortcuts, which are poor substitutes for the real thing. Advances in GPU hardware and algorithm design have made it possible to begin simulating indirect lighting in real time in traditional game environments. However, these advancements have not yet made their way into virtual reality applications. In VR, the already tight timing requirements of real-time games are further restricted; in addition, any drop in framerate or lack of responsiveness can make users feel physically ill, rather than merely annoyed.
 
In this proposal we outline a system in which the full rendering pipeline is split across multiple machines. A remote server performs the expensive indirect lighting calculations and bakes the results into textures. These textures are then streamed to the client machine driving the VR device, which samples them as part of its normal rendering pipeline. By splitting the calculations in this manner, client applications gain the increased realism afforded by true indirect lighting without losing responsiveness or dropping frames.
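The client-side half of this split could be sketched as follows: the server delivers a baked lightmap, and the client adds a sampled indirect term to the direct lighting it already computes. This is a minimal illustrative sketch, not the proposal's actual implementation; the function names (`sample_bilinear`, `shade_fragment`) and the CPU-side Python representation of a lightmap are assumptions made for clarity (in practice this sampling would happen in a GPU shader).

```python
# Hypothetical sketch: a client samples a cloud-baked indirect lightmap
# and combines it with locally computed direct lighting.
# The lightmap is modeled as a 2-D grid of RGB triples.

def sample_bilinear(lightmap, u, v):
    """Bilinearly sample the lightmap at normalized coordinates (u, v)."""
    h, w = len(lightmap), len(lightmap[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0

    def lerp(a, b, t):
        return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

    top = lerp(lightmap[y0][x0], lightmap[y0][x1], fx)
    bottom = lerp(lightmap[y1][x0], lightmap[y1][x1], fx)
    return lerp(top, bottom, fy)

def shade_fragment(albedo, direct, lightmap, u, v):
    """Final color = albedo * (direct light + baked indirect light)."""
    indirect = sample_bilinear(lightmap, u, v)
    return tuple(a * (d + i) for a, d, i in zip(albedo, direct, indirect))
```

Because the baked textures arrive asynchronously, the client can always fall back to the most recent lightmap it has received; a late update degrades indirect-lighting freshness without ever stalling the frame.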