Virtual production (VP) has emerged as an effective means of film production. A VP studio combines the capture of real foreground material with virtual backgrounds displayed on LED walls. While supporting image-based lighting (IBL) and in-camera visual effects (VFX), VP constrains lighting at capture time, making downstream VFX non-trivial. Moreover, stage lighting is coupled to LED wall pose and display settings, limiting pre-visualization and leading to costly re-shoots when creative intent is not met. We address these challenges by introducing a VP 3D scene reconstruction and relighting (VSR) pipeline. It enables pre-visualization and post-capture modification of LED wall content and settings while faithfully propagating the corresponding lighting effects to foreground objects. We propose an unconstrained, geometry-independent Gaussian Splatting (GS) lighting model that encodes IBL texture sample coordinates and lighting intensities as deformable, view-dependent Gaussian parameters. By avoiding ray transmission functions, we eliminate the need for depth or normal priors, naturally support transparency and reflections, and require no custom CUDA/RTX code. Our representation is compact and efficient, requiring under 5 GB of RAM and VRAM to train 1080p scenes in under 2 hours. Objective and perceptual evaluations demonstrate state-of-the-art reconstruction and relighting capabilities, and show that mip-map-based IBL texture sampling outperforms other baselines by up to 3 dB in PSNR and 0.04 in SSIM. We show that in practice our approach reduces reliance on VP-specific hardware and enables greater creative flexibility in both pre- and post-production.
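To make the mip-map-based IBL texture sampling concrete, the sketch below shows the standard technique in a minimal NumPy form: a mip pyramid is built by repeated 2x2 box-filter downsampling, and each sample is fetched trilinearly (bilinear within two adjacent levels, then blended by the fractional level of detail). The per-Gaussian layout at the end (`u`, `v`, `lod`, `intensity`) is a hypothetical illustration of how deformable Gaussian parameters could index the LED wall texture, not the paper's exact implementation; swapping the texture then re-propagates lighting without retraining.

```python
import numpy as np

def build_mip_pyramid(tex):
    """Build a mip pyramid by repeated 2x2 box-filter downsampling.
    tex: (H, W, 3) float array; H and W assumed powers of two."""
    levels = [tex]
    while levels[-1].shape[0] > 1 and levels[-1].shape[1] > 1:
        t = levels[-1]
        levels.append(0.25 * (t[0::2, 0::2] + t[1::2, 0::2]
                              + t[0::2, 1::2] + t[1::2, 1::2]))
    return levels

def bilinear(tex, u, v):
    """Bilinearly sample tex at normalized coordinates u, v in [0, 1]."""
    h, w = tex.shape[:2]
    x = np.clip(u * (w - 1), 0, w - 1)
    y = np.clip(v * (h - 1), 0, h - 1)
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = (1 - fx) * tex[y0, x0] + fx * tex[y0, x1]
    bot = (1 - fx) * tex[y1, x0] + fx * tex[y1, x1]
    return (1 - fy) * top + fy * bot

def sample_mip(pyramid, u, v, lod):
    """Trilinear sample: bilinear within two adjacent mip levels, then blend."""
    lod = float(np.clip(lod, 0, len(pyramid) - 1))
    l0 = int(np.floor(lod))
    l1 = min(l0 + 1, len(pyramid) - 1)
    f = lod - l0
    return (1 - f) * bilinear(pyramid[l0], u, v) + f * bilinear(pyramid[l1], u, v)

# Hypothetical per-Gaussian relighting: a Gaussian stores (u, v, lod, intensity);
# replacing the IBL texture re-propagates its lighting to the foreground.
ibl = np.ones((8, 8, 3))                      # stand-in for an LED-wall texture
pyr = build_mip_pyramid(ibl)                  # 8x8 -> 4x4 -> 2x2 -> 1x1
color = 0.8 * sample_mip(pyr, 0.3, 0.7, 1.5)  # intensity * sampled radiance
```

Sampling coarser mip levels approximates a wider filter footprint, which is why the level of detail can act as a cheap proxy for how blurred the wall's contribution appears on a given Gaussian.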
@article{...}