We propose a method for photorealistic real-time relighting and novel view synthesis of objects exhibiting subsurface scattering. We learn to reconstruct the shape and translucent appearance of an object within the 3D Gaussian Splatting framework. Our method decomposes the object into its material properties in a PBR-like fashion, with an additional neural subsurface scattering residual component. To do so, we leverage our newly created multi-view, multi-light dataset of synthetic and real-world objects acquired in a light-stage setup. Our deferred shading approach achieves high-quality rendering results and enables detailed material editing. We surpass previous NeRF-based methods in training and rendering speed, as well as flexibility, while maintaining high visual quality.
Our method implicitly models the subsurface scattering appearance of an object and combines it with an explicit surface appearance model. The object is represented as a set of 3D Gaussians carrying geometry and appearance properties. We utilize a small MLP to evaluate the subsurface scattering residual given the view and light directions and a subset of the properties of each Gaussian. Within the same MLP, we jointly evaluate the incident light for each Gaussian, with visibility supervised by ray tracing. Based on the computed properties, we accumulate and rasterize each property onto the image plane in a deferred shading pipeline. For every pixel in image space, we evaluate the diffuse and specular color with a BRDF model and combine it with the SSS residual to obtain the final color of the object.
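The per-pixel shading step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the MLP layer sizes, the 8-dimensional per-Gaussian feature vector, the sigmoid output activation, and the additive combination of BRDF terms and SSS residual are all assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_sss_residual(view_dir, light_dir, features, weights):
    """Tiny two-layer MLP sketch: maps (view direction, light direction,
    per-Gaussian features) to an RGB subsurface-scattering residual.
    Layer sizes and activations are illustrative assumptions."""
    x = np.concatenate([view_dir, light_dir, features])
    h = np.tanh(weights["W1"] @ x + weights["b1"])
    out = weights["W2"] @ h + weights["b2"]
    return 1.0 / (1.0 + np.exp(-out))  # sigmoid -> residual in (0, 1)

def shade_pixel(diffuse, specular, sss_residual):
    """Deferred-shading combine: explicit BRDF terms plus the learned
    SSS residual give the final pixel color (additive combination is
    an assumption here)."""
    return np.clip(diffuse + specular + sss_residual, 0.0, 1.0)

# Hypothetical dimensions: 3D view/light directions, 8 per-Gaussian features.
feat_dim, hidden = 8, 16
weights = {
    "W1": rng.normal(scale=0.1, size=(hidden, 3 + 3 + feat_dim)),
    "b1": np.zeros(hidden),
    "W2": rng.normal(scale=0.1, size=(3, hidden)),
    "b2": np.zeros(3),
}

view = np.array([0.0, 0.0, 1.0])
light = np.array([0.5, 0.5, 0.707])
features = rng.normal(size=feat_dim)

residual = mlp_sss_residual(view, light, features, weights)
color = shade_pixel(np.array([0.2, 0.1, 0.1]), np.array([0.05] * 3), residual)
```

In the actual pipeline, the BRDF terms are evaluated from rasterized G-buffer properties per pixel, while the residual is accumulated from the per-Gaussian MLP outputs; the sketch collapses both into a single pixel for brevity.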