NeRF: The Challenge of Editing the Content of Neural Radiance Fields


Earlier this year NVIDIA notably advanced Neural Radiance Fields (NeRF) research with InstantNeRF, which is apparently capable of generating explorable neural scenes in mere seconds, where the technique frequently took hours or even days to train when it emerged in 2020.

Though this kind of interpolation produces a static scene, NeRF is also capable of depicting movement, and of basic ‘copy-and-paste’ editing, where individual NeRFs can either be collated into composite scenes or inserted into existing scenes.
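As a rough illustration of how that 'copy-and-paste' compositing works, the sketch below merges the samples of several radiance fields along a shared camera ray and volume-renders them together. The `field.query(points, view_dir)` interface, the sampling bounds and the field objects are assumptions made for the example, not any particular library's API.

```python
# A minimal sketch of NeRF compositing, assuming each scene exposes a
# query(points, view_dir) -> (rgb, sigma) function (hypothetical interface).
import numpy as np

def composite_ray(origin, direction, fields, near=0.1, far=6.0, n_samples=128):
    """Render one camera ray through several radiance fields at once."""
    t = np.linspace(near, far, n_samples)                   # shared sample depths
    pts = origin[None, :] + t[:, None] * direction[None, :]

    # Sum densities and density-weight the colours where fields overlap.
    sigma_total = np.zeros(n_samples)
    rgb_blend = np.zeros((n_samples, 3))
    for field in fields:
        rgb, sigma = field.query(pts, direction)             # assumed interface
        sigma_total += sigma
        rgb_blend += sigma[:, None] * rgb
    rgb_blend = rgb_blend / np.maximum(sigma_total[:, None], 1e-8)

    # Standard volume rendering: alpha-composite the samples front to back.
    delta = np.append(np.diff(t), 1e10)
    alpha = 1.0 - np.exp(-sigma_total * delta)
    transmittance = np.cumprod(np.append(1.0, 1.0 - alpha[:-1] + 1e-10))
    weights = alpha * transmittance
    return (weights[:, None] * rgb_blend).sum(axis=0)        # final pixel colour
```

Because both fields contribute density along the same ray, an object from one NeRF can occlude, or be occluded by, geometry in the other, which is what makes this kind of scene insertion viable without retraining either network.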

Nested NeRFs, featured in 2021 research from Shanghai Tech University and DGene Digital Technology. Source: https://www.youtube.com/watch?v=Wp4HfOwFGP4

However, if you’re looking to intervene in a trained NeRF and actually change something that’s going on inside it (in the same way you can change elements in a traditional CGI scene), the sector’s rapidly growing research interest has produced very few solutions to date, and none that even begin to match the capabilities of CGI workflows.

Though geometry estimation is essential to creating a NeRF scene, the final result is composed of fairly ‘locked’ values. While there is some progress being made towards changing texture values in NeRF, the actual objects in a NeRF scene are not parametric meshes that can be edited and played about with, but more akin to brittle and frozen point clouds.
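To make that concrete: a trained NeRF is just an implicit function F(x, d) → (colour, density) stored in network weights, so there is no object node or vertex to select. About the closest thing to a non-destructive edit is warping the query coordinates before they reach the frozen field, as in the sketch below, which shifts whatever sits inside a hand-picked bounding box. Every name here (the `field.query` interface, the box, the offset) is an illustrative assumption rather than any established editing API.

```python
import numpy as np

def query_with_translation(field, pts, view_dir,
                           box_min=np.array([-0.5, -0.5, -0.5]),
                           box_max=np.array([0.5, 0.5, 0.5]),
                           offset=np.array([0.3, 0.0, 0.0])):
    """Query a frozen radiance field, shifting the content of one box by `offset`.

    `field.query(points, view_dir) -> (rgb, sigma)` is an assumed interface;
    the box and offset are hypothetical values chosen for illustration.
    """
    pts = np.asarray(pts, dtype=np.float64).copy()
    # The moved content should appear in the box translated by `offset`...
    dest_min, dest_max = box_min + offset, box_max + offset
    inside = np.all((pts >= dest_min) & (pts <= dest_max), axis=-1)
    # ...so query points landing there are sampled at their original location.
    pts[inside] -= offset
    # Note: the content also still renders at its original location; masking
    # that region out is a further step this sketch omits.
    return field.query(pts, view_dir)
```

Even this crude trick shows why the representation is 'locked': shadows, reflections and view-dependent effects baked into the weights do not follow the displaced content, which is precisely the gap with conventional CGI editing.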
