Free NeRF toolset Nerfstudio now supports 360° video
V0.1.19 Released 🎉
— nerfstudio (@nerfstudioteam) March 12, 2023
Open-source Neural Radiance Field framework Nerfstudio now supports 360-degree video.
Nerfstudio 0.1.19 supports equirectangular video, making it possible to generate 3D scenes matching footage recorded on 360-degree cameras and export them to DCC apps for use in games, VFX or visualisation.
An open-source library of modular components for NeRF-based scene reconstruction pipelines
First released last year, Nerfstudio is a library of modular components for integrating Neural Radiance Fields (NeRFs) into computer graphics pipelines.
Developed by researchers at UC Berkeley, Neural Radiance Fields (NeRF) are a new method for training a neural network to generate and optimise a volumetric representation of a scene from a set of source images.
Although its creators focused on NeRF as a means of synthesising new views of a scene, the volumetric representation can be converted into a 3D mesh, making it an alternative to photogrammetry for 3D scanning.
Generate 3D scenes matching source images and export them as point clouds or textured meshes
Using Nerfstudio, artists can train a neural network on a set of custom images and export the 3D data it generates to DCC software, either as point clouds in PLY format or textured meshes in OBJ format.
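The PLY files that point-cloud exports produce are simple enough to inspect by hand. As a rough illustration of the container format (this is a minimal sketch of ASCII PLY, not Nerfstudio's actual exporter, which typically also writes per-point colour data):

```python
def write_ascii_ply(path, points):
    """Write a list of (x, y, z) points as a minimal ASCII PLY file.
    Real exports usually carry extra per-vertex properties such as
    colour, but the overall header/body layout is the same."""
    with open(path, "w") as f:
        # Header: magic word, format, vertex count, vertex properties
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(points)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("end_header\n")
        # Body: one vertex per line
        for x, y, z in points:
            f.write(f"{x} {y} {z}\n")

write_ascii_ply("cloud.ply", [(0.0, 0.0, 0.0), (1.0, 0.5, -0.25)])
```

Files like this open directly in most DCC apps and point-cloud viewers, which is what makes PLY a convenient interchange format for scanned data.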
As of Nerfstudio 0.1.19, users can also process videos in equirectangular format, like those captured by 360-degree cameras, making it easier to reconstruct large outdoor environments.
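Conceptually, supporting equirectangular footage means treating each pixel of a frame as a ray direction on the full sphere rather than through a pinhole camera. The sketch below illustrates that mapping in plain NumPy; it is an assumption about the general technique, not Nerfstudio's actual implementation:

```python
import numpy as np

def equirectangular_ray_dirs(width, height):
    """Map each pixel of an equirectangular frame to a unit ray
    direction. Longitude spans [-pi, pi] across the image width;
    latitude spans [pi/2, -pi/2] down the image height."""
    # Pixel-centre coordinates normalised to (0, 1)
    u = (np.arange(width) + 0.5) / width
    v = (np.arange(height) + 0.5) / height
    lon = (u - 0.5) * 2.0 * np.pi   # longitude: -pi .. pi
    lat = (0.5 - v) * np.pi         # latitude:  pi/2 .. -pi/2
    lon, lat = np.meshgrid(lon, lat)
    # Spherical to Cartesian, with y as the up axis
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    return np.stack([x, y, z], axis=-1)  # shape (height, width, 3)

dirs = equirectangular_ray_dirs(8, 4)
print(dirs.shape)  # (4, 8, 3)
```

Because every pixel covers a direction on the sphere, a single camera position captures the whole surrounding environment, which is why 360-degree footage is well suited to reconstructing large outdoor scenes.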
You can get an idea of the results from the tweet embedded at the top of this story, which shows an aerial flythrough of a NeRF environment generated from video captured on a handheld 360-degree camera.
Licensing and system requirements
Nerfstudio can be installed from source on Windows and Linux. The source code is available under an Apache 2.0 licence.
You can find a full list of dependencies and installation instructions on the project website. In particular, note that Nerfstudio uses CUDA, so you will need a compatible Nvidia GPU to run it.