
# torch-NeRF

## Overview

PyTorch implementation of *NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis* (Mildenhall et al., ECCV 2020 Oral, Best Paper Honorable Mention).

*NeRF overview. Figure from the project page of NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis, Mildenhall et al., ECCV 2020.*
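As a quick refresher on the method the paper introduces: NeRF feeds each 3D point through a positional encoding, mapping coordinates to sine/cosine features of exponentially growing frequency, before passing them to the MLP. A minimal NumPy sketch of that encoding (illustrative only; this is not the repository's actual implementation):

```python
import numpy as np

def positional_encoding(x, num_freqs=10):
    """Map coordinates to sin/cos features of increasing frequency.

    Sketch of the encoding from the NeRF paper; names and defaults
    here are illustrative, not the repository's API.
    """
    feats = []
    for k in range(num_freqs):
        feats.append(np.sin(2.0**k * np.pi * x))  # frequency 2^k
        feats.append(np.cos(2.0**k * np.pi * x))
    return np.concatenate(feats, axis=-1)

# With L = 10 frequencies, a 3D point becomes a 60-dimensional feature.
point = np.array([[0.1, -0.4, 0.7]])
encoded = positional_encoding(point)
print(encoded.shape)  # (1, 60)
```

The high-frequency features let the MLP represent fine geometric and texture detail that a network fed raw coordinates fails to capture.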

## Getting Started

To set up the Python virtual environment and install the dependencies, open a shell and run:

```shell
# Clone the repository
git clone https://github.com/DveloperY0115/torch-NeRF.git
cd torch-NeRF

# Create and activate the virtual environment
virtualenv venv -p=3.8
source venv/bin/activate

# Install dependencies
pip3 install -r requirements.txt
```

You may instead use any dependency-management tool you prefer, such as conda. Our code should reproduce the results regardless of the tool, as long as the same versions of the dependencies are installed.

To download the synthetic dataset used for training, run the following command in the shell:

```shell
sh scripts/data/download_example_data.sh
```

This creates a `data` directory under the project root (`torch-NeRF` by default) and downloads the datasets provided by the authors of NeRF (ECCV 2020).

The default configuration targets the `lego` scene from the Blender (synthetic) dataset. Run the following command to start training:

```shell
python torch_nerf/runners/run_train.py
```

Once a scene representation is learned, you can render it using the script `run_render.py` under the `torch_nerf/runners` directory. Note that you need to specify the path to the checkpoint file by editing the YAML file under `config/train_params`.
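Rendering follows the paper's volume-rendering quadrature: the densities and colors the network predicts at samples along each camera ray are composited into a single pixel color. A minimal NumPy sketch of that compositing step (all names here are illustrative, not the repository's API):

```python
import numpy as np

def composite_ray(sigmas, colors, deltas):
    """Numerical quadrature of the volume rendering integral from the NeRF paper.

    sigmas: (N,) densities at samples along a ray
    colors: (N, 3) RGB predicted at those samples
    deltas: (N,) distances between adjacent samples
    """
    alphas = 1.0 - np.exp(-sigmas * deltas)                          # per-segment opacity
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))   # transmittance T_i
    weights = trans * alphas                                         # contribution of each sample
    return (weights[:, None] * colors).sum(axis=0)                   # final pixel color

# Example: a dense red sample early along the ray occludes later samples,
# so the composited pixel is dominated by red.
sigmas = np.array([5.0, 0.1, 0.1])
colors = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
deltas = np.full(3, 0.5)
rgb = composite_ray(sigmas, colors, deltas)
```

Because the weights are differentiable in the densities and colors, the same expression drives both training (via the photometric loss) and final rendering.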

The rendering script stores the rendered images in the `render_out` directory under the project root. To create a video from the sequence of consecutive frames, use the script `scripts/utils/create_video.py`.

## Gallery

NOTE: All images shown below are produced using our code.

## Progress

## Interesting Topics
