Home Page: https://tom94.net/data/publications/mueller17practical/mueller17practical-author.pdf

Practical Path Guiding for Efficient Light-Transport Simulation

This repository contains the authors' implementation of the guided unidirectional path tracer from the research paper "Practical Path Guiding for Efficient Light-Transport Simulation" [Müller et al. 2017], along with several improvements to the algorithm that were presented in chapter 10 of the "Path Guiding in Production" SIGGRAPH'19 course. It also includes a visualization tool for the SD-trees learned by the guided path tracer. The guided path tracer is implemented within the Mitsuba physically based renderer; the visualization tool is built on the nanogui library.

No Support for Participating Media

The guided path tracer in this repository was not designed to handle participating media, although it could likely be extended to do so with little effort. In its current state, scenes containing participating media may converge slowly or fail to converge to the correct result at all.

Example Renders

Unidir. path tracing (no NEE) + Müller et al. 2017 + improvements
unidir unidir unidir

Note: the above glossy kitchen scene is not bundled in this repository due to licensing. It can be bought here.

Improvements

This repository contains the following improvements over what was presented in the paper of Müller et al. [2017]:

  • Inverse-variance-based sample combination to discard fewer samples.
  • Filtered SD-tree splatting for increased robustness.
  • Automatic learning of the BSDF / SD-tree sampling ratio via gradient descent based on the theory of Neural Importance Sampling [Müller et al. 2018].
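The inverse-variance combination weights each rendering iteration's image by the reciprocal of its estimated variance, instead of discarding all but the final iterations. A minimal sketch of the idea on scalar estimates (the function name and inputs are our illustration, not the repository's API):

```python
# Hypothetical sketch of inverse-variance-based sample combination:
# each per-iteration estimate is weighted by 1 / variance, so noisy
# early iterations contribute little but are no longer thrown away.
def combine_inverse_variance(estimates, variances):
    """Combine per-iteration estimates using inverse-variance weights."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    return sum(w * e for w, e in zip(weights, estimates)) / total
```

With equal variances this reduces to a plain average; as one iteration's variance shrinks, the combination approaches that iteration's estimate alone.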

These improvements are described in detail in chapter 10 of the "Path Guiding in Production" SIGGRAPH'19 course.
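The sampling-fraction learning can be pictured as stochastic gradient descent on a KL-divergence loss between the target distribution and the BSDF/SD-tree mixture, in the spirit of Müller et al. [2018]. The sketch below is a loose illustration, not the repository's implementation; the sigmoid parameterization and learning rate are our assumptions:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical sketch: learn the BSDF sampling fraction alpha by SGD on a
# KL-divergence loss. Parameterizing alpha = sigmoid(theta) keeps it in (0, 1).
def sgd_step(theta, f, p_bsdf, p_guide, lr=0.01):
    """One SGD step given a sample's contribution f and the two pdfs."""
    alpha = sigmoid(theta)
    p_mix = alpha * p_bsdf + (1.0 - alpha) * p_guide
    # Monte Carlo gradient of -log(p_mix), weighted by the sample's
    # importance-weighted contribution f / p_mix.
    d_alpha = -(p_bsdf - p_guide) / p_mix               # d/d_alpha of -log p_mix
    grad = (f / p_mix) * d_alpha * alpha * (1.0 - alpha)  # chain rule via sigmoid
    return theta - lr * grad
```

Intuitively, samples where the BSDF pdf explains the contribution better than the guiding pdf push the fraction up, and vice versa.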

To see the effect of each of the improvements in isolation, compare the above example renders with only the inverse-variance-based sample combination, only filtered SD-tree splatting, and only sampling-ratio learning.
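Filtered SD-tree splatting can be realized stochastically: rather than depositing a sample into exactly one leaf, its position is jittered within a box footprint before the leaf lookup, which equals a box filter in expectation at essentially no extra cost. A hypothetical sketch of the jitter step (names and footprint convention are ours):

```python
import random

# Hypothetical sketch of stochastic box filtering for SD-tree splatting:
# jitter the sample position uniformly within a box footprint and record
# the sample wherever the jittered point lands. Averaged over many samples
# this behaves like a box filter over neighboring leaves.
def jittered_splat_position(pos, footprint, rng=random.random):
    """Return pos jittered uniformly within +/- footprint/2 per axis."""
    return tuple(p + (rng() - 0.5) * f for p, f in zip(pos, footprint))
```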

Since the improvements significantly change the algorithm's behavior, they are disabled by default so the paper's results remain reproducible. To get the best results with the improvements enabled, add the following parameters to the integrator in the scene XML file:

<string name="sampleCombination" value="inversevar"/>
<string name="bsdfSamplingFractionLoss" value="kl"/>
<string name="spatialFilter" value="stochastic"/>
<string name="directionalFilter" value="box"/>
<integer name="sTreeThreshold" value="4000"/>
<integer name="sppPerPass" value="1"/>

Each bundled scene comes with two XML files: scene.xml without the above improvements, and scene-improved.xml with the above improvements.

Scenes

The CBOX scene was not shown in the paper but is included in this repository. It was downloaded from the Mitsuba website and modified such that the light source points towards the ceiling. This makes this scene a good test case for indirect diffuse illumination.

The GLOSSY KITCHEN scene (from the above images) can be bought here. It contains difficult glossy light transport that greatly benefits from path guiding and automatic MIS weight learning.

The KITCHEN scene from the paper is included in this repository. It was originally modeled by Jay-Artist on Blendswap, converted into a Mitsuba scene by Benedikt Bitterli, and then slightly modified by us. The scene is covered by the CC BY 3.0 license. The kitchen illustrates the benefit of path guiding under a mix of difficult indirect and direct illumination.

The POOL scene—created by Ondřej Karlík—is bundled with the public source code of the method by Vorba et al. [2014]. The caustics on the floor of the pool are a good showcase of the effectiveness of path guiding under high-frequency illumination.

The SPACESHIP scene was not shown in the paper but is included in this repository. It was originally modeled by thecali on Blendswap, converted into a Mitsuba scene by Benedikt Bitterli, and then slightly modified by us. Due to its mix of highly-glossy and diffuse materials, the scene is an excellent test case for learned MIS weights between path guiding and BSDF sampling. The scene is public domain.

The TORUS scene is available for download on the Mitsuba website. It was created by Olesya Jakob. The torus, situated inside of a glass cube, gives rise to difficult specular-diffuse-specular light transport that most unbiased algorithms cannot handle efficiently.

Implementation

  • The guided path tracer is implemented as the GuidedPathTracer Mitsuba integrator.
  • The visualization tool is implemented as a standalone program built on nanogui.

Modifications to Mitsuba

  • BlockedRenderProcess (renderproc.cpp, renderproc.h)
    • Allowed retrieving the total amount of blocks.
    • Disabled automatic progress bookkeeping.
  • GuidedPathTracer (guided_path.cpp)
    • Added the guided path tracer implementing [Müller et al. 2017].
    • Additionally implemented the improvements described above (inverse-variance-based sample combination, filtered SD-tree splatting, and automatic sampling-fraction learning), which are not part of the paper.
  • ImageBlock (imageblock.h)
    • Allowed querying the reconstruction filter.
  • MainWindow (mainwindow.cpp)
    • Removed a warning about orphaned rectangular work units (which occurred when multiple threads wrote into spatially overlapping blocks at the same time).
  • General
    • Added GuidedPathTracer to src/integrator/SConscript (for compilation) and src/mtsgui/resources/docs.xml (for mtsgui).
    • Changed the Visual Studio 2010 project to a Visual Studio 2013 project to make our integrator compile.
    • Removed the Irawan BSDF to make Mitsuba compile under newer GCC versions.
    • Fixed various issues in the PLY parser to make Mitsuba compile under newer GCC versions and Clang.

Modifications to nanogui

  • ImageView (imageview.cpp, imageview.h)
    • Changed the shader to display a false-color visualization of a given high-dynamic-range image.
  • General
    • Removed noexcept qualifiers to make nanogui compile under Visual Studio 2013.
    • Removed constexpr qualifiers to make nanogui compile under Visual Studio 2013.
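The modified ImageView shader maps each high-dynamic-range value through a false-color transfer function on the GPU. The exact mapping is not reproduced here; the sketch below shows one plausible log-scaled ramp (the function, its scaling constant, and the blue-to-red color choice are our illustration, not the actual shader):

```python
import math

# Hypothetical sketch of a false-color mapping for HDR scalars, similar in
# spirit to the modified ImageView shader: log-compress the dynamic range,
# then ramp from blue (low values) to red (high values).
def false_color(value, exposure=1.0):
    """Map a non-negative HDR scalar to an (r, g, b) triple in [0, 1]."""
    t = math.log2(1.0 + exposure * max(value, 0.0)) / 10.0  # compress range
    t = min(t, 1.0)
    return (t, 0.0, 1.0 - t)
```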

Compilation

Mitsuba

To compile the Mitsuba code, please follow the instructions from the Mitsuba documentation (sections 4.1.1 through 4.6). Since our new code uses C++11 features, a slightly more recent compiler and more recent dependencies than those reported in the Mitsuba documentation may be required. We only support compiling Mitsuba with the SCons build system.

We tested our Mitsuba code on

  • Windows (Visual Studio 2013 Win64, custom dependencies via git clone https://github.com/Tom94/mitsuba-dependencies-windows mitsuba/dependencies)
  • macOS (High Sierra / Mojave, custom dependencies via git clone https://github.com/Tom94/mitsuba-dependencies-macOS mitsuba/dependencies)
  • Linux (GCC 6.3.1)

Visualization Tool

The visualization tool, found in the visualizer subfolder, uses the CMake build system. Simply invoke the CMake generator on the visualizer subfolder to generate Visual Studio project files on Windows, and a Makefile on Linux / OS X.

The visualization tool was tested on

  • Windows (Visual Studio 2013-2017 Win64)
  • macOS (High Sierra)
  • Linux (GCC 6.3.1)

License

The new code introduced by this project is licensed under the GNU General Public License (Version 3). Please consult the bundled LICENSE file for the full license text.

The bundled KITCHEN scene is governed by the CC BY 3.0 license.
