Touching the latest physics-based renderer Mitsuba2 (2) Running from Python

What is Mitsuba2?

Mitsuba2 is a free physics-based renderer for academic use. It adds new features such as differentiable rendering and polarized rendering. This article is a continuation of the Introduction; it assumes that the installation of Mitsuba2 is already complete.

Other related articles I have written:

- Touching the latest physics-based renderer Mitsuba2 (1) Introduction
- Touching the latest physics-based renderer Mitsuba2 (3) Differentiable rendering
- Touching the latest physics-based renderer Mitsuba2 (4) Polarized rendering
- Difference between BRDF and polarized BRDF (pBRDF)

Run Mitsuba2 from Python

As the official documentation states, Mitsuba2 provides very powerful Python bindings, and almost all of its features can be used from Python. Of course, you can also import it in a Jupyter Notebook and develop interactively. It is also designed to interoperate with PyTorch, so optimization using PyTorch can be written very concisely.

First, let's get it running

**The procedure below is for Windows 10 (as of March 10, 2020).** The official repository provides Python rendering test code, and the goal here is to get it running. If you have finished the Introduction (1), nothing in this part should trip you up. I will explain it step by step.

Supported versions of Python

According to the official documentation, Python 3.6 or higher is supported, so update your Python if yours is older.
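As a quick sanity check (a minimal sketch of my own, not from the official docs), you can verify the interpreter version from Python itself before going any further:

```python
import sys

# Mitsuba2's Python bindings require Python 3.6 or newer.
if sys.version_info < (3, 6):
    raise RuntimeError("Python 3.6+ required, found " + sys.version.split()[0])
print("Python version OK:", sys.version.split()[0])
```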

Setting up the PATH

First, you need to put the mitsuba2 module on your PATH so that it can be imported from Python. There are several ways to do this; here we will use the script provided officially. It is assumed that the library and executables built in the Introduction (1) are in mitsuba2\build\dist\. There is a batch file called setpath.bat in the root directory, so run it from the command prompt (cmd). (If you use PowerShell, running a .bat does not propagate the environment variables back to your session, so you would need to rewrite the script.)

mitsuba2> setpath.bat

This adds the executables and Python modules to the PATH "in the cmd session where it was executed", making Mitsuba2 usable from anywhere in that session. Let's start ipython and import mitsuba2.

mitsuba2> ipython
Python 3.7.4 (tags/v3.7.4:e09359112e, Jul  8 2019, 20:34:20) [MSC v.1916 64 bit (AMD64)]
Type 'copyright', 'credits' or 'license' for more information
IPython 7.10.1 -- An enhanced Interactive Python. Type '?' for help.

In [1]: import mitsuba

If there is no import error, the PATH to the mitsuba2 module is set up correctly. **The renderer is called mitsuba2, but note that the module to import is "mitsuba".**

Now, let's run the test code right away. In the "cmd session where the script was executed", move to the directory of the sample code provided officially. It is located in mitsuba2\docs\examples\01_render_scene\.

mitsuba2> cd docs\examples\01_render_scene\
mitsuba2\docs\examples\01_render_scene>

render_scene.py in that directory is sample code that drives Mitsuba2 from Python to render a scene. Open it in your editor of choice.


import os
import numpy as np
import mitsuba

# Set the desired mitsuba variant
mitsuba.set_variant('scalar_rgb')

from mitsuba.core import Bitmap, Struct, Thread
from mitsuba.core.xml import load_file

# Absolute or relative path to the XML file
filename = 'path/to/my/scene.xml'

# Add the scene directory to the FileResolver's search path
Thread.thread().file_resolver().append(os.path.dirname(filename))

# Load the actual scene
scene = load_file(filename)

# Call the scene's integrator to render the loaded scene
scene.integrator().render(scene, scene.sensors()[0])

# After rendering, the rendered data is stored in the film
film = scene.sensors()[0].film()

# Write out rendering as high dynamic range OpenEXR file
film.set_destination_file('/path/to/output.exr')
film.develop()

# Write out a tonemapped JPG of the same rendering
bmp = film.bitmap(raw=True)
bmp.convert(Bitmap.PixelFormat.RGB, Struct.Type.UInt8, srgb_gamma=True).write('/path/to/output.jpg')

# Get linear pixel values as a numpy array for further processing
bmp_linear_rgb = bmp.convert(Bitmap.PixelFormat.RGB, Struct.Type.Float32, srgb_gamma=False)
image_np = np.array(bmp_linear_rgb)
print(image_np.shape)

**First, let's run it before trying to understand the code.** As it stands there is no scene file to render, so we need to prepare one. If you did the rendering test in the Introduction (1) you probably already have it, but otherwise git clone the sample scene data repository somewhere:

git clone https://github.com/mitsuba-renderer/mitsuba-data.git

You could point the code at the path of that local repository, but for clarity, this time copy the whole mitsuba-data/scenes/cbox directory into the current test code directory mitsuba2\docs\examples\01_render_scene\.

Then rewrite the code in just three places so that it reads this scene file and writes the output to the right place:

# filename = 'path/to/my/scene.xml'
filename = 'cbox/cbox.xml' # l.12
# film.set_destination_file('/path/to/output.exr')
film.set_destination_file('cbox/output.exr') # l.27
# bmp.convert(Bitmap.PixelFormat.RGB, Struct.Type.UInt8, srgb_gamma=True).write('/path/to/output.jpg')
bmp.convert(Bitmap.PixelFormat.RGB, Struct.Type.UInt8, srgb_gamma=True).write('cbox/output.jpg') # l.32

All that is left is to run it with Python (install numpy first with pip if you have not already):

python render_scene.py

When the rendering completes successfully, two files, cbox\output.exr and cbox\output.jpg, are generated. The .exr is, as in the Introduction (1), a 32-bit x 3-channel rendered image; the .jpg is an 8-bit x 3-channel compressed image.

(Rendered result: output.jpg)

Understand the sample code

Now that the sample code runs successfully, let's understand it step by step.

import os
import numpy as np
import mitsuba

This just imports the modules; nothing unusual here.

# Set the desired mitsuba variant
mitsuba.set_variant('scalar_rgb')

This specifies the variant (rendering configuration) of mitsuba2. Variants are explained in the Introduction (1). **The variant must be one of those selected at CMake time.** If you want to render with a variant you have not built, you have to go back to CMake and rebuild. This time we render with "scalar_rgb", the most basic variant: no SIMD vectorization (scalar) and plain RGB rays (rgb), with no spectral or polarization handling.

from mitsuba.core import Bitmap, Struct, Thread
from mitsuba.core.xml import load_file

Import Bitmap, Struct, and Thread from mitsuba.core, and load_file from mitsuba.core.xml. Each is explained below where it is used.

# Absolute or relative path to the XML file
filename = 'cbox/cbox.xml'

# Add the scene directory to the FileResolver's search path
Thread.thread().file_resolver().append(os.path.dirname(filename))

# Load the actual scene
scene = load_file(filename)

Specify the file name of the scene to load (in XML format) and load it with load_file(). Thread.thread().file_resolver() registers the scene's directory as a search path, which solves the problem that links to files inside the scene file are written as relative paths. Without this, you would get an error that object files cannot be found.
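As a side note (my own illustration, not part of the official sample), the reason os.path.dirname(filename) is passed to the file resolver is simply that it extracts the scene's directory, which is then searched when resolving the relative paths inside the XML:

```python
import os

# The scene path used in the sample code.
filename = 'cbox/cbox.xml'

# os.path.dirname() strips the file name, leaving the directory
# that the FileResolver should search for relative references
# (meshes, textures, etc. referenced by the scene XML).
scene_dir = os.path.dirname(filename)
print(scene_dir)  # → cbox
```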

# Call the scene's integrator to render the loaded scene
scene.integrator().render(scene, scene.sensors()[0])

Call the integrator (integrator) to render the loaded scene. The first argument is the scene (scene) and the second is the sensor that does the rendering (scene.sensors()[0], as described in the scene file). The [0] lets you pick one of possibly several sensors.

# After rendering, the rendered data is stored in the film
film = scene.sensors()[0].film()

# Write out rendering as high dynamic range OpenEXR file
film.set_destination_file('cbox/output.exr')
film.develop()

Load the film (scene.sensors()[0].film()) of the sensor described in the scene file. The film is responsible for configuring data output and post-processing. Set the output file name with film.set_destination_file(), then develop cbox\output.exr with film.develop().

# Write out a tonemapped JPG of the same rendering
bmp = film.bitmap(raw=True)
bmp.convert(Bitmap.PixelFormat.RGB, Struct.Type.UInt8, srgb_gamma=True).write('cbox/output.jpg')

This is the 8-bit development step. film.bitmap(raw=True) returns a bitmap object holding the pre-development contents of the film. **The meaning of the raw argument is unclear, as it is not yet officially documented.** convert() then performs the 8-bit development on the bitmap object: Bitmap.PixelFormat.RGB selects RGB, Struct.Type.UInt8 selects 8 bits, and srgb_gamma=True applies the sRGB gamma tonemap.

# Get linear pixel values as a numpy array for further processing
bmp_linear_rgb = bmp.convert(Bitmap.PixelFormat.RGB, Struct.Type.Float32, srgb_gamma=False)
image_np = np.array(bmp_linear_rgb)
print(image_np.shape)
# (256, 256, 3)

Using the same convert(), this obtains the linear RGB values, without 8-bit compression, as a numpy array. It has nothing to do with the output files, but **it shows that the render result can easily be obtained as a numpy array for further processing.**
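To illustrate what the sRGB gamma tonemap above actually does (a minimal numpy-only sketch of my own, not Mitsuba2's internal implementation), the standard linear-to-sRGB conversion followed by 8-bit quantization looks roughly like this:

```python
import numpy as np

def linear_to_srgb_u8(img):
    """Apply the standard sRGB transfer curve and quantize to uint8."""
    img = np.clip(img, 0.0, 1.0)
    # Piecewise sRGB curve: linear segment near black, power law above.
    srgb = np.where(img <= 0.0031308,
                    12.92 * img,
                    1.055 * np.power(img, 1.0 / 2.4) - 0.055)
    return np.round(srgb * 255.0).astype(np.uint8)

# A tiny fake "linear render result" (H x W x 3), standing in for image_np.
linear = np.array([[[0.0, 0.5, 1.0]]], dtype=np.float32)
print(linear_to_srgb_u8(linear))  # black stays 0, white maps to 255
```

This is the kind of post-processing the numpy array makes possible; in the sample code itself, convert() with srgb_gamma=True does the equivalent work for you.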

Summary

We ran the sample code and walked through its contents to understand how to drive Mitsuba2 from Python. I hope this showed that basic rendering can be done from Python with a very simple script. Next time, I will explain how to use the new features, such as differentiable rendering and polarized rendering.

- Touching the latest physics-based renderer Mitsuba2 (3) Differentiable rendering
- Touching the latest physics-based renderer Mitsuba2 (4) Polarized rendering

That's all.
