In the last tutorial, we looked at a very simple example usage of Mitsuba with only a single sphere primitive. We will now look at a more complete scene that also describes other entities such as the film, integrator, and bsdf. So let's get started.
In this scene, we will render a plane primitive with a checkered material. We will introduce the following new scene elements.
- integrator (for the rendering algorithm used)
- sensor (for the camera settings like fov)
- film: a sub-entity of the sensor element (for the output image resolution and type)
- sampler: a sub-entity of the sensor element (for the number of camera samples and the type of sampling distribution to be used)
- plane (for rendering of our plane shape)
- bsdf: a child entity of our plane shape (for the checkered material)
Integrator:
This element tells Mitsuba the type of rendering algorithm to be used for the scene. For the first few tutorials, we will stick with the path tracing algorithm (type="path") for the integrator element. For this scene, the integrator element is defined as follows:
  <!-- Setup scene integrator -->
  <integrator type="path">
    <!-- Path trace with a max. path length of 5 -->
    <integer name="maxDepth" value="5"/>
  </integrator>
Sensor:
This element is analogous to a camera object. For this scene, we will use a perspective camera (type="perspective"). Its transform sub-element controls the camera's position and orientation, the focusDistance element controls the distance to the camera's focal plane, and the fov element controls the camera's field of view.
Film:
This is a sub-element of the sensor element. Its main attribute, type, controls the output format (ldrfilm or hdrfilm). If the hdrfilm type is used, the output is written as an EXR file. The width and height sub-elements control the output image dimensions.
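As an aside, if a low dynamic range output is preferred instead of EXR, the film block could be swapped for an ldrfilm along the following lines. This is only a sketch: the exact set of ldrfilm parameters (output file format, gamma, etc.) depends on your Mitsuba version, so check the documentation before relying on it.
  <!-- Hypothetical alternative film: low dynamic range output instead of EXR -->
  <film type="ldrfilm">
    <integer name="width" value="640"/>
    <integer name="height" value="480"/>
  </film>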
For this scene, the sensor element is defined as follows:
  <sensor type="perspective">
    <transform name="toWorld">
      <translate x="0" y="0" z="-1"/>
    </transform>
    <float name="focusDistance" value="1"/>
    <float name="fov" value="45"/>
    <film type="hdrfilm">
      <integer name="width" value="640"/>
      <integer name="height" value="480"/>
    </film>
    <sampler type="independent">
      <integer name="sampleCount" value="32"/>
    </sampler>
  </sensor>
Plane:
The plane shape is defined with type="rectangle". The position, orientation, and scale of the plane are controlled through the transform sub-element: to orient the plane properly, we rotate it, then scale it, and finally translate it. The material is controlled by the bsdf sub-element; here we have defined a diffuse material. The bsdf element has a checkerboard texture sub-element whose uvscale of 32 is the texture tiling amount, color0 and color1 are the two checker colors, and filterType controls the texture filtering scheme to be used (an alternative filter setting is sketched after this list). The filter type can be one of:
- EWA (anisotropic EWA filtering, the default filter type)
- trilinear (trilinear filtering)
- nearest (nearest-neighbor filtering)
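For instance, a hypothetical variant of the texture shown in the shape below could request trilinear filtering instead of EWA; everything except the filterType value is taken from the scene as given.
  <texture type="checkerboard" name="reflectance">
    <float name="uvscale" value="32"/>
    <rgb name="color0" value="0,0,0"/>
    <rgb name="color1" value="1,1,1"/>
    <!-- Assumed variant: trilinear filtering instead of the default EWA -->
    <string name="filterType" value="trilinear"/>
  </texture>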
For this scene, the shape element is defined as follows:
  <shape type="rectangle">
    <transform name="toWorld">
      <rotate x="1" angle="-90"/>
      <scale x="2" y="2" z="2"/>
      <translate y="-0.1"/>
    </transform>
    <bsdf type="diffuse">
      <texture type="checkerboard" name="reflectance">
        <float name="uvscale" value="32"/>
        <rgb name="color0" value="0,0,0"/>
        <rgb name="color1" value="1,1,1"/>
        <string name="filterType" value="EWA"/>
      </texture>
    </bsdf>
  </shape>
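To show how these pieces fit together, here is a minimal sketch of the fragments above assembled into a single scene file. The version attribute value is an assumption and should match your Mitsuba installation; the downloadable file linked below is the authoritative version and may contain elements not discussed in this tutorial.
<?xml version="1.0" encoding="utf-8"?>
<!-- Sketch: all fragments from this tutorial combined into one file.
     The version value "0.5.0" is assumed; use the one your Mitsuba build expects. -->
<scene version="0.5.0">
  <!-- Setup scene integrator -->
  <integrator type="path">
    <integer name="maxDepth" value="5"/>
  </integrator>

  <!-- Perspective camera with film and sampler -->
  <sensor type="perspective">
    <transform name="toWorld">
      <translate x="0" y="0" z="-1"/>
    </transform>
    <float name="focusDistance" value="1"/>
    <float name="fov" value="45"/>
    <film type="hdrfilm">
      <integer name="width" value="640"/>
      <integer name="height" value="480"/>
    </film>
    <sampler type="independent">
      <integer name="sampleCount" value="32"/>
    </sampler>
  </sensor>

  <!-- Checkered plane -->
  <shape type="rectangle">
    <transform name="toWorld">
      <rotate x="1" angle="-90"/>
      <scale x="2" y="2" z="2"/>
      <translate y="-0.1"/>
    </transform>
    <bsdf type="diffuse">
      <texture type="checkerboard" name="reflectance">
        <float name="uvscale" value="32"/>
        <rgb name="color0" value="0,0,0"/>
        <rgb name="color1" value="1,1,1"/>
        <string name="filterType" value="EWA"/>
      </texture>
    </bsdf>
  </shape>
</scene>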
The complete scene description above produces the following output.
[Image: Output from the CheckeredPlane.xml scene]
The scene file may be downloaded from here: https://www.dropbox.com/s/dtz7y9svuov7gxk/CheckeredPlane.xml 
 
 
 