Houdini Tutorial Week 4

This week's classes covered materials, lighting and rendering. These work much like their counterparts in other 3D software, so they were not difficult to understand. I then learned about some of Houdini's own shaders and its rendering engine (Mantra), and how to render particles.

  1. RGB is an acronym for red, green and blue.
  2. An “RGB colour space” generally refers to the full range of colours a piece of hardware or software can represent.
  3. sRGB is one specific type of RGB.
  4. sRGB is very popular, but its colour gamut is quite limited.
  5. Adobe RGB is a wider colour space designed to cover most of the sRGB and CMYK gamuts.
  6. ProPhoto RGB is an even wider colour space, generally used for colour management.
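The difference between sRGB-encoded values and linear-light values can be made concrete with the standard sRGB transfer functions. A minimal Python sketch (the function names are mine):

```python
def srgb_to_linear(s):
    """Decode an sRGB-encoded channel value in [0, 1] to linear light."""
    if s <= 0.04045:
        return s / 12.92
    return ((s + 0.055) / 1.055) ** 2.4

def linear_to_srgb(l):
    """Encode a linear-light channel value in [0, 1] back to sRGB."""
    if l <= 0.0031308:
        return l * 12.92
    return 1.055 * l ** (1 / 2.4) - 0.055

# Mid-grey in sRGB (0.5) is much darker in linear light (~0.214),
# which is why rendering maths must happen in linear space.
print(round(srgb_to_linear(0.5), 3))  # 0.214
```

This is the "linear workflow" in miniature: decode textures to linear, do the lighting maths, encode the result for display.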

ACES

ACES is a color system that’s meant to standardize how color is managed from all kinds of input sources (film, CG, etc), and provide a future-proof working space for artists to work in at every stage of the production pipeline. Whatever your images are coming from, you smoosh them into the ACES color standard, and now your whole team is on the same page.

For CG artists, a big benefit is the ACEScg color gamut, which is a nice big gamut that allows for a lot more colors than ye olde sRGB. Even if you’re working in a linear colorspace with floating-point renders, the so-called “linear workflow”, your color primaries (what defines “red”, “green” and “blue”) are likely still sRGB, and that limits the number of colors you can accurately represent.

Displacement Rendering

  • Displacement mapping is usually used to represent height variation on an object’s surface at render time.
  • It works by moving each point along the surface normal by a distance defined in the map.
  • It gives a texture the ability to express real detail and depth.
  • It also allows self-occlusion, self-shadowing and changes to the silhouette.
  • On the other hand, compared with similar techniques (such as bump or normal mapping), it is the most expensive, since it needs a lot of additional geometric information.
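The second bullet — moving a point along its normal by the map value — can be sketched in a few lines of Python (a hypothetical helper, assuming unit normals):

```python
def displace(point, normal, height, scale=1.0):
    """Move a point along its unit normal by the sampled height times a scale."""
    return tuple(p + n * height * scale for p, n in zip(point, normal))

# A point on a unit sphere, normal pointing along +X, displaced outward by 0.25:
p = displace((1.0, 0.0, 0.0), (1.0, 0.0, 0.0), height=0.25)
print(p)  # (1.25, 0.0, 0.0)
```

A real renderer samples `height` from the displacement map at the point's UVs; here it is just a number.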

Rendering Engine

RenderMan has a powerful shader compiler and high-quality motion blur, which let designers create ultra-complex action shots. It also has a quality that cannot be ignored: realism. RenderMan can render photorealistic images, so it is very popular in industry. RenderMan-compatible renderers are widely used in high-end moving-image production thanks to their excellent rendering quality and speed. In high-end fields such as animated films and visual effects, RenderMan-compatible renderers are an indispensable rendering solution, and world-famous production companies like ILM and Sony use RenderMan as one of their final rendering solutions.

Redshift supports per-light include/exclude lists, and its render sampling is adjustable. It is fast and fully featured, with some dazzling capabilities: fog, volumetric lighting, proxies and instancing. Redshift’s efficient memory management allows rendering scenes with hundreds of millions of polygons and terabytes of texture data. Using both point-based GI and brute-force GI, it can compute indirect lighting extremely quickly. By exploiting the GPU and intelligent sampling techniques, Redshift has become one of the fastest renderers available. Users can export objects and light groups to Redshift proxy files, which can easily be referenced by other scenes. Proxies allow overrides of shaders, masks and visibility flags, which are often required in production.

Arnold is currently a CPU renderer (a GPU version is under development): a film-grade rendering engine based on physical algorithms, and one that is very well suited to artists. In animated films and effects-heavy blockbusters, Arnold is everywhere, and it excels at complex projects of all kinds. Arnold’s design framework can be easily integrated into an existing production pipeline. It is built on a pluggable node system: users can extend and customize it by writing new shaders, cameras, filters, output nodes, procedural models, light types and user-defined geometric data. The goal of the Arnold framework is to provide a complete solution for animation and VFX rendering.

Mantra is the highly advanced renderer included with Houdini. It is a multi-paradigm renderer, implementing scanline, raytracing, and physically-based rendering. You should use the physically based rendering engine unless you have a good reason to use another engine. Mantra has deep integration with Houdini, such as highly efficient rendering of packed primitives and volumes.

Mantra and Rendering

  • Nodes corresponding to renderers and scene description formats (such as the Mantra node and RenderMan node). These nodes output scene description files and call the appropriate renderer to render the file.
  • Nodes for generating other outputs, such as the Geometry node which “renders” the scene geometry to a geometry format such as .bgeo or .obj.
  • Utility nodes to control renders and dependencies. For example, you can use the Merge node to sequence renders. You can use the Switch node to switch between different render nodes based on an expression.

The mantra output driver node uses mantra (Houdini’s built-in renderer) to render our scene. We can create a new mantra node by choosing Render ▸ Create render node ▸ Mantra from the main menus. In general, rendering in Houdini uses a camera defining the viewpoint to render from, lights to illuminate the scene, and a render node representing the renderer and render settings to use. However, we can still make preview renders using the current view, a headlight, and default render settings.

We can do most of our work in the Render view to see an interactively updating render. This lets us assign materials, change render node and shader node parameters, and see the results as we work. In the render node’s parameter editor, click Render to Disk or Render to MPlay.

The Valid Frame Range menu controls whether this render node renders single frames or sequences (animations). Choose Any frame to render single frames. Choose Frame range to render a sequence. Houdini uses a mathematical pinhole camera to simulate a camera. Because a pinhole camera does not have in-camera effects such as depth of field and bokeh, we must explicitly tell mantra to simulate them. The size of the rendered image is controlled by a parameter on the camera.
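The pinhole camera model mentioned above boils down to a simple perspective divide. A sketch with a hypothetical function, assuming a camera at the origin looking down -Z:

```python
def project(point, focal=1.0):
    """Project a camera-space point (x, y, z), with z < 0 in front of the
    camera, through the pinhole onto an image plane at distance `focal`."""
    x, y, z = point
    return (focal * x / -z, focal * y / -z)

# A point 4 units in front of the camera lands at a quarter of its x offset:
print(project((1.0, 2.0, -4.0)))  # (0.25, 0.5)
```

Because every ray passes through a single point, everything is in perfect focus — which is exactly why depth of field and bokeh have to be simulated separately.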

Mantra attribute

  • Render to Disk — Renders with the last render control settings, using the path specified in Output Picture.
  • Render to MPlay — Renders with the last render control settings, redirecting rendered frames to MPlay, instead of the specified path. If enabled, deep images and cryptomatte images will still be written out to their specified output path.

Controls whether this render node outputs the current frame (Render any frame) or the image sequence specified in the Start/End/Inc parameters (Render Frame Range). Render Frame Range (strict) will render frames START to END when it is rendered, but will not allow frames outside this range to be rendered at all. Render Frame Range will allow outside frames to be rendered. This is used in conjunction with render dependencies. It also affects the behaviour of the ‘Override Frame Range’ in the Render Control dialog.

  • Render Current Frame — Renders a single frame, based on the value in the playbar or the frame that is requested by a connected output render node.
  • Render Frame Range — Renders a sequence of frames. If an output render node is connected, this range is generally ignored in favor of frames requested by the output render node.
  • Render Frame Range (Strict) — Renders a sequence of frames. If an output render node is connected, this range restricts its requested frames to this frame range.

Noise Level

  • Represents a threshold in the amount of variance allowed before mantra will send more secondary rays. Variance essentially represents how “spread out” the values in a set of samples are. For instance, a set of samples that were all the same would have a variance of 0. It is generally a good idea to keep this value as high as possible so that rays are sent only into those areas where an unacceptable amount of noise is present.
  • Adding “direct samples” and “indirect samples” image planes can help us track how many samples are being sent and to which parts of the image. For more information about sampling, see the “Sampling and Noise” section.
  • If we find that certain objects in our scene require substantially more samples than other parts of our image and we are unable to “target” those objects using the Noise Level parameter, it may be a better idea to add per-object sampling parameters to the problem areas.
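The variance test described in the first bullet can be sketched like this (an illustration of the idea, not Mantra's actual implementation):

```python
def variance(samples):
    """Population variance: how 'spread out' a set of sample values is."""
    mean = sum(samples) / len(samples)
    return sum((s - mean) ** 2 for s in samples) / len(samples)

def needs_more_rays(samples, noise_level):
    """Send more secondary rays only where variance exceeds the threshold."""
    return variance(samples) > noise_level

# Identical samples have zero variance, so no extra rays are needed:
print(needs_more_rays([0.5, 0.5, 0.5, 0.5], noise_level=0.01))  # False
# Widely scattered samples exceed the threshold, so more rays are sent:
print(needs_more_rays([0.1, 0.9, 0.2, 0.8], noise_level=0.01))  # True
```

Raising the noise level raises the threshold, so fewer pixels trigger extra rays — faster, but noisier.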

Diffuse Quality

Controls the quality of indirect diffuse sampling (for information on the difference between direct and indirect rays, see sampling and noise). Often, indirect sources of light (such as the surfaces of other objects, and light scattered inside of a volume) will be a significant cause of noise in your renders. Turning this up should decrease this type of noise, at the cost of slowing down rendering.

Lighting

Environment lights illuminate the scene from a virtual hemisphere (or sphere) beyond the farthest geometry in the scene. Environment lights can be rotated to orient directional illumination, but they cannot be translated. An environment light may use a texture map to provide HDRI illumination from an environment map. With no rotation, the environment map is oriented so that the top face aligns with the positive Y axis. The environment map controls the colour and intensity of light arriving from different directions.

HDR (Resource https://hdrihaven.com/hdris/?c=all)

An HDR map is an environment map with a high dynamic range, used in 3D and other imaging software. Generally, an HDR map is a “seamless map” made from “HDR photos” (a seamless map is a picture whose edges tile up, down, left and right without any visible seams or traces). HDR maps are usually of natural scenery or indoor environments.

An HDR map is an image containing high-dynamic-range lighting information: an ordinary image stores 8 bits per channel, while an HDR map stores 32-bit floating point. In other words, it has more tonal gradations and richer detail. A high-dynamic-range image is closer to the dynamic range of the human eye, and can even exceed it. In short, it is a photo with rich detail in both the bright and the dark areas. It is created in image-processing software, to make up for a camera’s limited dynamic range, by combining multiple photos taken from the same position at different exposures.
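The difference between 8-bit and floating-point storage can be demonstrated in a few lines (a hypothetical helper):

```python
def quantize_8bit(value):
    """Store a channel in 8 bits: clamp to [0, 1], then round to 1/255 steps."""
    clamped = min(max(value, 0.0), 1.0)
    return round(clamped * 255) / 255

sun = 16.0                       # a pixel 16x brighter than "white"
print(quantize_8bit(sun))        # 1.0  -- everything above white is clipped
print(sun * 0.1)                 # 1.6  -- the float pixel survives an exposure drop
print(quantize_8bit(sun) * 0.1)  # 0.1  -- the clipped pixel just turns grey
```

This is why an HDR environment map can act as a believable light source: the sun in the image really is thousands of times brighter than the sky around it.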

HDR maps are needed when rendering architecture, interiors, still life, machinery, film and post-production work. They play two important roles. First, as an environment background (for example, the blue sky, white clouds and trees behind a rendered building). Second, as the illumination and reflection source for the rendered model: when rendering highly reflective models such as cars or stainless steel, an HDR map used as the environment light not only lights the object like a reflector, but also produces rich, realistic reflections on its surface.

Light Objects are those objects which cast light onto other objects in a scene. With the light parameters you can control the colour, shadows, atmosphere and render quality of objects lit by the light. Lights can also be viewed through and used as cameras.

  • Point — A light that emits light from a specific point in space defined by the transform for the light.
  • Line — A line light extending from (-0.5, 0, 0) to (0.5, 0, 0) in the space of the light.
  • Grid — A rectangular grid from (-0.5, -0.5, 0) to (0.5, 0.5, 0) in the space of the light.
  • Disk — A disk shaped light. The disk is a unit circle in the XY plane in the space of the light.
  • Sphere — A sphere shaped light. The sphere is a unit sphere in the space of the light.
  • Tube — A tube shaped light. The first parameter of Area Size controls the height of the tube and the second controls the radius.
  • Geometry — Use the object specified by the Geometry Object parameter to define the shape of the area light.
  • Distant — A directional light source infinitely far from the scene. Distant light sources cast sharp shadows, and so are candidates for the use of depth map shadows.
  • Sun — A finite sized (non-point) directional light source infinitely far from the scene. Sun lights are similar to distant lights with the exception that they produce a penumbra – similar to the actual sun with Soft shadows.

Colour — The colour of the light source.

Intensity — The linear intensity of the light source. If the intensity is 0, the light is disabled. In this case, the light will only be sent to the renderer if the object is included in the Force Lights parameter of the output driver.

Exposure — Light intensity as a power of 2. Increasing the value by 1 will double the energy emitted by the light source. A value of 0 produces an intensity of 1 at the source, -1 produces 0.5. The result of this is multiplied with the Intensity parameter.
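The Intensity/Exposure relationship above can be written as a one-liner; the printed values match the ones quoted in the parameter description:

```python
def emitted_energy(intensity, exposure):
    """Final light energy: Intensity scaled by 2 raised to the Exposure."""
    return intensity * 2.0 ** exposure

print(emitted_energy(1.0, 0))   # 1.0
print(emitted_energy(1.0, -1))  # 0.5
print(emitted_energy(1.0, 1))   # 2.0
```

Exposure is convenient because it works in photographic stops: +1 doubles the light, -1 halves it, regardless of the base intensity.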

The combination of the two lights — presenting the detail of the HDR map together with the colour and character of a traditional light.

https://www.sidefx.com/docs/houdini/render/sampling.html

Material and Arnold

(resource: https://docs.arnoldrenderer.com/display/A5AFHUG/Sampling)

Principled Shader — The goal of this shader is to produce physically plausible results while using intuitive rather than physical parameters. A large number of materials can be created with relatively few parameters. All parameters are in the zero to one range and represent plausible real-world values within that range. Textures can be applied to all relevant parameters. Note that the texture value is always multiplied with the value of the parameter.

Two ways to assign the material

Set the Material parameter on the object, or go inside the object and add and connect a Material node.

Priority

Arnold

Rendered with Arnold, using an Arnold light.

Add the UV Project node

UVProject creates the UV texture attribute if it does not already exist. The attribute class (Vertices or Points) is determined by the Group Type. It is recommended that UVs be applied to vertices, since this allows fine control on polygonal geometry and the ability to fix seams at the boundary of a texture.
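A planar projection of the kind UV Project performs can be sketched as an orthographic projection normalized by the bounding box (a hypothetical function, projecting onto the XY plane):

```python
def planar_uv(point, bbox_min, bbox_max):
    """Project a point onto the XY plane and normalize x and y to [0, 1]
    using the geometry's bounding box, giving (u, v) texture coordinates."""
    u = (point[0] - bbox_min[0]) / (bbox_max[0] - bbox_min[0])
    v = (point[1] - bbox_min[1]) / (bbox_max[1] - bbox_min[1])
    return (u, v)

# A point at the centre of a 2x2x2 box maps to the middle of the texture:
print(planar_uv((0.0, 0.0, 1.0), (-1.0, -1.0, -1.0), (1.0, 1.0, 1.0)))  # (0.5, 0.5)
```

Note that z is discarded: every point with the same x and y gets the same UV, which is why planar projections stretch on faces parallel to the projection axis.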

The best way to visualize the effects on UVs is in the UV view. To change a viewport to show UVs, click the View menu in the top right corner of a viewport and choose Set View ▸ UV viewport.

Modify the UVs to show the texture correctly
Glass material

Render Particles

Create an Arnold render node in /out, an Arnold skydome light (texture — HDR image) in /obj, and an Arnold material builder (with a standard surface) in /mat.

Since the Crag model comes with its own material, we should add an Attribute Delete node to remove the Mantra material before assigning the Arnold material.

To give different shaders to different parts of the same object:

Add a Material node inside the geometry, select the group, and assign the material (to find the group name, check the names of the separated objects).

Group — A list of primitives (or points, if Attributes is set to Point attributes), or the name of a group, to assign the material to.

Number of Materials — increase it to two

Number of materials — The number of materials to assign. This is useful for assigning materials to various groups of primitives. You cannot layer materials – if you assign multiple materials to the same primitive, the last material will override the previous ones.
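The "last material overrides" rule behaves like repeated assignment to the same key — a tiny sketch with hypothetical group and material names:

```python
# Material slots are applied in order; a later assignment to the same
# primitives simply overwrites the earlier one -- materials do not layer.
assignments = [("piece0", "concrete"), ("piece0", "glass")]

final = {}
for group, material in assignments:
    final[group] = material

print(final["piece0"])  # glass -- the last assigned material wins
```

So if a primitive belongs to two groups listed in the Material node, only the material of the later slot shows up in the render.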

Roughness — changes the highlights and reflections

Curvature — This VOP computes the amount of surface curvature, with white in convex areas, black in concave areas and 50% gray in flat areas. This is useful for masking wear like scratches and dents, which often happen on raised edges.
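Using a ramp to turn curvature into a wear mask can be sketched with a piecewise-linear ramp (hypothetical keys; in Houdini this would be the Ramp parameter feeding the shader):

```python
def ramp(t, keys):
    """Piecewise-linear ramp: keys is a sorted list of (position, value) pairs."""
    if t <= keys[0][0]:
        return keys[0][1]
    for (p0, v0), (p1, v1) in zip(keys, keys[1:]):
        if t <= p1:
            return v0 + (v1 - v0) * (t - p0) / (p1 - p0)
    return keys[-1][1]

# Wear only the most convex areas: curvature below 0.7 stays clean.
wear_keys = [(0.7, 0.0), (1.0, 1.0)]
print(ramp(0.5, wear_keys))  # 0.0 -- flat or concave: no wear
print(ramp(1.0, wear_keys))  # 1.0 -- sharp convex edge: full wear
```

The resulting mask can then blend a worn material over the base standard surface, which is what the next step does with the ramp.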

Add a ramp and use it to combine the curvature with the standard surface

Copy the ramp and connect it to specular roughness to give the specular more detail

Add the Arnold material to the particles

At first, I found that I didn’t have Arnold. Later, I added an attribute to the parameter and it appeared.

Change arnold ▸ points ▸ mode (it can be set to sphere / disk / square) and the point scale

  • user_data_rgb ▸ Cd — connect to the base color
  • user_data_float ▸ life
  • user_data_float ▸ age
  • Connect user_data_float ▸ life and user_data_float ▸ age to the input and input max
  • Add a ramp and a ramp_rgb

Motion blur

render sequence

I found that Arnold’s render time was about 2 minutes per image. I thought that was too long, because it would take more than 8 hours to render 400 images. So I asked Mehdi how to speed it up, and he told me I could reduce the light sampling value. At the same time, I felt the particles were so big that I could see individual round particles, and Mehdi told me I could try changing the mode to disk. Then I rendered it again, and the effect was slightly different.

Render Destruction

Last week, I didn’t complete the destruction of different materials because I couldn’t find the right group. Then I asked Mehdi and found the solution, which was to enable one option.

Blast — set the Group (select the separated object) and enable Delete Non Selected

Then I gave them different colors and tried to render with mantra.

  • roof & ground — concrete
  • glass — glass
  • wall — wood

Wood material

Assemble — This operator is used to finish the process of breaking a piece of geometry. It uses the groups and connected pieces created by the Break operator to output a set of disconnected pieces.

Block Begin & Block End

Ball transform node
This entry was posted in Houdini & Lighting.
