In this short tutorial, I’ll share some efficient tips for driving the properties of individual scattered instances. There are several ways to drive scattered instances in Clarisse, but the question is: how do we drive scattered instances individually depending on their position?
This video will first show you how to use the TextureSupportColor and InstanceColor nodes to drive the color of each scattered instance.
Then, given that some attributes of the scatterer itself can be color/texture driven, it will walk you through some ways of using the TextureSupportColor node to modulate these attributes over each individual scattered instance.
Among other things, this tutorial covers tips such as masking and reordering color channels with the TextureReorder node, to change color-driven attributes without modifying the attribute values themselves.
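To make the channel-masking idea concrete, here is a minimal Python sketch of what a TextureReorder-style swizzle does conceptually. The function name and the swizzle-string convention are assumptions for illustration only, not Clarisse's actual API:

```python
def reorder_channels(color, channel_order):
    """Return a new tuple whose channels are picked from `color`
    according to a swizzle string such as "rrr" or "gbr".
    This mimics the idea behind a channel-reorder node: the source
    color is left untouched, only the routing of channels changes."""
    index = {"r": 0, "g": 1, "b": 2, "a": 3}
    return tuple(color[index[c]] for c in channel_order)

base = (0.9, 0.2, 0.1, 1.0)  # e.g. an instance color
# Broadcast only the red channel, to drive a scalar attribute:
mask = reorder_channels(base, "rrr")
```

Routing a single channel this way (e.g. "rrr") lets an attribute read just one component of an instance color, while the color itself is never modified.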
Finally, we are releasing Clarisse 3.0, codenamed Pegasus! You can already download the latest PLE and play with the updated content. So what can I say about this new version? Simply that we’re all very proud of it! Yes, we’re very proud because we wrote a new path tracer from the ground up. We’re also very proud because we have greatly improved look development in Clarisse through the new Material Editor, OSL support and advanced baking tools. We also completely revamped the manipulators, adding snapping and Alembic export to simplify layout. Scripting is now made simple thanks to the Python command output in the Clarisse log. Just check out what’s new since 2.0, and remember that these release notes don’t even include bug fixes! http://forum.isotropix.com/viewtopic.php?f=103&t=2293
A bit of background on this release
We started working on the Pegasus core about 23 months ago. At the time we were just 9 at Isotropix, and we ended up releasing it with a team of over 30 people! We have tripled the company’s size and we keep growing!
3.0 is the result of hard work and close relationships with people from major studios such as DNeg, ILM and WETA, but also smaller ones. The core of 3.0 is a technology that helped create the VFX of many blockbusters, including Star Wars: The Force Awakens, the upcoming Star Trek Beyond and many more. Here is the up-to-date list: http://www.isotropix.com/company/feature-films
Clarisse 3.0 brings a new Physical (physically based) shading model, which consists of a new path tracer renderer and a new set of materials, volumes and lights. This new physically based engine provides state-of-the-art shaders and advanced sampling techniques. All this has been made possible thanks to years of collaboration with Double Negative’s R&D rendering team, and especially the help of Emmanuel Turquin! You’ll see, it is now way simpler (maybe even too simple ;)) to create photo-realistic images with these new shaders in Clarisse.
The quick and dirty test
Old vs New shading model
While the new PBR engine has a higher rendering cost (it is slower for the same number of samples), you get better sampling. This means that you should need fewer samples to get rid of the noise. The other good thing is that Clarisse is now finally able to render caustics (both from specular reflection and transmission), as you can see in the previous image.
Please don’t get your hopes up too high: the new path tracer is still unidirectional. This means that, while Clarisse resolves refractive caustics, they require a large number of samples to become noise-free.
Getting started with PBR
When you start Clarisse, you will be prompted for the default shading model you wish to use. Make sure to select Physical. That way, the default scene will be created with the path tracer and a Physical Distant light.
What are the big changes?
There are two dramatic changes compared to what you were used to in the legacy shading model.
First, attributes controlling sampling are now expressed in samples per pixel (spp). This means that you set the actual number of samples you wish to use. In the legacy model, the actual number of samples fired was the square of the input value: a value of 8 meant 8×8 (64) samples, whereas 8 spp now means that 8 samples are actually fired.
This is true for any attribute controlling samples including antialiasing.
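Since every sampling attribute follows this convention, matching an old scene's quality is just a matter of squaring. A small Python sketch of the relationship (the function name is made up for illustration):

```python
def legacy_to_spp(legacy_value):
    """The legacy model fired the square of the attribute value,
    so matching quality in the new spp model needs value ** 2."""
    return legacy_value * legacy_value

# A legacy value of 8 fired 8 x 8 = 64 samples, so the equivalent
# setting in the new model is 64 spp, not 8.
equivalent_spp = legacy_to_spp(8)
```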
Secondly, the new shading model no longer provides a GI light: GI is now on by default. The quality of GI and of all secondary rays (including reflection and transmission) is set by the Material Sample Count attribute found on the path tracer. Clarisse performs MIS and automatically distributes this number of samples (think of it as a sampling budget) across the different BxDFs according to the material. While that works pretty well, you may sometimes want more samples for, say, glossy reflections. In that case, you can still oversample specific channels using the per-ray-type multipliers.
Moreover, while sampling is now controlled directly at the path tracer level, you can still override any of these values at the material level for finer control!
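How the path tracer actually splits its budget is internal to Clarisse, but the concept of a per-material sampling budget weighted across BxDFs, with per-ray-type multipliers applied on top, can be sketched like this (the lobe weights, the proportional split and every name here are assumptions of this sketch, not Clarisse internals):

```python
def distribute_budget(material_sample_count, lobe_weights, multipliers=None):
    """Illustrative only: split a per-material sample budget across
    BxDF lobes proportionally to their weights, then apply optional
    per-ray-type multipliers (e.g. to oversample glossy reflection)."""
    multipliers = multipliers or {}
    total = sum(lobe_weights.values())
    budget = {}
    for lobe, weight in lobe_weights.items():
        samples = material_sample_count * weight / total
        budget[lobe] = max(1, round(samples * multipliers.get(lobe, 1.0)))
    return budget

# A budget of 16 samples on a mostly diffuse material,
# with glossy reflection oversampled 2x:
budget = distribute_budget(16, {"diffuse": 0.7, "glossy": 0.3},
                           {"glossy": 2.0})
```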
A quick word about lights
There are now two kinds of lights: infinite lights and area lights. Infinite lights are lights that don’t rely on a physical geometry, such as environment and distant lights. Area lights, on the contrary, rely on an internal geometry such as a (finite) plane, sphere, spot, etc. You will also note that area lights now properly take arbitrary scaling into account. The good news is that this doesn’t affect overall sampling quality.
PBR feature minitour video
Finally, please forgive us, but we didn’t have time to complete the new PBR documentation for the 3.0 RC1 release. Fortunately, Yann prepared a small video to guide you through the basics of this new PBR engine.
Clarisse 3.0 (Pegasus) introduces a new point_uv_sampler geometry that samples the UV values of its geometry support in UV space, which makes it much faster and far more efficient than the point_cloud, which performs this evaluation in 3D space.
point_uv_sampler is not meant to replace the point_cloud.
As already mentioned, the point_uv_sampler samples the colors of its geometry support in 2D UV space. As a consequence, it can’t take any scene object other than a single mesh as its geometry support, unlike the point_cloud, which can use a combiner.
This is the flip side of the 2D-space evaluation, but the upside is that the point_uv_sampler is much faster at evaluating its density according to UV colors.
Unless you really need a combiner, or any scene object other than a single mesh, as the geometry support, the point_uv_sampler brings a significant performance gain.
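The gain comes from where the evaluation happens: candidate points can be drawn and tested directly in 2D, with no 3D-space lookup. A minimal Python sketch of density-driven scattering in UV space via rejection sampling (this illustrates the idea only; `density_fn` stands in for the UV colors of the geometry support and nothing here is Clarisse's API):

```python
import random

def scatter_in_uv(density_fn, count, seed=0):
    """Draw candidate points in 2D UV space and keep each one with a
    probability given by a density function in [0, 1], so denser UV
    regions receive more points. Purely 2D: no 3D evaluation needed."""
    rng = random.Random(seed)
    points = []
    while len(points) < count:
        u, v = rng.random(), rng.random()
        if rng.random() < density_fn(u, v):
            points.append((u, v))
    return points

# Example: density grows toward the right half of UV space.
pts = scatter_in_uv(lambda u, v: u, 100)
```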
When you need to light a scene with light-emitting objects, the worst-case scenario is a VDB object: with global illumination your scene will work, but the effect will be very expensive to render.
One of our clients needed to render cannon fire effects and thus illuminate the scene with each explosion flash. They found a trick to replace the global illumination effect by area lights dynamically textured by renders of the VDB sequences, and they dramatically reduced the render time with similar results.
In this video, I’ll show you this technique, based on Clarisse’s ability to use a live render as a texture. You’ll see how a small render of a VDB explosion can be used to texture an area light. Thanks to this tip, you’ll get results similar to a GI light, with faster render times!
The same technique can be used for a lot of tricky lighting scenarios, like indirect interior lighting.
Here is a reconstruction of the setup, using a simplified geometry.