In my last post, I put up a link to a rough render of the last shot of Snowmen and said that it needed effects. Specifically, I wanted multiple particle systems for the laser blasts, dynamic paint on the ground where the lasers hit the snow, and smoke simulations rising from the impact points. Along the way, I ran into some major challenges with rendering and compositing.
The first challenge was that halo particles always render behind faces that use z transparency. So, for example, when the snowman blasts the camera directly, the laser halos simply disappear behind the zTransp sky in the background. Breaking the lasers out into a separate render pass and carefully recompositing them fixed that.
The next gripe was that I could not get dynamic paint to render at all. The object it was supposed to paint on had previously been vertex painted, and the two were not playing nicely together. It took some serious workarounds to finally make it happen.
Finally, there were issues with the smoke. The renders were looking okay until I added some defocusing into the mix. The cube object serving as the smoke domain was writing its own z values within its volume, which corrupts the depth information the defocus effect depends on and blows up the results. Even putting the domain on its own render layer and disabling it in the main one didn't solve the problem. In the end, I had to selectively link objects into a new scene, remove the smoke domain from the main scene, and then combine renders from the multiple scenes in the compositor.
While this was frustrating (it cost me something like two days of work), I'm actually glad it happened. The process of discovering these problems and experimenting my way to solutions will make a great case study for the rendering and compositing chapter of Blender Productions, and it provides some good instructional material for the simulations chapter as well. As always, my goal in things like this isn't necessarily to teach how to solve any specific problem, but how to approach complex problems in general.