I can’t seem to access the info the lights are using to light the pyro sims and output that into something renderable… Anyone got some pointers?
I’m gonna repeat what people told me when I asked the same thing: if you’re rendering a volume without a surface, how do you expect to get surface normals?
In any case, there are a couple of different methods of approximating the normals of a volume, most of which involve modifying the shader you’re using.
I usually go for the brute-force method of just creating three RGB lights and then tweaking them in comp.
To that end, here’s a link to a webpage someone (Caleb, maybe?) posted a couple of months ago that has a pretty good breakdown of creating a normal map from a three-light setup:
Hey,
Create area lights with no attenuation in a cube around your volume. Label them front, right, left, top, bottom and back.
Your front light should be on the same side as the camera. I would parent the lights to a null and have the null look at the camera so the front stays updated.
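If you want to script that placement, here’s a rough Houdini Python sketch, assuming a volume centred at the origin; the 10-unit distance and the node names are placeholders, and setting each light to an area (grid) type, turning off attenuation, and adding the look-at-camera constraint on the null are left as manual steps.

```python
import hou

obj = hou.node('/obj')
rig = obj.createNode('null', 'volume_light_rig')

# name -> (translate, rotate) so each light points back at the origin
placements = {
    'front':  ((0, 0,  10), (  0,   0, 0)),
    'back':   ((0, 0, -10), (  0, 180, 0)),
    'right':  (( 10, 0, 0), (  0,  90, 0)),
    'left':   ((-10, 0, 0), (  0, -90, 0)),
    'top':    ((0,  10, 0), (-90,   0, 0)),
    'bottom': ((0, -10, 0), ( 90,   0, 0)),
}

for name, (t, r) in placements.items():
    light = obj.createNode('hlight', 'light_' + name)
    light.setFirstInput(rig)          # parent the light to the rig null
    light.parmTuple('t').set(t)
    light.parmTuple('r').set(r)
```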
Render everything out to an .exr, exporting each light as its own pass.
In the comp context, load the .exr files. Create a Channel Copy and map each light pass to a channel to build the normal: right light into red, top into green, and front into blue.
You now have the forward normal.
You also have the information from the other sides to create a negative normal, if you’re inclined / have the budget to build a volumetric shader and sample at least three textures.
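If you’d rather do the channel copy outside of COPs, here’s a minimal numpy sketch of that packing step; loading the EXR passes is left to whatever reader you have, and the dummy arrays below are only there to show the shapes.

```python
import numpy as np

def pack_forward_normal(right, top, front):
    """Channel-copy step: right pass -> R, top pass -> G, front pass -> B.

    Each argument is a single-channel float array (one rendered light pass),
    already in linear space.
    """
    return np.stack([right, top, front], axis=-1)

# Dummy 4x4 passes, standing in for the loaded EXR light passes
r = np.full((4, 4), 0.6)
t = np.full((4, 4), 0.5)
f = np.full((4, 4), 0.9)
print(pack_forward_normal(r, t, f).shape)   # (4, 4, 3)
```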
Hope that helps
If you are making anything that is emissive (fire/explosions), you want to render your temperature map separately.
Using the light method, I would highly recommend using distant lights and not area lights. You want five lights: three positive and two negative. It is easiest to make sure the values go from 0 to 0.5 on all renders, then add a blanket 0.5 to your positives before subtracting your negative renders in Photoshop. Also be very careful with Houdini, because it has lots of ways it tries to force you into sRGB, and you need your normals to be in a linear colorspace.
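A small numpy sketch of that arithmetic, assuming the five passes are already loaded as single-channel float arrays in linear space and scaled to peak around 0.5, and assuming left and bottom are the two negative lights:

```python
import numpy as np

def pack_signed_normal(right, top, front, left, bottom):
    """Blanket +0.5 on the positive passes, then subtract the negatives.

    Each channel ends up centred around 0.5, like a regular normal map:
    X = right - left, Y = top - bottom, Z = front (there is no back pass).
    """
    r = np.clip(0.5 + right - left,   0.0, 1.0)
    g = np.clip(0.5 + top   - bottom, 0.0, 1.0)
    b = np.clip(0.5 + front,          0.0, 1.0)
    return np.stack([r, g, b], axis=-1)
```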
If you’re lucky enough to be using Houdini 16, there is a texture sheet render node that makes this all pretty easy. It cranks out a normal and a color. The one that shipped with it doesn’t quite work right, so check out this thread, which has an updated ROP and a video on how to use it.
https://www.sidefx.com/forum/topic/48708/
I discovered this one still has some bugs. It will crank it out in sRGB, so make sure you convert to linear in other software before importing it (and before inverting the green channel if you’re using UE4). By default it also writes the blue channel as a solid 1. I found a way to fix it on my end: just unlock the node for editing and copy what I did in the image below.
http://i.imgur.com/LM5i1z4.jpg
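In case you’d rather script those fixes than rely on another package, here’s a quick numpy sketch, assuming the sheet is loaded as a float RGB array in the 0–1 range; the green flip for UE4 is applied after the linear conversion, as described above.

```python
import numpy as np

def srgb_to_linear(c):
    """Inverse of the standard piecewise sRGB transfer curve (per channel)."""
    c = np.asarray(c, dtype=np.float64)
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def fix_normal_sheet(img, flip_green_for_ue4=True):
    """Convert an sRGB-baked normal sheet back to linear, then flip green for UE4."""
    img = srgb_to_linear(img)
    if flip_green_for_ue4:
        img[..., 1] = 1.0 - img[..., 1]
    return img
```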
Hope this is helpful
I forgot to mention a general workflow thing: it’s best to check your light setup by doing this first with a solid sphere. Sample all your values on the final image to make sure it’s correct. It can be hard to tell with a volumetric render whether the values are right, but with a perfect sphere it’s easy to know what the sampled values are supposed to be. Once it’s correct, export that setup and just import the same setup every time you render. This is why I recommend using distant lights; they are scale/transform agnostic.
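For reference, a tiny Python sketch of what a correctly packed normal map of a sphere should sample to at a few obvious points, assuming the usual 0.5 * N + 0.5 encoding and the camera looking down -Z:

```python
import numpy as np

def encode(n):
    """Remap a unit normal from [-1, 1] to the [0, 1] range of a normal map."""
    n = np.asarray(n, dtype=float)
    return 0.5 * (n / np.linalg.norm(n)) + 0.5

print(encode([0, 0, 1]))   # centre of the sphere (facing camera) -> [0.5, 0.5, 1.0]
print(encode([1, 0, 0]))   # right edge -> [1.0, 0.5, 0.5]
print(encode([0, 1, 0]))   # top edge   -> [0.5, 1.0, 0.5]
```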
Thanks for the many answers and workflows, guys, really appreciated! However, as I tried some of your approaches and looked into things a bit deeper, I ended up going with a no-light solution.
Turns out, you can output normals directly with a simple shader and a checkbox. I’ll take some screenshots tonight after work.
Cheers!
Any chance we could get those screenshots?