The goal I am trying to achieve is to create a 2D light shaft shader that can be applied to particle sprites for pseudo-volumetric glows. This is the same effect many games used to simulate sun light shafts and the like (before everything went voxel based). I want it to be as simple and efficient as possible. I don't want to have to set up a render target camera or any extra entities in the world for it to work; ideally it would be a simple shader you apply to the sprite. Chapter 13. Volumetric Light Scattering as a Post-Process | NVIDIA Developer
Most engines do this process in screen space. However, since I want this applied to a particle sprite, it needs to sample the depth of the scene and compare it to the depth of the particle. My idea was to create a material that samples the scene depth and the pixel depth and does an "if" statement to output black and white (black where geometry is in front, white where it is behind). I would then save that material to a render target, which would then be used for the radial blur and composited into the scene.
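For what it's worth, the depth comparison itself is only a couple of instructions. Here is a minimal sketch of that mask as a Custom node body; the input names (`UV` from a ScreenPosition node, `SpriteDepth` from a PixelDepth node) and the SceneDepth lookup index are my assumptions, not taken from your setup:

```hlsl
// Hypothetical black/white depth mask in a Custom node.
// UV:          screen-space UV (ScreenPosition node)
// SpriteDepth: depth of this pixel on the sprite (PixelDepth node)
// Requires a SceneTexture:SceneDepth node to be present in the material so
// the translator makes SceneTextureLookup available (index 1 = SceneDepth).
float sceneDepth = SceneTextureLookup(UV, 1, false).r;

// White where the scene is behind the sprite, black where geometry occludes it.
return step(SpriteDepth, sceneDepth);
```

In the node graph this is just SceneDepth compared against PixelDepth feeding an If node, which matches what you described.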
Unfortunately this isn't working. For some reason the render target isn't accepting the data; it seems to be missing the depth info. This is not the only problem, either: if the RT is created from geometry that is in the world, then that geometry needs to exist somewhere in the world at all times. The final material on the particle sprite is not the material that creates the RT, but a new material that reads and blurs the render target. I experimented with having two sprite renderers in my Niagara system (one for the RT material and one for the final effect), but no luck.
Does anyone know of a way I can achieve this result? I essentially need to make a black and white mask of the scene depth, with the separation determined by the depth of the sprite, and have that added into a render target. I did get some results using a Scene Capture 2D camera, but it requires that extra entity in the world and some extra setup. (Is there a way to set the current/local camera to that Scene Capture 2D camera?) Another thing to consider is what to do if more than one of these is on screen…
Here are some pics…
This is the black and white mask. The material is currently applied to a Niagara particle (for visualization). I want a result like this in the RT.
I have been able to get a radial blur on my black and white mask and composite it into the scene, without the need for a render target. The setup is still quite rough, though, as the entire thing is done in the material: it has 40 samples, and each sample is a separate block of nodes. As you can imagine, tweaking the values is a huge hassle, so I would like to get it into HLSL eventually. There are also a few strange things, like the light shafts bending near the edge of the sprite; I'm still not sure about that, but I suspect it's my sample falloff/distance settings. It's probably not the cheapest thing either. I will look for ways to optimise it in the future.
If anyone is good with HLSL, maybe you can lend your expertise? I have a block of HLSL that does everything my massive node setup does, all in a single Custom node. However, the code is written to accept a texture sampler input and won't work with my black and white mask input. Would someone be able to modify the code to accept inputs from other material nodes? I am clueless when it comes to programming (slowly learning, though). Here is the code…
float3 blur = 0;
// Direction from the current pixel toward the blur center
const float2 blurVector = -normalize(UV - center);
float sum = 0;
for (int i = 0; i < sample_count; i++)
{
    const float t = i / float(sample_count);   // normalized sample position (0..1)
    const float offset = i * dist;             // distance along the blur direction
    const float w = exp(-t * falloff);         // exponential falloff weight
    blur += Texture2DSample(Tex, TexSampler, UV + blurVector * offset).rgb * w;
    sum += w;
}
return blur * rcp(sum);                        // normalize by the total weight
There is also an issue with the samples wrapping around the UVs if you use too much space between samples (when trying to make the blur really long). If there is a way to set it to clamp instead of wrap, that would be awesome!
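Since the code fix never got posted in this thread, here is a hedged sketch of how both requests could be handled: clamping the sample UVs so long blurs stop wrapping, and rebuilding the depth mask inside the loop instead of taking it as a texture input (so the Custom node no longer needs a sampler at all). The input names and the SceneDepth lookup index are assumptions:

```hlsl
// Variant of the radial blur that builds the black/white mask per sample
// instead of sampling a pre-rendered texture.
// UV, center, sample_count, dist, falloff, SpriteDepth are Custom node inputs.
// Requires a SceneTexture:SceneDepth node in the material (index 1 = SceneDepth).
float blur = 0;
float sum = 0;
const float2 blurVector = -normalize(UV - center);
for (int i = 0; i < sample_count; i++)
{
    const float t = i / float(sample_count);
    const float w = exp(-t * falloff);

    // saturate() clamps the UV to 0-1, so long blurs no longer wrap around.
    const float2 sampleUV = saturate(UV + blurVector * i * dist);

    // Inline depth mask: 1 where the scene is behind the sprite, 0 where occluded.
    const float mask = step(SpriteDepth, SceneTextureLookup(sampleUV, 1, false).r);

    blur += mask * w;
    sum += w;
}
return blur * rcp(sum);
```

If you do keep the texture-sampler version, the wrap can also be fixed without code by setting the texture asset's X/Y Tiling Method to Clamp, or by wrapping the UV in `saturate()` before the `Texture2DSample` call.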
Bumping this, as I'm also trying to figure out how to make post-process light shafts in Unreal. So far I've experimented with a radial blur effect using HLSL in a Custom node, but I'm not getting good results (still learning). My problem is that I can get sort of a light shaft effect, but I need the effect to appear static and not change with the camera orientation.
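One common way to anchor the effect to a light rather than to the camera (a sketch of my own, not something confirmed in this thread): compute the light's projected screen position outside the shader each frame, e.g. in Blueprint with the Player Controller's Project World Location To Screen node divided by the viewport size, and push it into the material as a `Center` parameter. The blur code barely changes; only the center input now follows the light:

```hlsl
// Custom node body (sketch). Center is a float2 material parameter in 0-1
// viewport UVs, updated each frame from Blueprint (Project World Location To
// Screen / viewport size). Because Center tracks the light's projection, the
// shaft direction stays pinned to the light as the camera turns.
float3 blur = 0;
float sum = 0;
const float2 blurVector = normalize(Center - UV);
for (int i = 0; i < Samples; i++)
{
    const float w = exp(-Falloff * i / float(Samples));
    blur += Texture2DSample(Tex, TexSampler, saturate(UV + blurVector * i * Dist)).rgb * w;
    sum += w;
}
return blur * rcp(sum);
```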
I would also like to know if there is a way to sample a mask directly in a Custom node and avoid using a Scene Capture 2D to write a texture. I've checked everywhere online and didn't find much information on how to achieve this in Unreal.
I was able to get it all working with the help of a programmer, using a custom material node, but unfortunately I can't post the code here. I did find a lens flare package on the store that also had this same type of light shafts; that could be an option.
If you're simply doing sun shafts, Unreal already has built-in solutions for this. The only reason I was doing it in the material is that I wanted it applied to a particle. Our solution did work, but it ended up being a bit too expensive on consoles, as it was sampling every pixel in the glow and doing a radial blur with 16-18 samples, every frame…
Also, since it was screen space, it would disappear when off screen. If I forced it to render off screen, it would just turn into a glow, because all the geometry used to create the shafts was being culled. If you want off-screen light shafts, such as the vertical ones in your picture, I would look into Unreal's voxel/volumetric fog for your light shafts.