The result is awesome and I was wondering how it was made:
- Is it from a sim? If yes, what software?
- Is it a flipbook of one sim only?
- Or is it a composition of multiple sims into one flipbook?
- How does the tail of the drips fade out?
It’s mostly to understand how the behavior was built so that it ends up looking like this.
I kind of see the logic: droplets appear, then the drip catches them while sliding, and the tail fades out… but that’s as far as I’ve gotten right now O_O
A flipbook would yield low-res results for something like this. Set up a mask for the drop path, then scroll a gradient through it. Use the result to generate normals on the fly and you’re nearly there. Repeat the texture over the window with some random offsets, or even on particles if you have the budget.
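To make that concrete, here’s a rough CPU-side sketch in Python/NumPy of what the material math boils down to. The function names, falloff values, and texture layout are my own assumptions, and in an engine you’d do this per pixel in the shader rather than on arrays:

```python
import numpy as np

def drip_intensity(path_mask, time, speed=0.25, tail=0.3):
    """Scroll a vertical gradient through a static drop-path mask.

    path_mask: 2D array in [0, 1] marking where drips are allowed to run.
    Returns an intensity map: bright at the drop head, fading out behind it.
    """
    h, _ = path_mask.shape
    v = np.linspace(0.0, 1.0, h)[:, None]                 # vertical UV coordinate
    phase = (v + time * speed) % 1.0                       # scrolling gradient
    head_and_tail = np.clip(1.0 - phase / tail, 0.0, 1.0)  # sharp head, fading tail
    return path_mask * head_and_tail

def normals_from_height(height, strength=2.0):
    """Derive normals on the fly from the intensity map, treated as a height field."""
    dy, dx = np.gradient(height * strength)
    n = np.stack([-dx, -dy, np.ones_like(height)], axis=-1)
    return n / np.linalg.norm(n, axis=-1, keepdims=True)
```

Tiling that result across the window with a random UV offset per tile (or per particle) is what breaks up the repetition.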
Yup - @Partikel nailed it. We favored doing as much as we could in the shader whenever possible. This was @Eben’s mastery, and comes up in discussion quite a bit. Eben, would you be interested in sharing more details here?
So I guess that once we have our panning material set up, we can create 2-3 meshes with random flow so we’d have multiple “path shapes”, and we could change the time at which the panning starts to get a lot of different results?!
Yep, that’d work. There are tons of ways to do it.
I’ve used stretched particles attached to the camera for this sort of thing in the past for screen effects. It cost me some CPU time, but it reduced the overdraw and I got a whole bunch of randomization options for free. It all comes down to what your needs are.
Maybe some fancy RT (render target) material: one pass to apply gravity and a second pass to light it. You can use one render target for the whole scene and just add an offset to the UVs on every window (only if you’ve got a tileable RT). That would do it.
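That ping-pong render-target idea could look something like this on the CPU (a minimal sketch with made-up parameters, not actual engine code): pass one pushes the droplet buffer downward under “gravity”, pass two derives normals and does a simple N·L shade. Per-window UV offsets would just be a shift applied when sampling the shared, tiling RT.

```python
import numpy as np

def gravity_pass(buf, fall_pixels=2, fade=0.96):
    """Pass 1: shift the droplet buffer downward and fade it (fake gravity)."""
    out = np.zeros_like(buf)
    out[fall_pixels:, :] = buf[:-fall_pixels, :] * fade
    return out

def lighting_pass(buf, light_dir=(0.4, 0.7, 0.6)):
    """Pass 2: derive normals from the buffer and do a simple N.L shade."""
    dy, dx = np.gradient(buf)
    n = np.stack([-dx, -dy, np.ones_like(buf)], axis=-1)
    n /= np.linalg.norm(n, axis=-1, keepdims=True)
    l = np.asarray(light_dir, dtype=float)
    l /= np.linalg.norm(l)
    return np.clip(n @ l, 0.0, 1.0)

# Ping-pong usage: seed some droplets, then step the two passes each frame.
rng = np.random.default_rng(0)
buf = (rng.random((128, 128)) > 0.995).astype(float)
for _ in range(30):
    buf = gravity_pass(buf)
lit = lighting_pass(buf)
```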
@Partikel has it right. We didn’t generate the normals; we did a texture sample instead, but you could generate them. And to get variation I just used a UV distortion map. Didn’t use meshes, @Alexandre_GM.
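In case it helps anyone reading later, the UV-distortion trick is basically: sample a low-frequency distortion texture and nudge the drip UVs with it before sampling the drip/normal texture. A tiny sketch under my own assumptions (two-channel distortion map, strength value picked arbitrarily):

```python
import numpy as np

def distorted_uv(u, v, distort_tex, strength=0.05):
    """Offset drip UVs with a tiling distortion map so repeated tiles don't match up."""
    h, w = distort_tex.shape[:2]
    # Nearest-neighbour lookup keeps the sketch short; a real material sample is filtered.
    x = ((np.asarray(u) % 1.0) * (w - 1)).astype(int)
    y = ((np.asarray(v) % 1.0) * (h - 1)).astype(int)
    du = distort_tex[y, x, 0] - 0.5
    dv = distort_tex[y, x, 1] - 0.5
    return (u + du * strength) % 1.0, (v + dv * strength) % 1.0
```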
It’s the same idea as Z-aligned snow, but instead of just using a z-gradient, we also use the vertex normals to mask based on the object’s surface slope, and use that to blend between the world-aligned pattern and the streaking.
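A minimal sketch of that slope mask, assuming Z-up and thresholds I made up (the real material presumably exposes these as parameters):

```python
import numpy as np

def slope_blend(world_normal, pattern, streak, slope_start=0.4, slope_end=0.8):
    """Blend a world-aligned pattern (flat, upward-facing surfaces) with streaking
    (steep surfaces) based on how much the vertex normal points up."""
    up = np.array([0.0, 0.0, 1.0])
    n = np.asarray(world_normal, dtype=float)
    n = n / np.linalg.norm(n)
    flatness = np.clip((n @ up - slope_start) / (slope_end - slope_start), 0.0, 1.0)
    return streak * (1.0 - flatness) + pattern * flatness
```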
This video of the Driveclub rain simulation was posted in the VFX Facebook group.
Is there someone on here who worked on it? How was it done? Is it completely PhysX-simulated?
It’s not PhysX, but it is a simulation. Fun fact, every car had to be set up by hand because of the different wiper placements, speeds, windscreen shapes and so on.
I know this is a much older thread, but I actually did some tests on a system like this. My solution was to do the calculations in 2D elsewhere, render them to a texture, and apply it to the window. Planning on expanding on this in the future to support velocities, etc.
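For anyone curious what “calculate in 2D, then render to a texture” can look like, here’s a minimal toy sketch of my own (not the actual system): a handful of droplets integrated in UV space and splatted into an array each frame, which you’d then upload as the window texture.

```python
import numpy as np

class DropletSim2D:
    """Toy 2D droplet sim: integrate positions in UV space, splat them to a texture."""
    def __init__(self, count=64, size=128, seed=0):
        rng = np.random.default_rng(seed)
        self.size = size
        self.pos = rng.random((count, 2))              # (u, v) in [0, 1)
        self.vel = np.zeros((count, 2))
        self.mass = rng.uniform(0.5, 1.5, count)

    def step(self, dt=1 / 60, gravity=0.3, drag=0.5):
        self.vel[:, 1] += gravity * self.mass * dt     # heavier drops fall faster
        self.vel *= (1.0 - drag * dt)
        self.pos = (self.pos + self.vel * dt) % 1.0    # wrap so drops keep coming

    def to_texture(self, fade=0.9, tex=None):
        tex = np.zeros((self.size, self.size)) if tex is None else tex * fade
        x = (self.pos[:, 0] * (self.size - 1)).astype(int)
        y = (self.pos[:, 1] * (self.size - 1)).astype(int)
        tex[y, x] = 1.0                                # splat; the fade leaves a trail
        return tex

# Usage: step the sim and accumulate into the texture each frame.
sim, tex = DropletSim2D(), None
for _ in range(120):
    sim.step()
    tex = sim.to_texture(tex=tex)
```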