Procedural normals for threshold animations

The technique in that talk reminds me of the one Guerrilla used in Killzone 3 for the stealth effect, though the idea of using a normalized 2D vector is great. It means you just need a single RGB texture to do everything (plus the lookup texture for the edges, and the screen texture for the fresnel stuff).

I used that technique for MNC’s fodder taking damage over time, to add scratches and pock marks to their surface. Super useful. Doing it with texture derivatives or multiple offset samples makes more sense today since you’ve got a ton of ALU to spare, whereas the PS3 was more restricted, so a texture made more sense for The Last of Us.

True, now that I’m reading it again it’s starting to make sense. Thanks for pointing it out, really helped me figure it out :slight_smile: I’ll try some stuff out and see if I can tweak this lil tool of mine!

@Partikel your method seems to work fine too, with a slightly different look. Eben’s normals have this film quality to them, like pizza dough with thick borders. But like Eben mentioned, they’re on screen for such a short amount of time that timing and shape are so crucial that the shading could be just okay and it’d still be great.

Remember, DDX and DDY are always around

Hmm, I need to try that and see if the result is better.

Any idea of the cost? Instinctively it feels more expensive to get the derivative than a subtract. Though it might be safer, as you don’t risk an extra texture fetch if you go too far from the original pixel…
Time, I need more of it.

They’re always precalculated even if you don’t use them, since that’s what the hardware uses to decide which mip to choose. Can get a bit aliasy though.

Yeah, people are always scared of derivatives, but they are nearly free. Certainly cheaper than resampling the texture 2 or 3 more times. Like @mattorialist mentioned, it’s using data that’s already been calculated; a ddx or ddy is effectively just the cost of a single subtraction, shared between 4 pixels. Even fwidth is just two subtractions and an add, again shared between 4 pixels.
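For the threshold-animation case this thread started with, the derivative trick might look something like this. A minimal HLSL sketch; the texture, sampler, and parameter names are all made up for illustration:

```hlsl
// Soften a dissolve/threshold edge with derivatives instead of
// extra texture samples. NoiseTex, LinearSampler, Threshold are
// assumed names, not anyone's actual setup.
float noise = NoiseTex.Sample(LinearSampler, uv).r;

// fwidth(x) == abs(ddx(x)) + abs(ddy(x)): roughly how much the
// noise value changes across one pixel. Two subtracts and an add,
// shared between the 4 pixels of a quad.
float w = fwidth(noise);

// Antialiased threshold: the hard-edged version would just be
// step(Threshold, noise).
float mask = smoothstep(Threshold - w, Threshold + w, noise);
```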

To reduce aliasing from derivatives you can try ddx_fine instead. Since ddx is calculated per 2x2 pixel group, the value ddx returns for all 4 of those pixels is the same. With ddx_fine it’s calculated per 2x1 pixel group. There’s still a chance for aliasing, but it can help a lot in certain situations.
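The coarse/fine distinction in HLSL terms (Shader Model 5+), as a quick sketch:

```hlsl
// ddx / ddx_coarse: one value shared by the whole 2x2 quad (cheap,
// can alias). ddx_fine: computed per 2x1 pixel pair, so it varies
// within the quad and tends to look smoother.
float coarse = ddx(value);       // same result for all 4 quad pixels
float fine   = ddx_fine(value);  // differs per horizontal pixel pair

// A "fine" equivalent of fwidth, built from the fine derivatives:
float width = abs(ddx_fine(value)) + abs(ddy_fine(value));
```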

A bit off topic @Bruno, but do you have any tutes you could point me to for learning Substance as it pertains to vfx stuff? I don’t suppose you’d ever do a quick write-up or video yourself on a quickstart guide to Substance for fx work or texture generation for vfx?

I picked up Substance when it was on sale over the holidays, but I’m finding it a bit daunting and I can’t seem to find any learning material that isn’t broad, general-purpose texturing teaching. I’ve heard people rave about its procedural texture generation as much as they do about Houdini for sims, but I don’t know what nodes to use or which ones are sort of “everyday use” like I do with Photoshop’s filters.

I even had an outline for a tutorial written down somewhere but you know, busy life gets in the way. Good to know that there’s interest :slight_smile:

Nodes that I use a LOT are shapes, gradients (I even made my own custom one: https://share.allegorithmic.com/libraries/2401), blend, histogram scan, safe transform (I even have mouse shortcuts for those), and all the sorts of blur. The Pixel Processor is pretty much a pixel shader; sometimes I quickly prototype ideas in Substance before carrying them over to my projects.

The FX-Map node is good for creating custom noises and patterns, but its internals are very confusing at first, so maybe leave that for later. Sometimes you can get away with using a splatter or tile generator with random positions. There’s a good enough library of noises and patterns.

I could take a moment during the weekend, grab a few of my files, clean them up and send them to you so you can pick them apart, if that sounds good!

That’d be awesome! All the stuff I’ve found is like “here’s a 2 hour tutorial on texturing a spaceship”

Alright, regarding the normal generation, I think I might be onto something. Did a lil prototype inside substance and will try it out in unreal tomorrow:

Ok, I got something. Beware 7mb GIF:
https://dl.dropboxusercontent.com/u/10717062/Polycount/FX_Liquid_01.gif

Shader network - It’s an unlit translucent material, everything is fake, physically incorrect

All the files are in a lil pack here:

Wow! Nicely done.
You are ninja.

Legendary! Thanks for all the help everyone! :smiley:

Hi Matt, how do you do it with ddx and ddy? I’ve never played with that before. I tried appending them together, and while it looks convincing in some cases, if you drive the intensity up you start to notice that the normals are pointing in the wrong direction. It did get super aliasy as you mentioned, so I had to fade them over distance.
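For reference, a common way to build a tangent-space normal from a scalar value with derivatives looks roughly like this. A sketch only, with assumed names (HeightTex, Intensity); the sign convention here is often what makes normals face the wrong way:

```hlsl
// Turn a height/mask value into a tangent-space normal using
// screen-space derivatives. Flipping the signs on the ddx/ddy
// terms (or the green channel) is how you match your engine's
// normal-map convention. Intensity scales the apparent slope.
float h = HeightTex.Sample(LinearSampler, uv).r;

float3 n = normalize(float3(-ddx(h) * Intensity,
                            -ddy(h) * Intensity,
                            1.0));
```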

Dude I didn’t know about fine! That’s awesome!

@Bruno Any mipping issue from what you could see?

ddx(variable)

Variable can be anything

As long as it’s a float!

Well, that’s not entirely true, but it’s rare that derivatives will give you anything useful when used with integers or bools, since the derivative will be quantized too, and feeding in a uniform or constant may return junk data in some cases.

Hey Bruno,

This is incredible. I was wondering if you still had your files available? I would love to have a peek if possible.

Still available, but the link was broken since Dropbox eliminated the public folder. Updated the post with the correct link!
