The technique in that talk reminds me of the one Guerrilla used in Killzone 3 for the stealth effect, though the idea of using a normalized 2D vector is great. It means you just need a single RGB texture to do everything (plus the lookup texture for the edges, and the screen texture for the fresnel stuff).
I used that technique for MNC’s fodder taking damage over time, to add scratches and pock marks to their surface. Super useful. Doing it with texture derivatives or multiple offset samples makes a bit more sense today, as you’ve got a ton of ALU to spare, whereas the PS3 was a bit more restricted, so a texture made more sense for The Last of Us.
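For anyone following along, the core of that family of effects is a threshold test against a grayscale "progress" texture, with a narrow band near the threshold picked out for the edge treatment. Here's a minimal CPU sketch of that idea; the function and parameter names (`progress`, `t`, `band`) are my own for illustration, not anyone's actual shader:

```python
# Threshold-based erosion sketch: a per-pixel grayscale value is compared
# against an animated threshold, and pixels inside a small band just below
# the threshold are flagged as the "edge" (where you'd apply the edge color
# or lookup texture). In a real shader the band width is where derivatives
# or extra samples come in.
def erode(progress, t, band=0.05):
    """progress: 0..1 per-pixel mask value; t: animation time 0..1."""
    if progress < t - band:
        return "gone"   # fully eroded / dissolved away
    elif progress < t:
        return "edge"   # narrow edge band at the erosion front
    return "solid"      # untouched surface
```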
True, now that I’m reading it again it’s starting to make sense. Thanks for pointing it out, it really helped me figure it out. I’ll try some stuff out and see if I can tweak this lil tool of mine!
@Partikel your method seems to work fine too, just a slightly different look. Eben’s normals have this film-like quality to them, like pizza dough with thick borders. But as Eben mentioned, they’re on screen for such a short amount of time that I believe timing and shape are so crucial that the shading could be just okay and it’d still look great.
Hmm, I need to try that and see if the result is better.
Any idea of the cost? Instinctively it feels more expensive to get the derivative than to do a subtract. Though it might be safer, as you don’t risk an extra texture fetch if you go too far from the original pixel…
Time, I need more of it.
They are always pre-calculated even if you don’t use them, since that’s what the hardware uses to decide which mipmap to choose. Can get a bit aliasy though.
Yeah, people are always scared of derivatives, but they are nearly free. Certainly cheaper than resampling the texture 2 or 3 more times. Like @mattorialist mentioned, it’s using data that’s already been calculated; a ddx or ddy is effectively just the cost of a single subtraction, shared between 4 pixels. Even fwidth is just two subtractions and an add, again shared between 4 pixels.
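To make the "shared between 4 pixels" point concrete, here's a CPU emulation of what the hardware does for coarse quad derivatives. The function name and argument layout are my own; the arithmetic matches the standard coarse ddx/ddy/fwidth definitions:

```python
# Coarse derivatives for one 2x2 pixel quad: every pixel in the quad gets
# the same ddx and ddy, and each is literally one subtraction. fwidth adds
# two abs() and one add on top of that.
def quad_derivatives(v00, v10, v01, v11):
    """v00..v11: shaded values at the quad's pixels (x right, y down)."""
    ddx = v10 - v00                # one subtraction, shared by the quad
    ddy = v01 - v00                # one subtraction, shared by the quad
    fwidth = abs(ddx) + abs(ddy)   # fwidth = |ddx| + |ddy|
    return ddx, ddy, fwidth
```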
To reduce aliasing from using derivatives, you can try using ddx_fine instead. Since ddx is calculated for each 2x2 pixel group, the value that ddx returns for all 4 of those pixels is the same. With ddx_fine it is calculated for each 2x1 pixel group. There’s still a chance of aliasing, but it can help a lot in certain situations.
A bit off topic @Bruno, but do you have any tutes you could point me to for learning Substance as it pertains to vfx stuff? I don’t suppose you’d ever do a quick write up or video yourself on a quickstart guide to Substance for fx work or texture generation for vfx?
I picked up Substance when it was on sale over the holidays, but I’m finding it a bit daunting and I can’t seem to find any learning material that isn’t broad, general-purpose texturing instruction. I’ve heard people rave about its procedural texture generation as much as they do about Houdini for sims, but I don’t know which nodes to use or which ones are sort of “every day use” like I do with Photoshop’s filters.
I even had an outline for a tutorial written down somewhere but you know, busy life gets in the way. Good to know that there’s interest
Nodes that I use a LOT are shapes, gradients (even made my own custom one: https://share.allegorithmic.com/libraries/2401), blend, histogram scan, safe transform (I even have mouse shortcuts for those), and all sorts of blurs. The Pixel Processor is pretty much a pixel shader; sometimes I quickly prototype ideas in Substance before carrying them over to my projects.
The FX-Map node is good for creating custom noises and patterns, but its internals are very confusing at first, so maybe leave that for later. Sometimes you can get away with using a splatter or tile generator with random position. There’s a good enough library of noises and patterns.
I could take a moment during the weekend, grab a few files of mine, clean them up, and send them to you so you can pick them apart, if that sounds good!
Alright, regarding the normal generation, I think I might be onto something. Did a lil prototype inside substance and will try it out in unreal tomorrow:
Hi Matt, how do you do it with ddx and ddy? I’ve never played with that before. I tried appending them together, and while it looks convincing in some cases, if you drive the intensity up you start to notice that the normals are pointing in the wrong direction. It did get super aliasy as you mentioned, so I had to fade them over distance.
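For reference, the usual way to turn height derivatives into a normal (not necessarily Matt's exact setup) is n = normalize(-dh/dx, -dh/dy, 1). A minimal CPU sketch, with `strength` as a hypothetical intensity knob:

```python
import math

# Height-derivative to tangent-space normal. Cranking `strength` scales the
# xy slope relative to the fixed z of 1, which is why a high intensity can
# push normals into directions that read as "wrong".
def height_to_normal(dhdx, dhdy, strength=1.0):
    x, y, z = -dhdx * strength, -dhdy * strength, 1.0
    length = math.sqrt(x * x + y * y + z * z)
    return (x / length, y / length, z / length)
```

Note that just appending ddx and ddy into the normal's xy without the negation and renormalization will flip the apparent lighting direction, which matches the symptom you're describing.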
Well, that’s not strictly true, but it’s rare that derivatives will give you anything useful when used with integers or bools, since the derivative will be quantized too, and feeding in a uniform or constant may return junk data in some cases.