Actually, I think I figured it out. For some reason the full RGB output can't be added to the UV coordinates and plugged in. However, the R channel alone is accepted and will offset the UVs, using Shuriken's start color as a grayscale value.
Here’s an image for anyone else having a similar issue.
**NOTE:** Shader Forge seems to have issues here. You can bypass the Add node and plug the R channel directly into the UV Distance slot if the combination doesn't work (image 2). ShaderForge acts weird sometimes (possibly buggy). There's a rough code version of the idea after the image breakdown below.
–
A - Particle with No Rotation and No Offset
B - Particle with Offset and No Rotation
C - Particle with Rotation but No Offset
D - Particle with Rotation and Offset
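If it helps to see the same trick as code instead of nodes, here's a minimal hand-written particle shader that does the equivalent (just a sketch, not the ShaderForge output; names like _MainTex are the usual Unity placeholders):

```
// Minimal sketch: offset the UVs by the particle's vertex color R channel.
// Shuriken writes Start Color into the vertex color, so a grayscale start
// color gives every particle its own 0-1 offset.
Shader "Sketch/ParticleUVOffsetByVertexR"
{
    Properties { _MainTex ("Texture", 2D) = "white" {} }
    SubShader
    {
        Tags { "Queue"="Transparent" "RenderType"="Transparent" }
        Blend SrcAlpha One   // additive-style, like the effect above
        ZWrite Off
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;

            struct appdata { float4 vertex : POSITION; float2 uv : TEXCOORD0; fixed4 color : COLOR; };
            struct v2f     { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; fixed4 color : COLOR; };

            v2f vert (appdata v)
            {
                v2f o;
                o.pos   = UnityObjectToClipPos(v.vertex);
                o.uv    = v.uv;
                o.color = v.color;   // Shuriken's Start Color lands here
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // R channel only: a gray start color becomes the UV offset
                float2 uv = i.uv + float2(0.0, i.color.r);
                return tex2D(_MainTex, uv);
            }
            ENDCG
        }
    }
}
```

The later snippets in this thread assume the same kind of wrapper and just swap out the frag function.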
Here’s a neat little thing I wanted to share as well. That GDC talk was really cool and inspiring!
The fire is composed of 4 elements: Additive Mode (ShaderForge), Random Rotation, Panning Textures (with Offset), and Dissolve. I love how cheap it is for how complex it looks!
I’ll post a breakdown of my Shader if anyone wants to see that. The smoke and aura are super basic.
Of course! Here’s a Tutorial on what I did. It’s pretty basic but works nicely.
The two textures are a hand-painted fire texture and a Render Cloud texture. The cloud texture pans at about 0.1 speed on the V axis, and the "Intensity" is 12.
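For anyone who'd rather read it than squint at nodes, the panning-plus-intensity part boils down to something like this fragment (a sketch, not the exact graph; _FireTex, _CloudTex, and _Intensity are made-up property names, and it slots into a standard particle shader like the one posted earlier in the thread):

```
sampler2D _FireTex;     // hand-painted fire texture
sampler2D _CloudTex;    // the Render Cloud texture
float     _Intensity;   // ~12 in the set-up above

fixed4 frag (v2f i) : SV_Target
{
    // pan the cloud texture along V at roughly 0.1 units per second
    float2 cloudUV = i.uv + float2(0.0, _Time.y * 0.1);
    float  cloud   = tex2D(_CloudTex, cloudUV).r;

    fixed4 fire = tex2D(_FireTex, i.uv);

    // the Power node: raising the cloud to a high exponent tightens the
    // mask toward its brightest areas, which reads as extra contrast
    float mask = pow(saturate(cloud), _Intensity);

    return fire * mask * i.color;   // vertex color tint/fade from Shuriken
}
```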
I see. I do remember folks saying something about that. It looks like Src Alpha / One Minus Src Alpha is the alpha-blended combination.
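From what I gather, these are the stock ShaderLab Blend lines and what they correspond to (standard Unity stuff, nothing specific to my shader):

```
Blend SrcAlpha OneMinusSrcAlpha   // alpha blended
Blend One One                     // pure additive
Blend SrcAlpha One                // additive, scaled by the texture's alpha
Blend One OneMinusSrcAlpha        // premultiplied; can sit between the two
```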
I personally don’t understand it very well, but I’m learning through trial and error.
I think the brightness is coming from the "Intensity" value in my shader set-up.
It's the exponent for the Power node, so the value being subtracted gets higher, which somehow brightens the emissive. The math doesn't make sense to me, but you're seeing the result.
If you've got a better set-up, it'd be cool to see it! I'd rather know the correct way than rely on dumb luck, haha.
Couldn't use Color over Lifetime with your inputs, because you have vertex.r offsetting custom.x; I'm not sure if you'll like my results.
I limited the Z output to an X curve to offset it… I think it worked.
Diablo/Blizzard takes this further by multiplying in a 2nd and even a 3rd texture to get that hypnotic interference (rough code sketch below).
I set up a 2nd noise texture with its W output going to Y,
then used Random Between Two Curves in the custom data.
In any case, the blend mode is working (but I forgot the alpha*2 in the above image).
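Roughly, the "extra textures multiplying" idea looks like this in fragment code (just a sketch with made-up names, dropped into the same kind of particle shader as earlier in the thread):

```
sampler2D _Noise1;
sampler2D _Noise2;

fixed4 frag (v2f i) : SV_Target
{
    // two noise layers panning at different speeds/offsets
    float n1 = tex2D(_Noise1, i.uv + float2(0.0,  _Time.y * 0.10)).r;
    float n2 = tex2D(_Noise2, i.uv + float2(0.05, _Time.y * 0.23)).r;

    // multiplying them means bright spots only survive where both layers
    // are bright, so the motion never settles into a plain scroll
    float combined = n1 * n2;

    return combined * i.color;
}
```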
Ah, I see. Could you add a "Dissolve" effect to the shader? I'm having issues trying to add that to your network. I understand what's going on with your set-up, and it'd be cool to add a nice dissolve.
However, the offset from Custom Data doesn't seem to work for offsetting UVs. It only seems to register curves (which creates motion instead of a random offset). That's my issue right now.
Overall, I'd like to control the offset AND the dissolve with two different curves/random values. More specifically, I want a dissolve pattern eating away at the texture (like I have it now) instead of a simple subtraction.
The offset can be achieved by setting the custom streams from curves to a random between 0 and 1.
As for dissolve, it seems outside my skill set. I'm not sure what you mean; is it a linear subtraction, like
o.color = tex - [1 - v.color] ?
As a follow-up, I grabbed the textures from the tech art presentation and tried to copy the Lacuni fire, and failed… I don't know if BlendAdd is used for that. If so, I'm baffled.
How does each of these four shaders function? This is what I understood:
vfx_fire_01 - Looks like a general shader (no dissolve) that relies on alpha. It's got a texture offset, though.
vfx_fire_02 - Similar, but less complicated.
vfx_fire_03 - Looks like it uses the RGB color to eat away at the emissive while fading out as well. This works if the RGB goes from light to dark values (red to black).
vfx_fire_04 - A similar but less complicated version of 03. Again, alpha/RGB eats away and fades at the same time.
Ideally, my fire would work like this-
Fade in (alpha) > Change colors (orange to dark red/black) > Dissolve out (no Alpha fade).
Here’s an image of what I mean by dissolving with texture. Imagine if you wanted a custom texture to dissolve the texture instead of relying on the main texture’s opacity.
How would you dissolve your texture (not using opacity clip; too hard-edged) if the graphic were solid white/black? Wouldn't it just abruptly disappear once it began subtracting?
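In code terms, what I'm after is roughly this: a separate noise texture compared against a 0-1 erosion value, so even a solid graphic erodes with an interesting edge instead of popping off. Just a sketch of the idea (using the vertex alpha as the erosion driver, which is an assumption, not my actual network):

```
sampler2D _MainTex;
sampler2D _DissolveTex;   // any noise/cloud texture
float     _EdgeSoftness;  // small value, e.g. 0.05

fixed4 frag (v2f i) : SV_Target
{
    fixed4 col   = tex2D(_MainTex, i.uv);
    float  noise = tex2D(_DissolveTex, i.uv).r;

    // erosion runs 0 -> 1 over the particle's life; here it comes from the
    // fading vertex alpha, but a custom vertex stream would work too
    float erosion = 1.0 - i.color.a;

    // smoothstep gives a soft edge; step() would give a hard clip
    float dissolve = smoothstep(erosion, erosion + _EdgeSoftness, noise);

    col.rgb *= i.color.rgb;
    col.a   *= dissolve;    // or multiply into rgb for additive set-ups
    return col;
}
```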
I sorta get this result with my additive setup but it isn’t perfect (and I want AddMul to avoid white blow-out).
Does anyone know how to add this type of effect to an AddMultiply Shader in ShaderForge?
Also, Alpha blended with a bright emissive looks like AddMul but isn’t. It’s never blown out, but it lacks additive properties. It seems like the choice is between Alpha Blended and Additive.
01 is multiplication (color * vertex color) with BlendAdd mode
02 is multiplication (color * vertex color) with Alpha Blend mode
03 is subtraction (color - [vertex.color - 1]) with BlendAdd mode
04 is subtraction (color - [vertex.color - 1]) with Alpha Blend mode
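In fragment terms, that's how I read them (the Blend mapping is my guess at what the talk calls "BlendAdd"):

```
sampler2D _MainTex;

fixed4 frag (v2f i) : SV_Target
{
    fixed4 tex = tex2D(_MainTex, i.uv);

    // 01 / 02: multiply by the particle's vertex color
    fixed4 mulResult = tex * i.color;

    // 03 / 04: the subtraction version written above
    fixed4 subResult = tex - (i.color - 1.0);

    // 01 and 03 pair with the additive-style blend; 02 and 04 with
    // Blend SrcAlpha OneMinusSrcAlpha. Return one result per shader:
    return mulResult;   // or subResult
}
```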
[quote=“colossaladvent, post:36, topic:989”]
custom texture to dissolve the texture instead of relying on the main texture’s opacity.
[/quote]
I have no idea how to set that up, sorry; this is the limit of my technical skills at the moment. I'd need to see someone else do it to grasp the network [my background is drawing and painting].
Using Unity 2018 and Shader Graph, I put this together because I wanted alpha dissolve too, plus vertex color, AddBlend, and a single color swatch with an 'overlay' function for remapping hue.
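If anyone wants to rebuild the 'overlay' remap piece, the standard overlay formula fits in a Shader Graph Custom Function node; this is a generic sketch of that math, not my exact graph:

```
// Standard overlay blend as a Custom Function node body.
// base  = the grayscale texture value, blend = the color swatch
void OverlayBlend_float(float3 base, float3 blend, out float3 result)
{
    // overlay = multiply in the darks, screen in the brights
    float3 multiplyPart = 2.0 * base * blend;
    float3 screenPart   = 1.0 - 2.0 * (1.0 - base) * (1.0 - blend);
    result = lerp(multiplyPart, screenPart, step(0.5, base));
}
```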
It'll be like necroing the thread, but if you need some simple randomization without giving up the other properties you use in a particle system, you can try custom vertex streams.
I used to animate material properties like crazy until I found out about custom vertex streams. Personally, I'd like to avoid animators or scripts and pack everything into a particle system and a material whenever I can.
You can also use custom data (instead of random) with CURVES (like any other particle property) to easily "animate" a value over a particle's lifetime as well!
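On the shader side, the custom data just arrives as extra UV components. The Vertex Streams list in the Renderer module shows exactly which TEXCOORD and component each stream lands in; the sketch below assumes Custom1 ends up in TEXCOORD1 (your layout may differ):

```
sampler2D _MainTex;

struct appdata
{
    float4 vertex  : POSITION;
    float2 uv      : TEXCOORD0;
    float4 custom1 : TEXCOORD1;   // Custom1 from the Custom Data module
    fixed4 color   : COLOR;
};

struct v2f
{
    float4 pos     : SV_POSITION;
    float2 uv      : TEXCOORD0;
    float4 custom1 : TEXCOORD1;
    fixed4 color   : COLOR;
};

v2f vert (appdata v)
{
    v2f o;
    o.pos     = UnityObjectToClipPos(v.vertex);
    o.uv      = v.uv;
    o.custom1 = v.custom1;   // just pass the custom data through
    o.color   = v.color;
    return o;
}

fixed4 frag (v2f i) : SV_Target
{
    // Custom1.x = a random 0-1 value or a curve over lifetime, whatever
    // you set in Custom Data; here it offsets V per particle
    float2 uv = i.uv + float2(0.0, i.custom1.x);
    return tex2D(_MainTex, uv) * i.color;
}
```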