Making a phantom damage system, need some help

Hi guys, I am relatively new to VFX and Unity in general (about a year), but I have a weird obsession with making more advanced VFX. In my game there is a humanoid phantom character, and I decided to make a really good-looking damage system for him.

When spells or bullets hit him, they leave holes in the character’s body. Here is the video (wood texture for better understanding):

So, I have a bunch of problems:

  • I don’t really know how to send an array of world positions to the shader using Shader Forge. It’s easy to send a single vector4 for one damage sphere, but there will be multiple hits and holes on the character’s body.

  • Is it possible to somehow emit particles from the intersections?

  • Is there a better way to render the backface of the damage sphere? Currently I’m rendering the character’s backface texture with a second camera, then using that texture as a mask for the backface sphere shader. This also leads to some bugs when the model has intersecting geometry, like belts and other small details.

  • The character is moving, but the spheres are locked in place and attached to bones, which can lead to some inaccuracies. I thought about translating hits onto a copy of the character that stands still, then somehow transferring the whole material back onto the character.

Try reading Valve’s publication on the Left 4 Dead 2 wounds system. They solved most of your problems:
http://www.valvesoftware.com/publications/2010/gdc2010_vlachos_l4d2wounds.pdf

Thanks, but sadly this is not what I was looking for. I need to fill the holes in code using the backface of a sphere, not place prefab wounds. And the wounds on the phantom are supposed to shrink over time, so the fill area should decrease too.

But maybe I can make a wound model and fill it with many small cube particles in Blender, making some kind of fake volume, then use that model for filling the holes. There will be a lot of tris though, but in that case maybe I can even emit particles from the intersections.

You probably want to just use two-sided rendering and sphere tracing rather than doing this in two passes, especially if you want to do multiple points.

Two-sided rendering is fairly trivial in Shader Forge: there’s a convenient Face Sign node that you can use with an If node to switch between rendering the cutout surface and rendering the interior sphere. Sphere tracing is also relatively simple; there are lots of examples of ray-sphere intersections, and while most of them assume you only care about the front face of the sphere, getting the back face is going to be as “simple” as flipping a sign someplace.

https://www.scratchapixel.com/lessons/3d-basic-rendering/minimal-ray-tracer-rendering-simple-shapes/ray-sphere-intersection

Basically, sphere tracing here is done by finding the closest point on the ray to an arbitrary point in space, calculating the width of the sphere along the ray at that point, and subtracting half of that width from the closest point on the ray to get the front face. For your case you would just add half of the width instead, and voilà!
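To make that concrete, here’s a minimal sketch of the whole two-sided idea in one Unity shader, assuming a single damage sphere packed into a float4 (xyz = world-space center, w = radius). The VFACE semantic is roughly what Shader Forge’s Face Sign node boils down to; the property names and the flat interior shading are placeholders, not your actual setup:

```
// Minimal sketch: cut a hole on front faces, raytrace the damage sphere's
// back face on interior (back) faces. Names and shading are placeholders.
Shader "Sketch/PhantomDamage"
{
    Properties
    {
        _Color ("Surface Color", Color) = (1,1,1,1)
        _InteriorColor ("Interior Color", Color) = (0.5,0,0,1)
        _DamageSphere ("Sphere (xyz = center, w = radius)", Vector) = (0,0,0,0.25)
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        Cull Off // draw both faces so the interior shows through the holes

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #pragma target 3.0
            #include "UnityCG.cginc"

            fixed4 _Color, _InteriorColor;
            float4 _DamageSphere;

            struct v2f
            {
                float4 pos      : SV_POSITION;
                float3 worldPos : TEXCOORD0;
            };

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.worldPos = mul(unity_ObjectToWorld, v.vertex).xyz;
                return o;
            }

            fixed4 frag (v2f i, fixed facing : VFACE) : SV_Target
            {
                if (facing > 0)
                {
                    // Front face: clip the hole where the surface is inside the sphere.
                    clip(distance(i.worldPos, _DamageSphere.xyz) - _DamageSphere.w);
                    return _Color;
                }

                // Back face: intersect the view ray with the sphere (geometric solution).
                float3 ro  = _WorldSpaceCameraPos;
                float3 rd  = normalize(i.worldPos - ro);
                float3 L   = _DamageSphere.xyz - ro;
                float  tca = dot(L, rd);            // distance to closest approach
                float  d2  = dot(L, L) - tca * tca; // squared ray-to-center distance
                float  r2  = _DamageSphere.w * _DamageSphere.w;
                clip(r2 - d2);                      // ray misses the sphere -> discard

                // '+' picks the BACK intersection; '-' would give the front one.
                float3 hit = ro + rd * (tca + sqrt(r2 - d2));
                float3 n   = normalize(_DamageSphere.xyz - hit); // inward normal
                return _InteriorColor * (0.4 + 0.6 * saturate(dot(n, -rd)));
            }
            ENDCG
        }
    }
}
```

The front branch corresponds to the cutout side of the If node and the back branch to the interior-sphere side.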

Passing an array of points is harder to do, though, just because Shader Forge doesn’t support it, and Joachim Holmer doesn’t seem to be working on Shader Forge much these days (he’s busy working on Budget Cuts). You would probably need to hack around Shader Forge’s limitations using a Code node, and maybe even a basic external .cginc, so you can define a float4 array of sphere positions and radii and loop through the spheres efficiently. Once you start getting into that territory it might make sense to drop Shader Forge completely and write the vertex/fragment shader directly, since writing significant amounts of shader code in a little text box can be painful.
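As a sketch of what that .cginc might contain (everything here is hypothetical, the names and the 16-sphere cap included; the C# side would fill the array each frame with Material.SetVectorArray and the count with SetFloat):

```
// DamageSpheres.cginc — hypothetical include for a Shader Forge Code node.
#ifndef DAMAGE_SPHERES_INCLUDED
#define DAMAGE_SPHERES_INCLUDED

#define MAX_DAMAGE_SPHERES 16

// xyz = world-space center, w = radius; filled from C# via SetVectorArray.
float4 _DamageSpheres[MAX_DAMAGE_SPHERES];
float  _DamageSphereCount;

// Signed distance from worldPos to the surface of the nearest damage sphere;
// negative means "inside a hole", so a Code node can simply clip() on it.
float NearestSphereDistance (float3 worldPos)
{
    float d = 1e6;
    for (int i = 0; i < MAX_DAMAGE_SPHERES; i++)
    {
        if (i >= (int)_DamageSphereCount) break;
        d = min(d, distance(worldPos, _DamageSpheres[i].xyz) - _DamageSpheres[i].w);
    }
    return d;
}

#endif
```

That keeps the per-pixel cost to one distance check per active sphere, and shrinking a wound over time would just be animating the radius in the w component.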

Another option, if you don’t want to do sphere tracing, is that you might be able to do it with some clever use of stencils.

Thanks for such a detailed answer. Is it possible to do sphere ray tracing in Shader Forge only? I don’t really know how to code shaders without it. Yes, I know it’s wrong to study shaders only through visual scripting tools; I’ve already started reading some tutorials. 🙂

And I don’t fully understand how sphere tracing should be used in my case. Are you suggesting rendering only the backface of the sphere, and only where the sphere intersects the character’s mesh?

I’m still trying to do this the wrong way, with a big number of limitations (the character’s model needs to be super smooth for the mask, with no intersecting parts). And another problem: if two or more spheres are on the line of sight, you can see through them.

Sphere tracing is totally possible with Shader Forge; you would just need to translate the code into nodes. The geometric solution should be quite fast to calculate on a GPU, as the meat of it is two dot products and a square root to find the back face.
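Condensed into a single function you could paste into a Code node (parameter names are just placeholders), those two dot products and the square root look like this:

```
// Returns the distance along the ray to the sphere's BACK face, or -1 on a miss.
// sphere.xyz = world-space center, sphere.w = radius; rayDir must be normalized.
float BackHit (float3 rayOrigin, float3 rayDir, float4 sphere)
{
    float3 L   = sphere.xyz - rayOrigin;
    float  tca = dot(L, rayDir);             // dot #1: distance to closest approach
    float  d2  = dot(L, L) - tca * tca;      // dot #2: squared ray-to-center distance
    float  r2  = sphere.w * sphere.w;
    if (d2 > r2) return -1.0;                // ray misses the sphere
    return tca + sqrt(r2 - d2);              // '+' = back face, '-' = front face
}
```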

However, the issue of seeing through models is a much harder problem to solve.

There are a couple of ways to deal with it. The first, of course, is to do nothing and just accept the visual artifacts. The other common way is to use interior models, like what they do for the Left 4 Dead games. Another, more complicated method is to not do your holes with shaders but to actually cut the geometry. Several games have done this, with a few releasing papers on it, though most do straight cuts only; this is far more difficult than it sounds, and I wouldn’t recommend it unless you consider yourself a seasoned programmer.

Again, some very tricky use of stencils might allow you to do what you want with fewer obvious issues. I think you’d need to render out the depth of both the front and back surfaces of the character geometry, without the cutouts, to render textures, then render either real sphere geometry or raytraced spheres that use those depth textures to clip, then render the actual character after that. The finer details of that will be complicated to work out, but it should be possible to do the effect without any of the issues you’re seeing.
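As a very rough sketch of just the clip test from that idea, assuming two earlier passes already wrote the character’s front-face and back-face linear eye depth (without cutouts) into hypothetical _CharFrontDepth/_CharBackDepth render textures, the sphere pass might look something like this:

```
// Hypothetical sphere pass: keep only sphere fragments that lie inside the
// character's volume, as bounded by the two pre-rendered depth textures.
Shader "Sketch/SphereDepthClip"
{
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _CharFrontDepth; // eye depth of character front faces
            sampler2D _CharBackDepth;  // eye depth of character back faces

            struct v2f
            {
                float4 pos       : SV_POSITION;
                float4 screenPos : TEXCOORD0;
                float  eyeDepth  : TEXCOORD1;
            };

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.screenPos = ComputeScreenPos(o.pos);
                COMPUTE_EYEDEPTH(o.eyeDepth);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                float2 uv    = i.screenPos.xy / i.screenPos.w;
                float  front = tex2D(_CharFrontDepth, uv).r;
                float  back  = tex2D(_CharBackDepth,  uv).r;

                clip(i.eyeDepth - front); // in front of the body -> discard
                clip(back - i.eyeDepth);  // behind the body      -> discard

                return fixed4(0.5, 0.0, 0.0, 1.0); // placeholder interior color
            }
            ENDCG
        }
    }
}
```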