How to create intersections between transparent 3D models in Unity?

Hi! I’m having trouble intersecting two quads or 3D models in Unity. If the material is set to Opaque, there is no problem at all. Look at pic1 for the result.

However, just set the same material to Transparent and weird stuff starts happening: pic2.

One of the quads is always drawn completely in front of the other, depending on the distance between them and the camera. I’ve learned to tweak this so I can choose which material is drawn on top (for example, by decreasing the render queue number of the lower one), but I can’t obtain an actual cut between the two. I’m almost sure this has something to do with how the z-buffer works, but I couldn’t find any good workaround.
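In case it’s useful, this is more or less the minimal custom shader I’ve been testing with instead of the Standard material (the shader and its names are my own sketch, but it shows the same behavior):

```
Shader "Custom/MinimalTransparent"
{
    Properties
    {
        _Color ("Color", Color) = (1, 1, 1, 0.5)
    }
    SubShader
    {
        // Decreasing the queue (e.g. "Transparent-1") is the tweak I
        // mentioned: it forces this material to be drawn earlier.
        Tags { "Queue" = "Transparent" "RenderType" = "Transparent" }
        Blend SrcAlpha OneMinusSrcAlpha
        // I suspect this is the relevant part: the material doesn’t
        // write to the z-buffer, so the two quads have no depth
        // information about each other.
        ZWrite Off

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            fixed4 _Color;

            float4 vert (float4 vertex : POSITION) : SV_POSITION
            {
                return UnityObjectToClipPos(vertex);
            }

            fixed4 frag () : SV_Target
            {
                return _Color;
            }
            ENDCG
        }
    }
}
```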

There must be a way of doing this, since it’s a very common feature in video games, especially for VFX (just think of Rainbow Road in Mario Kart!). See what happens when I try to create a simple VFX of two cylinders, one inside the other: pic3.

In this case, the inner cylinder shows on top of the outer one, and the back face also does weird things when rendered (look at the red arrow).

You can easily reproduce this with two quads in a new scene: just create a new material and set it to Transparent. I’m using the Built-in render pipeline, but I think this also happens in URP.

Thanks in advance for the help!

I don’t know of any definitive solution; you always have to keep this in mind when creating effects that use translucent materials, since they usually don’t write to the depth buffer. In the example on the right you can make it work with render queue priorities, using two different meshes for the outer ring: one for the outer side with a lower-priority queue and another for the inside with a higher-priority queue.
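If splitting the mesh is inconvenient, a closely related trick is to use two passes in a single shader, because the passes of a material always render in order. This is just a sketch for the Built-in pipeline with a hypothetical shader name, untested:

```
Shader "Custom/TransparentTwoSided"
{
    Properties
    {
        _Color ("Color", Color) = (1, 1, 1, 0.5)
    }
    SubShader
    {
        Tags { "Queue" = "Transparent" "RenderType" = "Transparent" }
        Blend SrcAlpha OneMinusSrcAlpha
        ZWrite Off

        // Shared between both passes.
        CGINCLUDE
        #include "UnityCG.cginc"

        fixed4 _Color;

        float4 vert (float4 vertex : POSITION) : SV_POSITION
        {
            return UnityObjectToClipPos(vertex);
        }

        fixed4 frag () : SV_Target
        {
            return _Color;
        }
        ENDCG

        // Pass 1: inner (back-facing) side only, always drawn first.
        Pass
        {
            Cull Front
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            ENDCG
        }

        // Pass 2: outer (front-facing) side only, drawn on top.
        Pass
        {
            Cull Back
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            ENDCG
        }
    }
}
```

This fixes the self-overlap of a single cylinder (the red arrow case), but of course it doesn’t help with sorting against other objects.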

I don’t know if the solution is simple or not, but I don’t think they’re building the Mario Kart Rainbow Road in chunks, hehe. Maybe it’s just a limitation of Unity’s Shader Graph?

The same problem appears in Unreal Engine and other 3D engines. Rather than a limitation of Unity’s Shader Graph, it’s a limitation of the forward rendering pipeline: each translucent triangle being rendered lacks information about the other translucent triangles it overlaps, and that’s the source of the problem. You can see the same thing explained here; to quote the conclusion:

You can’t simply render translucent objects in any order without special consideration. If you have enough translucent surfaces moving around in a sufficiently complex manner, you will find it very hard to avoid errors with acceptable realtime algorithms.
It’s largely a matter of what you are prepared to tolerate and what you know a priori about your scene content.

So, as in my previous example, you usually have to find ways of using the sorting tools the engine provides to decide in which order the different meshes are going to be drawn. These are usually settings like sorting by distance to the camera or sorting by an explicit priority value.
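In Unity’s ShaderLab, for example, the explicit priority value is the render queue tag. This is just a minimal hypothetical example, not a full material:

```
Shader "Custom/TransparentDrawLast"
{
    SubShader
    {
        // Explicit priority: "Transparent+1" resolves to queue 3001,
        // so this material is drawn after everything in the default
        // Transparent queue (3000). Objects sharing the same queue
        // value are instead sorted back-to-front by distance to the
        // camera.
        Tags { "Queue" = "Transparent+1" "RenderType" = "Transparent" }
        Blend SrcAlpha OneMinusSrcAlpha
        ZWrite Off

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            float4 vert (float4 vertex : POSITION) : SV_POSITION
            {
                return UnityObjectToClipPos(vertex);
            }

            fixed4 frag () : SV_Target
            {
                return fixed4(1, 1, 1, 0.5);
            }
            ENDCG
        }
    }
}
```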

The more you know about what your scene is going to look like and the narrower the scope is (is the camera always looking down, like in Diablo, or is it a free camera?), the more assumptions you can make about how the meshes should be sorted, and the easier it will be to find a setup that sorts correctly.

Regarding the Rainbow Road example, knowing how the game works, they can probably make some assumptions about how the elements are going to overlap. As an example, in this screenshot you can see how the bright edges of the road are drawn over the translucent glider, while the glowing booster ramps are drawn behind the glider material. I’m guessing that’s because the ramp material is opaque while the bright edge material on the roads is translucent.
[screenshot]

They probably set up their sorting algorithms in a way that works fine in 99% of situations, accepting that there are going to be some artifacts in niche cases like this one. Of course I’m just guessing, since I didn’t work on the game, but you can find these small things in plenty of AAA games.

PS: Just as a disclaimer, everything I’m talking about is translucency in a forward rendering pipeline, like the ones used by game engines today. There are some order-independent transparency techniques about which I know absolutely nothing, and which I don’t think I’ve ever seen used in practice, probably because they are too expensive (?).

Other than that, you can fake transparency with dithering, like this, which also has its limitations.
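In case it helps, this is roughly what that dithering approach looks like (a sketch for the Built-in pipeline with hypothetical names, untested). Every pixel is either fully opaque or discarded, so the material can live in the opaque queue, write depth, and sort correctly, at the cost of a visible screen-space pattern:

```
Shader "Custom/DitheredTransparency"
{
    Properties
    {
        _Color ("Color", Color) = (1, 1, 1, 1)
        _Alpha ("Opacity", Range(0, 1)) = 0.5
    }
    SubShader
    {
        // Rendered as opaque: depth is written, so intersections and
        // sorting behave exactly like the Opaque case.
        Tags { "Queue" = "Geometry" "RenderType" = "Opaque" }

        Pass
        {
            CGPROGRAM
            #pragma target 3.0
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            fixed4 _Color;
            float _Alpha;

            // 4x4 Bayer matrix with values 0..15.
            static const float bayer[16] =
            {
                 0,  8,  2, 10,
                12,  4, 14,  6,
                 3, 11,  1,  9,
                15,  7, 13,  5
            };

            void vert (float4 vertex : POSITION, out float4 outpos : SV_POSITION)
            {
                outpos = UnityObjectToClipPos(vertex);
            }

            fixed4 frag (UNITY_VPOS_TYPE screenPos : VPOS) : SV_Target
            {
                // Pick a threshold from the Bayer pattern based on the
                // pixel's position on screen.
                float2 p = floor(fmod(screenPos.xy, 4.0));
                float threshold = (bayer[int(p.y) * 4 + int(p.x)] + 0.5) / 16.0;

                // Discard a fraction of pixels proportional to the
                // desired opacity instead of blending.
                clip(_Alpha - threshold);
                return _Color;
            }
            ENDCG
        }
    }
}
```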

Then does that mean that, for example, the hair of characters in recent years’ video games is made not with transparent textures but with alpha-clipped high-definition textures? Otherwise the tufts would flicker all the time as the camera rotates. But they usually look like transparent textures, not solid ones. Any ideas about that?

There are various methods to render hair (you can even check some Unity examples), but yes, it is common to render opaque hair strands, or cards with alpha clip. The main reason is not just sorting, but also the fact that with plain transparency pixels are never discarded, which can result in huge overdraw.
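As a rough sketch of what a basic alpha-clipped hair card shader could look like (hypothetical names, Built-in pipeline, untested):

```
Shader "Custom/HairCardCutout"
{
    Properties
    {
        _MainTex ("Hair Strand Texture", 2D) = "white" {}
        _Cutoff ("Alpha Cutoff", Range(0, 1)) = 0.5
    }
    SubShader
    {
        // Cutout rendering: depth writes stay on, no blending needed.
        Tags { "Queue" = "AlphaTest" "RenderType" = "TransparentCutout" }
        Cull Off // hair cards are typically visible from both sides

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float4 _MainTex_ST;
            float _Cutoff;

            struct v2f
            {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
            };

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = TRANSFORM_TEX(v.texcoord, _MainTex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                fixed4 col = tex2D(_MainTex, i.uv);
                // Hard cut: pixels below the cutoff are discarded
                // entirely, so they never reach the blending stage and
                // no back-to-front sorting is required.
                clip(col.a - _Cutoff);
                return col;
            }
            ENDCG
        }
    }
}
```

Since the clipped pixels never blend, stacked hair cards don’t pay the blending cost for their empty regions, which is the overdraw point above.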
