Displacement Nightmares


#1

Hey folks. Currently building a project for a client which involves shapes and creatures pressing and moving against a membrane. I’ve tried using shrink wrap, cloth simulations and blend shapes to achieve the effect, but nothing seems to give me the same freedom of movement as rendering a depth pass sequence of the objects moving and using that to drive a displacement / warp modifier.
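To make the setup concrete, here’s a rough sketch of the depth-to-displacement mapping (simplified Python, with made-up names and values — not my actual modifier setup):

```python
# Simplified sketch: a rendered depth pass drives vertex displacement.
# Depth is assumed normalized to [0, 1] (1 = pressing furthest into the
# membrane); max_extrusion is the largest offset in scene units.

def depth_to_offset(depth, max_extrusion=0.5, threshold=0.05):
    """Map a normalized depth sample to a displacement distance.

    Values below `threshold` count as background (no contact).
    """
    if depth < threshold:
        return 0.0
    # Remap [threshold, 1] -> [0, max_extrusion]
    return (depth - threshold) / (1.0 - threshold) * max_extrusion

def displace_vertex(position, normal, depth):
    """Push a vertex along its normal by the sampled depth offset."""
    d = depth_to_offset(depth)
    return tuple(p + n * d for p, n in zip(position, normal))
```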

Unfortunately, it’s still a bit rough. Objects extruding far out of the surface look stretched and glitchy, with narrow triangles. Additionally, the mesh uses 64,800 triangles, which is about the maximum Unity allows per mesh in realtime. When I bake it out to an Alembic file and import it into Unity, performance is NOT great (especially for a VR project).

Can anybody think of a prettier / more efficient way to achieve this effect? I don’t think parallax maps or animated normal maps will help much here; I really need some clear dimensionality. I can possibly get away with a projector and a normal map for some of the smaller items moving across the wall, but I’d really like to find a prettier, more efficient solution for everything else.

Any help, much appreciated!


#2

Heads up: I’ve never done anything like this myself, so I’m just offering suggestions with your explanation as ground truth. There are a few things I can think of to try out.

If you can somehow blur out the animation, you would need fewer vertices to convey the idea of things being pressed through cloth, and it might look more realistic as well, since cloth typically doesn’t follow the contours of an object that closely.
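To illustrate, here’s a rough sketch of blurring a heightmap before displacement (plain Python, purely illustrative):

```python
def box_blur(heights, radius=1):
    """Box-blur a 2D heightmap (list of lists) with edge clamping.

    A blurred heightmap has gentler gradients, so a coarser mesh can
    sample it without producing narrow, stretched triangles.
    """
    h, w = len(heights), len(heights[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    # Clamp neighbour coordinates to the map edges
                    ny = min(max(y + dy, 0), h - 1)
                    nx = min(max(x + dx, 0), w - 1)
                    total += heights[ny][nx]
                    count += 1
            out[y][x] = total / count
    return out
```

A sharp spike gets spread out into a gentle bump, which a low-poly mesh can represent far better.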

This wouldn’t solve your performance problem, but you could always build your cloth out of multiple high-poly meshes.

You could tessellate, then offset the vertices using a heightmap generated from these figures. Some of the animations could be done simply by animating the mask in 2D; others might require rendering out texture sequences.
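Something like this, roughly (illustrative Python; the names are made up):

```python
def sample_height(heightmap, u, v):
    """Nearest-neighbour sample of a 2D heightmap at UV in [0, 1]."""
    h, w = len(heightmap), len(heightmap[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return heightmap[y][x]

def offset_vertex(vertex_uv, mask_offset, heightmap, scale=1.0):
    """Offset a vertex by sampling the heightmap at a shifted UV.

    Animating `mask_offset` over time slides the shape across the
    membrane without rendering a new texture every frame.
    """
    u = (vertex_uv[0] - mask_offset[0]) % 1.0
    v = (vertex_uv[1] - mask_offset[1]) % 1.0
    return sample_height(heightmap, u, v) * scale
```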

Don’t underestimate normal maps: unless you are looking directly from the side, it is really difficult to tell (even in VR) that the mesh isn’t actually changing. Again, blurring out the effect of the normal map for smaller elements could really help sell it as higher fidelity than it is.

Anyway, that’s some ideas to get you thinking. Good luck! :smiley:


#3

@Wyvery - Hey, thanks so much for the ideas. =D

Yeah, the detail level on the cloth is a tricky one. I agree that there’ll be less clarity of features on the object behind the membrane, but the membrane adds some details of its own to the image…! Here are some of the sample images I was sent on this project…

Hopefully my client will agree to a slightly less detailed visual…!

Thanks for the note on normal maps, we’ll get on it. =D Much appreciated!


#4

Any chance you can get realtime tessellation somehow? That would help avoid the narrow, stretched triangles. I haven’t used Unity in a while, so I don’t know if it’s available…


#5

I have used Alembic with 300k triangles and 1,500 frames in VR. You can see it here: most of the ship debris is an Alembic cache. That was CryEngine, though, but I don’t see why Unity would be any worse? Perhaps it is :frowning: Alembic was super fast once the whole cache was loaded into memory (300 MB).
If you’re hitting a vertex limit on the object, you can always break it down into smaller segments. Say you divide the plane into 4 smaller chunks; then you can suddenly have 259,200 triangles :slight_smile:
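Quick sanity check on the chunk math (illustrative Python, assuming a ~65k per-mesh ceiling):

```python
UNITY_TRI_LIMIT = 65_000  # approximate per-mesh ceiling mentioned above

def max_tris_for_chunks(tris_per_chunk, n_chunks):
    """Total triangle budget when a plane is split into n_chunks meshes."""
    assert tris_per_chunk <= UNITY_TRI_LIMIT, "each chunk must fit the limit"
    return tris_per_chunk * n_chunks

# 4 chunks of 64,800 triangles each gives 259,200 in total.
```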

Another option would be to use a render target and in-engine displacement, together with tessellation and normal generation; then it would even work in realtime!


#6

@Urgaffel - We’ll look into it. Unity does have realtime tessellation, but I’m not 100% certain my shader-coding skills will get me there. Thank you, though, that’s solid advice!

@Ludvig - Huh. Well, Unity’s usual tri limit per mesh is 65,000. We’re definitely looking at breaking the work into multiple planes, but the nice thing about the bigger plane is that ‘objects’ can cross from one side of the mesh to the other. That said, as mentioned above, we’re now looking into doing that with normal maps…

It’d be lovely if we could get this working in realtime! Not sure if my shader coding skills will be up to the job, but I’m investigating that…

Thanks for the advice! =D (Btw, great-looking VR demo!)


#7

If you do the vertex offsetting in world space instead of local space, you should be able to have objects move between the separate sheets without it being obvious that they’re multiple meshes.
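A tiny sketch of why this works (illustrative Python; the height function is made up): if the offset is a pure function of world position, two separate sheets agree exactly along their shared edge.

```python
import math

def world_height(wx, wz, t=0.0):
    """Displacement as a pure function of world position (and time).

    Because it depends only on world coordinates, any mesh sampling it
    gets identical offsets for identical world positions.
    """
    return math.sin(wx * 2.0 + t) * math.cos(wz * 2.0 + t) * 0.25

def displace(local_pos, mesh_origin, t=0.0):
    """Convert a local vertex to world space, then sample the field."""
    wx = mesh_origin[0] + local_pos[0]
    wz = mesh_origin[1] + local_pos[1]
    return world_height(wx, wz, t)

# Two adjacent sheets: sheet A spans x in [0, 1], sheet B spans x in [1, 2].
# A vertex on A's right edge and one on B's left edge share world position
# (1, z), so their offsets match and the seam is invisible.
```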