Let’s say I already built my fancy shader/material, with some texture panning going on (e.g. a tornado effect, ball of fire, etc.), and I want to put it to good use by applying it to a mesh.
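(For context, the panning itself is just a UV offset that scrolls with time. A minimal sketch of the math in Python; the function name and default speeds here are illustrative, not from any particular engine:)

```python
def pan_uv(u, v, time, speed_u=0.1, speed_v=0.0):
    """Scroll UVs over time; the modulo keeps them in [0, 1) for tiling textures."""
    return ((u + time * speed_u) % 1.0, (v + time * speed_v) % 1.0)

# e.g. after 2.5 s at speed_u = 0.1, the U coordinate has scrolled by 0.25
```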
Which of these three methods do you use when building your effects with meshes?
Animating the mesh inside 3d app, export animated fbx, use animated mesh inside engine
Export mesh without animation, animate inside engine (via anim editor or script)
Export mesh without animation, use mesh as mesh particles, animate through particle effect properties inside engine
For complex animation I would always use the first method, but what if I only need subtle things like scaling and some random velocities? I love using mesh particles, because they have so much randomness and I can easily adjust them. Is there a performance drawback to using them compared to the other methods?
Whenever I can, I’ll try to build my effects using just the FX editor / animating in engine. This approach is far more efficient for me, and I don’t really need any source files or exporting fuss. It’s also way easier to adjust animations in engine alongside the rest of the VFX sequence than to try to match that in a 3d app.
Another thing to consider is memory. Since I’m currently creating games for the mobile market, I tend to stay away from baked animations as much as I can.
Generally speaking, if something can be done with rotation/position/scale, I’ll do it in engine. If there is complex movement or I need something very specific, then I might consider animating in a 3d app.
There are also vertex shaders, morph targets and flipbooks which can satisfy your needs; it all depends on what you need, what you know, and whether it’ll run on your target platform.
There are times when I animate a mesh, export it as a vertex animation, and let that be controlled by Cascade.
Unless you go the skeletal mesh route, neither approach has any real, mentionable performance impact, unless you go the tessellation route with several hundred instructions in the shader and spam those meshes in a mesh emitter.
Simple scaling/random velocity/rotation etc. is probably cheapest to do in Cascade; bending and twisting can be done with some fairly simple math inside the shader, though a vertex animation texture isn’t that heavy either (unless you are animating 60k+ vertices over a very long period of time).
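For illustration, the “fairly simple math” for a twist really is just a height-dependent rotation. A generic Python sketch of the idea (not shader code from any particular engine; the names are mine):

```python
import math

def twist_vertex(x, y, z, twist_per_unit):
    """Rotate a vertex around the Y axis by an angle proportional to its
    height; a bend is the same idea with a different axis/angle mapping."""
    angle = y * twist_per_unit
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * z, y, s * x + c * z)
```

In a real material this would run per vertex (e.g. as a world position offset), with the twist amount driven by time or a particle parameter.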
One thing worth noting, though, is that vertex animation/vertex manipulation in the shader affects the normals/normal map, so that’s always something you need to think about.
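If anyone is wondering what “thinking about it” looks like in practice: one common mitigation is to recompute normals from the deformed positions instead of reusing the baked ones. Per face, that’s just a cross product of the deformed edges (a generic sketch, not engine code):

```python
import math

def face_normal(a, b, c):
    """Unit face normal from three (already deformed) vertex positions."""
    e1 = (b[0] - a[0], b[1] - a[1], b[2] - a[2])
    e2 = (c[0] - a[0], c[1] - a[1], c[2] - a[2])
    n = (e1[1] * e2[2] - e1[2] * e2[1],   # cross product e1 x e2
         e1[2] * e2[0] - e1[0] * e2[2],
         e1[0] * e2[1] - e1[1] * e2[0])
    length = math.sqrt(n[0] ** 2 + n[1] ** 2 + n[2] ** 2)
    return (n[0] / length, n[1] / length, n[2] / length)
```

In an actual shader you’d do the equivalent analytically or with derivative tricks rather than per-face on the CPU, but the idea is the same.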
Skeletal mesh animations are probably the heaviest of the mesh/mesh-particle options.
Additionally, keep in mind that each mesh emitted is one draw call, so spawning dozens of meshes (even in the same emitter) has quite an impact.
Just like anything else, optimization is key, regardless of which route you are taking.
My #1 goal is animating simple meshes in the native particle system, so FX artists grok the results and there is less node overhead.
2. If needed I’ll animate natively in the engine and drop the particle use, for the reasons above (but combine 1 & 2 when needed).
3. In limited circumstances, animation from FBX; however, it requires the other tool (or processing) for edits, so everyone needs that tool to contribute, and the pipeline can be slower.
4. Last resort is scripts, which usually need some degree of code review and add more processing between artists/team, plus tech debt.
I’d say it depends on the effect and how many times you see it. If it’s a one-off, then maybe animate it inside your 3d package. If you see it multiple times, try your best to animate it in the engine, as you can give different values to the mesh particles. I don’t know a way to get randomization out of an imported anim, and if you see it more than once you will 100% notice the repetition. The limiting factor is how complex the animation is. Some things are just too time-consuming to animate in your engine (trust me, I know), and you’d be better off making your effect a bit more shader-heavy and adding the randomization through the material.