Andy's VFX Sketchbook. Oculus VR, hand tracking, Unreal and VAT



Hey great RTVFX community!

I’m Andy and I am happy to join the party here.

Let me first show you the effect I am working on; after that, those who are interested can find the sequence breakdown and a couple of sentences about me.

Cut when pieces are emitted from the hand:
[media: pillar_mk1_emission]

Pillar assembly:
[media: pillar_buildup]

The full sequence in good quality

Here is the general overview of the pipeline:

  1. Using Oculus Quest 2 and its hand tracking feature (no controllers), hand and finger movements feel quite natural
  2. Oculus' hand skeleton is a bit different from the standard mannequin in Unreal, so I created a custom skeletal mesh in Houdini and tweaked an existing inverse kinematics solution (UBIKSolver) to work with it
  3. Pillar assembly is a VAT (vertex animation texture) I made in Houdini. Internally it is an RBD (rigid body dynamics) sim with custom trajectory solvers and timing management
  4. The stream of emitted pieces is driven by Niagara because I want full control over its trajectory - whenever I move or rotate the hand, newly spawned particles follow the updated path
  5. Stream emission is attached to the event of opening the hand. I've added some infrastructure to know when a finger is opened or closed, so I can plug in and swap effects with little overhead (a minimal sketch of this follows the list)
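
Since the post doesn't show that infrastructure, here is a minimal C++ sketch of how per-finger open/closed detection might work, assuming the hand-tracking layer can supply a normalized curl value per finger. The names, thresholds, and the curl input are my assumptions, not the project's actual code.

```cpp
// Hypothetical sketch, not the project's code: per-finger open/closed
// detection with hysteresis. Assumes the hand-tracking layer supplies a
// normalized curl value per finger (0 = fully open, 1 = fully closed).
#include <array>
#include <functional>

enum class EFingerState { Open, Closed };

struct FFingerTracker
{
    // Two thresholds (made-up values) so tracking noise cannot make the
    // state flicker around a single boundary.
    float CloseThreshold = 0.7f;
    float OpenThreshold  = 0.4f;
    EFingerState State   = EFingerState::Open;

    // Effects subscribe here, e.g. to start or stop the piece stream.
    std::function<void(EFingerState)> OnStateChanged;

    // Call once per frame with the finger's current curl amount.
    void Update(float Curl)
    {
        EFingerState NewState = State;
        if (State == EFingerState::Open && Curl > CloseThreshold)
        {
            NewState = EFingerState::Closed;
        }
        else if (State == EFingerState::Closed && Curl < OpenThreshold)
        {
            NewState = EFingerState::Open;
        }
        if (NewState != State)
        {
            State = NewState;
            if (OnStateChanged) { OnStateChanged(State); }
        }
    }
};

// One tracker per finger; "hand opened" can be defined as all five Open.
using FHandTrackers = std::array<FFingerTracker, 5>;
```

The hysteresis (separate open and close thresholds) is what keeps noisy tracking from flickering the state, and routing everything through one state-change callback is what makes plugging in and swapping effects cheap.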

The next learning and building plans are:

  1. Landing the pillar on the floor, accompanied by ground destruction
  2. Adding secondary effects for the pillar construction, like dust coming out when large pieces are put into place
  3. Adding materials around the ice theme
  4. Adding proper accumulation effects around the hands and more nuance to the casting of pieces
  5. Adding sound effects - I want to explore using collision data from Houdini to generate a believable sound layer
  6. And continuing to have fun with this :wink:

And here is my story in short - I was enjoying a great journey building startups and doing things like data engineering and machine learning when I got the chance to try VR for the first time (the game was Superhot VR) and…

I was hooked and knew what I wanted to work on from then on.

Fast forward a couple of years, and I am happy to start sharing the things I am learning and building.


Thanks to those of you who've read this far, and please ask questions if you'd like to know more.


Added piece accumulation around the hands, followed by emission:

[media: atract_emit]

The full sequence in good quality

The effect has 2 main components:

  • A spline that creates the path for the particles. It is continuously updated to reflect the current hand position and orientation (first sketch after this list).

  • Particles managed by a Niagara system. I wanted the look of the same pieces being spawned, accumulated, and then cast, so I ended up building a single emitter in which particles transition between those three states when certain criteria are met (second sketch below).

    I also wanted particles to stick to the hand while they are around it, but fly freely along the trajectory once they are away from it. For this, each particle's position is interpolated between the current and cached splines. As there are no arrays of splines in Niagara where I could store past spline trajectories, I cache them into a 1D array within the emitter (third sketch below).

    Add a good number of curves to drive behavior and variability, plus other logic like forward vector calculation, and I ended up with a fairly big emitter. It is maintainable and easy to art direct, but I think it is around the upper limit in complexity.

    One more thing - conditions and loops in Custom HLSL are best avoided if possible, as Unreal advises. I cannot count how many times the editor crashed while I was working with an array inside loops, though this only happens while editing Niagara modules, not in-game.
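
To make the first component concrete: a rough Unreal C++ sketch of rebuilding the path spline each tick from the tracked hand transform could look like the following. Only USplineComponent and its methods are real engine API; the function name, point count, and arc shape are placeholders for the custom trajectory logic, not the actual setup.

```cpp
// Rough sketch under my own assumptions, not the actual setup: rebuild the
// particle path spline each tick from the tracked hand transform so that
// newly spawned particles follow the updated trajectory.
#include "Components/SplineComponent.h"

void RebuildPathSpline(USplineComponent* Spline, const FTransform& HandTransform,
                       const FVector& TargetLocation, int32 NumPoints = 8)
{
    Spline->ClearSplinePoints(/*bUpdateSpline=*/false);

    const FVector Start   = HandTransform.GetLocation();
    const FVector Forward = HandTransform.GetRotation().GetForwardVector();

    for (int32 i = 0; i < NumPoints; ++i)
    {
        const float Alpha = static_cast<float>(i) / (NumPoints - 1);
        // Leave the hand along its forward vector, then bend toward the
        // target; this arc is a stand-in for the real trajectory solver.
        const FVector AlongForward = Start + Forward * 100.0f * Alpha;
        const FVector Point = FMath::Lerp(AlongForward, TargetLocation, Alpha * Alpha);
        Spline->AddSplinePoint(Point, ESplineCoordinateSpace::World, /*bUpdateSpline=*/false);
    }

    Spline->UpdateSpline(); // recompute tangents once, after all points are set
}
```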
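The three-state lifecycle inside the single emitter can be sketched as a small state machine. The state names and transition criteria below are my guesses at what the "certain criteria" might be, written in plain C++ rather than as Niagara modules.

```cpp
// Guessed reconstruction of the per-particle lifecycle in the single emitter:
// Spawned -> Accumulating -> Cast. Transition criteria are placeholders.
enum class EPieceState { Spawned, Accumulating, Cast };

struct FPiece
{
    EPieceState State = EPieceState::Spawned;
    float DistToHand  = 0.0f; // updated each tick from the hand position
};

// Per-tick state advance for one particle.
void AdvanceState(FPiece& Piece, bool bHandOpened, float AccumulateRadius)
{
    switch (Piece.State)
    {
    case EPieceState::Spawned:
        // Join the accumulation shell once the piece reaches the hand.
        if (Piece.DistToHand < AccumulateRadius)
        {
            Piece.State = EPieceState::Accumulating;
        }
        break;
    case EPieceState::Accumulating:
        // Opening the hand casts the accumulated pieces onto the path.
        if (bHandOpened)
        {
            Piece.State = EPieceState::Cast;
        }
        break;
    case EPieceState::Cast:
        break; // follows the spline trajectory until death
    }
}
```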
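Finally, the 1D spline cache and the stick-vs-fly interpolation, again as a hedged plain-C++ reconstruction. In the project this lives in Niagara attributes and Custom HLSL; the ring-buffer layout, sample counts, and distance falloff here are all assumptions.

```cpp
// Hedged reconstruction: past spline trajectories flattened into one 1D
// array (a ring buffer of NumFrames blocks, NumSamples points each), plus
// the interpolation that makes particles stick near the hand but fly freely
// on their cached path when far away.
#include <algorithm>
#include <vector>

struct Vec3 { float X = 0, Y = 0, Z = 0; };

static Vec3 Lerp(const Vec3& A, const Vec3& B, float T)
{
    return { A.X + (B.X - A.X) * T, A.Y + (B.Y - A.Y) * T, A.Z + (B.Z - A.Z) * T };
}

class SplineCache
{
public:
    SplineCache(int SamplesPerSpline, int MaxFrames)
        : NumSamples(SamplesPerSpline), NumFrames(MaxFrames),
          Data(SamplesPerSpline * MaxFrames) {}

    // Store this frame's spline samples, overwriting the oldest frame.
    // Samples must contain at least NumSamples points.
    void Push(const std::vector<Vec3>& Samples)
    {
        Head = (Head + 1) % NumFrames;
        std::copy(Samples.begin(), Samples.begin() + NumSamples,
                  Data.begin() + Head * NumSamples);
    }

    // Sample the spline cached FramesAgo frames back (0 = live spline)
    // at normalized parameter U in [0, 1].
    Vec3 Sample(int FramesAgo, float U) const
    {
        const int Frame = ((Head - FramesAgo) % NumFrames + NumFrames) % NumFrames;
        const float S   = U * (NumSamples - 1);
        const int I     = std::min(static_cast<int>(S), NumSamples - 2);
        const Vec3* Row = &Data[Frame * NumSamples];
        return Lerp(Row[I], Row[I + 1], S - I);
    }

private:
    int NumSamples, NumFrames, Head = 0;
    std::vector<Vec3> Data;
};

// Blend between the path cached at spawn time and the live path: the closer
// the particle is to the hand, the harder it sticks to the current spline.
Vec3 ParticlePosition(const SplineCache& Cache, int AgeInFrames, float U,
                      float DistToHand, float StickRadius)
{
    const Vec3 Cached = Cache.Sample(AgeInFrames, U); // trajectory at spawn
    const Vec3 Live   = Cache.Sample(0, U);           // current trajectory
    const float Stick = std::clamp(1.0f - DistToHand / StickRadius, 0.0f, 1.0f);
    return Lerp(Cached, Live, Stick);
}
```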


Dear RTVFX community - how would you approach building this kind of effect? Happy to hear your opinions or feedback.
