Does VFX Graph improve GPU performance on Quest 2 vs Shuriken?

Hi everyone :wave:
Short version: If I manually feed VFX Graph the position and velocity of every particle every update, would that still improve GPU performance vs. running it on the Shuriken system?

I am a Cognitive Science Master's student, and for my thesis I am working on a project in Unity that uses only particles and forces (related to Group VR experiences can produce ego attenuation and connectedness comparable to psychedelics | Scientific Reports). The project needs to run smoothly on a Quest 2, and I am running into GPU limitations when I use the Shuriken particle system with transparent particles (unfortunately they need to be transparent for the effect to look right). I emit and move the particles manually by script, applying attraction forces toward multiple moving points in space and computing collisions between particles. Using the IJobParticleSystem methods I managed to make it CPU-performant, but the GPU limits how many particles I can display while keeping the FPS up (about 1k particles seems to be the limit if they are all on screen).

Since I compute all movement manually anyway, could I simply feed the position and velocity of each particle to the Update context of VFX Graph and still expect a GPU performance improvement? I have never used VFX Graph, but from what I have seen, I could use attributes to feed it all the particle positions and velocities I compute in my script, correct? My particles never die: I spawn them all once at start and keep them alive indefinitely, so I can always address each particle by the index I save when I emit it manually via script.
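In case it helps to see what I have in mind: based on the VFX Graph docs I've skimmed, I imagine pushing my simulated arrays to the graph once per frame through GraphicsBuffers, and sampling them in the Update context by particle index. This is untested, and the property names (`Positions`, `Velocities`) and the Sample Graphics Buffer setup on the graph side are just placeholders for how I assume it would work:

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Untested sketch (Unity 2021.2+): upload externally simulated particle data
// to a VFX Graph via GraphicsBuffers. On the graph side I assume two exposed
// GraphicsBuffer properties named "Positions" and "Velocities", sampled in
// the Update context with a Sample Graphics Buffer operator using each
// particle's own index.
public class FeedVfxFromScript : MonoBehaviour
{
    public VisualEffect vfx;
    public int capacity = 1000;   // must match the graph's particle capacity

    Vector3[] positions;
    Vector3[] velocities;
    GraphicsBuffer positionBuffer;
    GraphicsBuffer velocityBuffer;

    void Start()
    {
        positions = new Vector3[capacity];
        velocities = new Vector3[capacity];
        // One Vector3 per particle: 3 floats * 4 bytes
        positionBuffer = new GraphicsBuffer(GraphicsBuffer.Target.Structured, capacity, sizeof(float) * 3);
        velocityBuffer = new GraphicsBuffer(GraphicsBuffer.Target.Structured, capacity, sizeof(float) * 3);
        vfx.SetGraphicsBuffer("Positions", positionBuffer);
        vfx.SetGraphicsBuffer("Velocities", velocityBuffer);
    }

    void Update()
    {
        // ... existing CPU simulation fills positions[] / velocities[] ...
        positionBuffer.SetData(positions);
        velocityBuffer.SetData(velocities);
    }

    void OnDestroy()
    {
        positionBuffer?.Release();
        velocityBuffer?.Release();
    }
}
```

As I understand it, this would only change who renders the particles; my simulation stays on the CPU, and the transparent overdraw cost would presumably be the same either way.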
I am happy to provide more information to anyone interested :slight_smile:


VFX Graph is not good for Quest. I limit myself to Shuriken there, since Shuriken runs only on the CPU while VFX Graph is GPU-only, so I'd guess it would make things worse.

The solution would be to reduce overdraw: for example, you could render those transparent particles as alpha-masked with dithering instead, which would already help a lot. Or try to reduce the places where they overlap each other.
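To illustrate what I mean by dithering: instead of alpha blending, the shader clips each fragment against a screen-space Bayer threshold, so the particle renders as an opaque cutout and skips the blending cost that makes overdraw so expensive on tile-based mobile GPUs. Rough idea of the threshold test (this is the shader-side `clip()` logic, sketched in C# for clarity; untested):

```csharp
// Ordered ("screen-door") dithering: a fragment survives only if its alpha
// exceeds a per-pixel threshold taken from a 4x4 Bayer matrix. In the actual
// shader this is a clip()/discard, so the particle behaves like opaque cutout
// geometry instead of a blended transparent quad.
static readonly float[,] Bayer4 =
{
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

// Equivalent of clip(alpha - threshold) in the fragment shader.
static bool KeepFragment(float alpha, int screenX, int screenY)
{
    float threshold = (Bayer4[screenY % 4, screenX % 4] + 0.5f) / 16f;
    return alpha > threshold;
}
```

A half-transparent particle then draws roughly half of its pixels fully opaque, which reads similarly at Quest pixel densities while avoiding the blending work.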

In the RealtimeVFX Discord there is one huge post about Quest optimization. If you write to me there, I can link you to it.

Thank you for your response. I found this link to your Discord, but the invite has expired; could you please direct me to the correct one? Discord server

Nevermind, I found the active link.

If you have the same name on Discord, then I've pinged you :smiley:

When creating VFX for the Quest 2, it is generally better to utilize the GPU over the CPU for better performance. The Quest 2 has a powerful GPU, which is designed to handle intensive graphical processing tasks.

The CPU is responsible for handling non-graphical tasks, such as game logic and physics calculations. While the CPU is important for the overall performance of a game, it is not as crucial for VFX processing as the GPU.

By utilizing the GPU, you can take advantage of its parallel processing capabilities to handle complex VFX calculations more efficiently. This will result in better performance and smoother gameplay on the Quest 2.

VFX Graph is GPU-accelerated. This means that the visual effects created with VFX Graph can take advantage of the power of the GPU to render more particles and more complex visual effects in real-time, which can result in higher performance and smoother gameplay on devices like the Quest 2.

That's correct for platforms like PC and consoles, though not for Quest/mobile. If the hardware doesn't have a capable GPU, then even though GPU particles are generally more efficient than CPU ones, they won't run well on that device.

And the Quest runs on mobile hardware, so it's a no-go.

Oh, my bad, I thought it was for PC, since the Quest 2 can be connected to a PC; I rarely think of VFX in mobile games :sweat_smile:
But if that is so, then you're correct.

Yeah, the Quest can be linked to a PC, though it's one of those unpleasant devices where you have to account for the fact that most people probably play without plugging it in :stuck_out_tongue: