VFX for VR. What’s still working and what isn’t?

Something to look forward to:

Niagara will come with additional features that will improve sprite/billboard usage in VR.

Thin smoke still works. Thick smoke near your face fares a bit worse. If you use Unreal and disconnect roll, you can still get away with a lot.

My biggest DOH moment was in Unity, where I had a bunch of stretched planes on the ground to indicate some ground fog. That didn’t work at all: they flop back and forth as you move around.

The more static something flat is, the worse it’ll be.


Do you mean the typical sorting issues we often see on particles?

No, particles rotating back and forth on two axes. It’s very noticeable if they intersect the ground as well.


Hey! I just shipped Farpoint on PSVR as the VFX artist so I have some experience with it :smiley:

Yeah, so sprites work well until they get within about 5–10 meters of the camera. You definitely need to use camera fade to alpha them out as they get closer.

Always use soft particles for sprites when they clip into surfaces; otherwise you will see the hard line of the sprite intersection.
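If you’re wiring those two fades up by hand rather than using the engine’s built-in modules, a minimal Unity-style HLSL sketch might look something like this (the _NearFadeStart, _NearFadeEnd and _SoftDistance parameters are placeholder names, and it assumes UnityCG.cginc is included):

```hlsl
// Minimal sketch: camera fade + soft particles in one fragment helper.
// _NearFadeStart/_NearFadeEnd/_SoftDistance are made-up material
// parameters, not built-ins.
sampler2D _CameraDepthTexture; // scene depth, provided by Unity
float _NearFadeStart;          // e.g. 1.0: fully invisible this close
float _NearFadeEnd;            // e.g. 5.0: fully visible past here
float _SoftDistance;           // soft-particle fade range, e.g. 0.5

float ComputeParticleFade(float3 worldPos, float4 screenPos, float particleEyeDepth)
{
    // Camera fade: alpha the sprite out as it approaches the HMD.
    float camDist  = distance(_WorldSpaceCameraPos, worldPos);
    float nearFade = saturate((camDist - _NearFadeStart) / (_NearFadeEnd - _NearFadeStart));

    // Soft particles: fade where the sprite clips into scene geometry.
    float sceneDepth = LinearEyeDepth(tex2Dproj(_CameraDepthTexture, UNITY_PROJ_COORD(screenPos)).r);
    float softFade   = saturate((sceneDepth - particleEyeDepth) / _SoftDistance);

    return nearFade * softFade; // multiply into the particle's alpha
}
```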

I used meshes whenever I could to help add depth to the effect. For example, when a spider is killed I used a liquid-shaped static mesh along with some sprites to add volume. Or for an explosion, I had a 6-plane mesh that I used the main explosion flipbook on, and it worked pretty well to add some depth.

Another useful practice was adding as many small GPU particles as I could. They feel great in VR and help add depth to the effect. These can be sparks, blood drops, small rocks, snow, etc. It feels noisy on a monitor but for some reason it feels awesome in VR.

And as Partikel says, the camera-facing rotation of sprites can look really bad in VR. In Unreal, the best setup I found to solve this was to use PSA Facing Camera Position and check on Remove HMD Roll, which they just added.
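If you’re in Unity and don’t have that checkbox, the same idea can be hand-rolled in the vertex shader: face the camera position, and build the up axis from world up instead of camera up. A rough sketch, assuming a custom vertex layout that carries the particle center and a 2D corner offset (not Unity’s stock particle vertex data):

```hlsl
// Minimal sketch of a roll-free billboard in a vertex shader.
float3 BillboardNoRoll(float3 center, float2 corner)
{
    // Face the camera *position*, not the camera plane.
    float3 toCam = normalize(_WorldSpaceCameraPos - center);
    // Build "up" from world up instead of camera up, so HMD roll no
    // longer spins the sprite. This degenerates when looking straight
    // up or down, so clamp or branch there in practice.
    float3 right = normalize(cross(float3(0, 1, 0), toCam));
    float3 up    = cross(toCam, right);
    return center + right * corner.x + up * corner.y;
}
```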

PSVR had a ton of hardware limitations, so I had to be very careful with overdraw, memory usage, render targets (refraction), CPU tick on particle systems, etc. Performance is a huge problem in VR, as you can imagine. The number one thing I can recommend is LODing your particle systems, so when they go off near you, you reduce the hell out of the overdraw. Overdraw will destroy your framerate in VR. I usually had 4 LODs per explosion:

  • Nearest: turn off any large sprites.
  • Second: use a mesh for the main sprite to add depth.
  • Third: mostly sprites, because it’s further away.
  • Fourth: very far away, get rid of small stuff like sparks.

This thought process was applied to every particle emitter in the game.

This is just a quick rundown but feel free to ask me any questions.


Wow, thank you, that’s a ton of good tips! :slight_smile: Great!

Seconded, thanks :slight_smile:
Nice tip on the 4-LOD thing :slight_smile:

That’s fantastic! Rolling billboard particles have been a huge peeve of mine for a while now, and I’m curious exactly what they did. I’ve been rolling (pun not intended) my own anti-roll code for Unity for the last few projects. For Dino I’m using almost entirely mesh-based particle effects, but the style of VFX for that game allows for that. Sorting transparent meshes is a huge pain for VR, and I did some experimentation with stochastic noise-based transparency, but it’s just too noisy without TAA in most situations. I combined it with alpha to coverage and use it for cases where I need to be able to fade out objects quickly without an obvious change in lighting.
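For reference, the alpha-to-coverage half of that combo is mostly a render-state flag plus a dithered alpha. A rough Unity-style sketch (AlphaToMask On in the ShaderLab pass, MSAA enabled; _MainTex, _Fade and the dither constants are placeholders, not my actual production setup):

```hlsl
// Hedged sketch of an alpha-to-coverage fade. The hardware quantizes
// alpha into a few MSAA coverage levels, so a small per-pixel dither
// hides the banding.
sampler2D _MainTex;
float _Fade; // 0 = fully faded out, 1 = fully visible

float4 frag(float4 svPos : SV_Position, float2 uv : TEXCOORD0) : SV_Target
{
    float4 col = tex2D(_MainTex, uv);

    // 4x4 ordered-dither offset derived from the pixel coordinate.
    float2 p = fmod(floor(svPos.xy), 4.0);
    float dither = ((p.x * 4.0 + p.y + 0.5) / 16.0) - 0.5;

    // Coverage alpha: MSAA turns this into a sample mask, so the object
    // fades out without switching to sorted alpha blending, and the
    // lighting stays opaque-looking.
    col.a = saturate(col.a * _Fade + dither * 0.25);
    return col;
}
```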

Are those 6 planes parallel, or intersecting, or spread out like random particles? Seems like a killer for ROPs / overdraw; a little offset mapping could do the same for cheaper if they’re parallel planes, especially on the PS4. I’ve yet to be able to fully saturate the PS4’s pixel shading capability, but I do hit bandwidth, VGPR, and general ROP limitations. It’s mainly because I’m terrible at understanding how to reduce VGPR usage though.

A technique I like using for billboards is pushing the plane towards the viewer by the particle radius (in clip space, so the particle doesn’t get bigger on screen!) and adjusting the soft-particle depth reference based on either a sphere-ish shape (smoothstep the squared UV distance from center) or the alpha opacity as height. It works well outside VR too, as a general way to move billboard particles off the “center” of the effect and give them more volume. I didn’t get to do this as much as I wanted in Wayward Sky or Dino Frontier, since Dino is still stuck in Unity 5.4: the game crashes constantly in 5.5+, even after Unity spent the time to fix several bugs we exposed.
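Roughly, it looks something like this in Unity-style HLSL (function and parameter names simplified for the sketch):

```hlsl
// Vertex side: pull the billboard's *depth* toward the camera by one
// particle radius without changing its screen-space position or size.
float4 PushTowardViewer(float3 worldPos, float radius)
{
    float4 clipPos = mul(UNITY_MATRIX_VP, float4(worldPos, 1.0));

    // The same point moved one radius toward the camera.
    float3 toCam  = normalize(_WorldSpaceCameraPos - worldPos);
    float4 pushed = mul(UNITY_MATRIX_VP, float4(worldPos + toCam * radius, 1.0));

    // Keep x/y/w (screen position and size), take only the pushed depth,
    // so the sprite sorts and soft-fades nearer without growing on screen.
    clipPos.z = (pushed.z / pushed.w) * clipPos.w;
    return clipPos;
}

// Fragment side: sphere-ish soft-particle reference (uv in 0..1).
// d2 is 0 at the sprite center and 1 at the edges.
// float d2    = dot(uv - 0.5, uv - 0.5) * 4.0;
// float bulge = smoothstep(1.0, 0.0, d2) * radius;
// ...then compare scene depth against (particleEyeDepth - bulge)
// instead of the flat quad depth in the soft-particle fade.
```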

Generally I do stick to meshes when possible for anything that’s not very small or very soft. For Wayward Sky I did a lot more effects similar to Rime (which would look great in VR) with a lot of alpha to coverage erosion rather than alpha test, and a lot of fresnel fading spheres with cloudy noise alphas. Dino Frontier is almost entirely mesh based for all but the smallest of elements, but almost all of the effects are flatly colored tetrahedron or octahedron mesh particles.

By 6-plane mesh I meant 4 planes? Lol. Anyway, it looks like this.

Basically I just used fresnel to control the opacity of the planes when they start to align with the view, so it never looks flat. Because this mesh is not camera-facing, you can walk around it and it feels kinda 3D. Since you rarely got above an explosion, this technique worked quite well. I did not get a lot of time to prove out new techniques for VR, but now that Farpoint is out I will be hitting R&D really hard in the next few months. I can see more use for meshes in stylized games, but when trying to stay more realistic, the meshes get more complex. It is challenging to find ways to utilize them and stay within budget.
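The shader side of it is tiny; something like this (the interpolator names i.worldPos / i.worldNormal are placeholders for whatever your vertex shader passes down):

```hlsl
// Rough sketch of the fresnel/opacity trick: fade each intersecting
// plane out as it turns edge-on to the viewer.
float3 viewDir = normalize(_WorldSpaceCameraPos - i.worldPos);
float  facing  = abs(dot(viewDir, normalize(i.worldNormal)));
// Invisible when edge-on, fully visible when face-on; tune the
// smoothstep band to taste.
col.a *= smoothstep(0.1, 0.4, facing);
```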

I definitely want to look into more offset stuff; it works pretty well in VR. Unreal calls it bump offset. You’re able to use a sphere shape for your offset? How many samples did you need?


Ah yes, the fin-style mesh. Popular for rendering beams and trees for ages. I still wonder if a shader-based solution might be cheaper in some cases; plus you could ensure correct sorting between the planes if you aren’t doing additive blending on the explosions.

Just for simplicity I would probably do it as two mesh planes with a vertex shader set up to rotate them to two fixed world angles to blend between, rather than using 4 quads and fading out by fresnel; that should reduce overdraw without increasing complexity significantly. Probably cheaper than even the fresnel.

I’ve thought about doing an explosion effect for VR by rendering out an explosion from 6 or 8 rotations and blending between them using a height map offset or velocity buffer-like setup. It should be possible to synthesize plausible inbetween frames with some appearance of depth. I’ve done something similar in the past using cubemaps and relief mapping for impostor rendering.

For something like an explosion it should be doable with a simpler blend and offset without looking bad, and way cheaper than relief mapping, and cheaper than tracing through a 3D texture.


To answer this part directly, the basic bump / parallax offset is super cheap, but most implementations don’t like a ton of offsetting because the effect breaks down quickly on glancing angles or “hard edges”. This can work out in favor of particles since they’re usually camera facing and soft shapes. There’s no concept of samples here since it’s not actually doing any tracing, just lazily offsetting the UVs based on a single height map sample and the view angle. Basically, it doesn’t deal with occlusion. Something like relief mapping or parallax occlusion mapping would be more accurate, but way more expensive, as those do need multiple samples of the heightmap. However, those techniques often have substantial aliasing artifacts unless you use a lot of samples, which looks ugly in VR, though it might be fine with TAA. They are also designed around hard surfaces, not soft shapes.
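The whole technique really is just a couple of lines; a sketch (viewDirTS is the tangent-space view direction from the vertex shader, and _HeightMap / _ParallaxScale are placeholder property names):

```hlsl
// Basic bump/parallax offset: one height sample, no occlusion, no loop.
float  height   = tex2D(_HeightMap, uv).r;  // single 0..1 height sample
float2 parallax = (viewDirTS.xy / viewDirTS.z) * (height - 0.5) * _ParallaxScale;
float4 col      = tex2D(_MainTex, uv + parallax); // "smudged" offset lookup
```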

Basically the precision isn’t needed or really wanted, so the “smudge” of basic bump offsetting is all that’s needed … assuming it works. For the stylized, soft effects I’ve been doing I haven’t needed this technique, and what testing I did didn’t show a significant difference for that style, so I haven’t actually used it in production.


A technique I like using for billboards is pushing the plane towards the viewer by the particle radius (in clip space so the particle doesn’t get bigger on screen!)

Love it! I’ve done similar things (for non-VR) with a separate depth map to fudge the Z clip. This allowed me to make things appear gradually through smoke even though it was a single quad. I haven’t had time to play with this for a while though…

I’m realizing a limitation of working as a by-the-hour freelancer on projects like this. I’m not able to noodle away at setting up custom solutions for VR, and I don’t have much weight behind me to get a coder to do it. So I’m pretty much stuck with what ships with Unity. Unfortunate. I’m looking forward to all of this being standardized and efficiently implemented.

Also, +1 on adding floating embers on every effect :stuck_out_tongue: Feels so good to move through.


Awesome thanks for the tips! I’m working on a master explosion right now with all the elements I would ever need in the future. I will definitely try some of these techniques, and maybe Sony will let me share the result!

That would be awesome :slight_smile: It’s always great when companies allow sharing knowledge, and luckily we work in a field where this happens a lot :slight_smile:


One thing to keep in mind if you work with PSVR and the game supports both PS4 and PS4 Pro: the PS4 Pro can render at higher resolutions, but its fill rate isn’t much better than the regular PS4’s, which means overdraw might cause framerate issues on the PS4 Pro while being fine on the regular PS4.

TL;DR: the hardware difference between PS4 and PS4 Pro is not a uniform performance upscale, and the PS4 Pro might actually make the VFX performance worse.


That’s a very good hint, thanks for this insight. I was expecting a linear performance boost :smiley: Well, it never gets boring with these computers, right? They always have something to throw between your legs :smiley:

So I worked together with the senior artist on the PSVR launch title RIGS, and it was quite the task to make sure everything was optimized not only for VR, but for multiplayer as well. We had to make sure that a maximum of 6 players firing the same effect at each other held up as the worst-case scenario (and the worst case was usually what we tested). Some tips and tricks we used:

  • Creating one shader where I would normally make two or even three in, let’s say, Unreal (we used Sony’s in-house tools and Maya for shaders), but having a vertex color switch between the elements (see the sketch after this list). Overdraw was always on the table, so we had to think of that.

  • As @AlexB mentions, have a lot of LODs; especially up close we’d fade out as much as we could, so that almost no screen space is taken.

  • When using sprites for something like smoke, we’d have one facing the camera and one flat on the ground to give a sense of depth. We did tests with 3D explosions and complex vertex shaders, but there wasn’t enough time to properly prove them out (especially because Maya could be quite difficult to work with sometimes).

  • Because emissive can cause problems in VR (the worst being epileptic seizures), we’d keep it at very low levels and add big glows at the beginning of explosions to simulate the flash.
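A rough sketch of that vertex-color switch from the first bullet (the textures, tints and channel choice are invented for illustration, not the actual RIGS setup):

```hlsl
// Hypothetical vertex-color switch: two sub-effects (say smoke and
// embers) packed into one shader, selected by the red vertex channel
// authored on the particles.
float4 frag(float2 uv : TEXCOORD0, float4 vcol : COLOR) : SV_Target
{
    float4 smoke  = tex2D(_SmokeTex, uv) * _SmokeTint;
    float4 embers = tex2D(_EmberTex, uv) * _EmberTint;
    // vcol.r = 0 picks smoke, 1 picks embers: one shader and one
    // material where you'd otherwise juggle two or three.
    return lerp(smoke, embers, vcol.r);
}
```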

These are some I remember at the moment; all credit goes to the senior artist Oli McDonald, I was pretty much copying his work style :slight_smile: We had some cool stuff in store for upcoming patches; sad that the studio shut down.


Both the PS4 and PS4 Pro have 32 ROPs (render output units), though the Pro runs them at a slightly higher clock rate. The result is that the PS4 Pro is only about 15% faster than the PS4 when it comes to overdraw / pure resolution increases! They did double the shading cores, texture units, and compute performance, which basically means you can do much more complicated shader and compute work … but it’s also why they put so much work into complicated shader / compute techniques to approximate 4K from non-4K resolutions, like checkerboard rendering, temporal interlacing, or abusing MSAA coverage to “upres” content more accurately.
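(Back-of-envelope, assuming the commonly cited GPU clocks of roughly 800 MHz for the PS4 and 911 MHz for the Pro: 32 ROPs × 0.800 GHz ≈ 25.6 Gpixels/s versus 32 × 0.911 ≈ 29.2 Gpixels/s, i.e. only about a 14% fill-rate bump.)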

For Dino Frontier, on the Pro we just increase the resolution scale by <15%; any more and we start losing frames again.

Curiously, the Xbox One X (ugh) only has 32 ROPs too, but they’re clocked about 30% faster than the PS4 Pro’s. That’s double the ROPs of the Xbox One or Xbox One S, with a significantly higher clock rate than either, but it’s still not 4x the performance of an Xbox One. They’re making big claims about being able to run games at 4K native, but it won’t be true for all games. It’s getting to the point where doing a significant amount of rendering purely in compute or shaders instead of using geometry makes a lot more sense.


Anyone have any resources for VFX in VR in Unity specifically?

I have a team of programming students I am working with and they are tasked with making a VR game (in Unity) that makes use of an external device that acts as a snow/hoverboard.

While I can say I know a little / decent bit about VFX in general… that amount becomes less and less when it’s geared toward VR (which this thread covers some of) and done in Unity (I am more from the Unreal side).

Not sure if it helps, but the Unity forums have a whole area just for VR: https://forum.unity3d.com/forums/virtual-reality.80/
