Our latest featured post is from Ready At Dawn, who has been doing some cool VR work for Oculus. Some of my favorite gaming memories are in VR games, few though they may be. Anyone here have a lot of fun working on VR? We hear a lot about the technical limitations of framerate, but it’s gotta be a blast experiencing your work as you’re immersed in it.
Here’s a shot from Ready at Dawn’s latest VR game, Lone Echo II:
Currently working on a VR game, and the first thing (the biggest thing, actually) is optimisation. Sometimes it can be a terrible pain: you're in the middle of building something and suddenly have to find another, more performance-friendly way to do it… but that's also the good part, always having to think of another approach.
It has been very informative, as a young VFX artist, to learn the importance of optimisation, profiling, and re-usability in everything you do (shaders, textures), and to always think about massive overdraw! If you have a junior or intern at your company, make them look at their effect on a VR device at 25 FPS, and you can be sure they'll immediately get the idea behind the optimisation process.
Also, one thing that's pretty difficult to work on is how you influence where the player has to look, because they can look all over the scene and miss the important thing, like your beautiful explosion! It's a matter for different disciplines (game design, sound design, and VFX) to work together and be aware of this constraint, but you can already think about how your effect can draw the eye across the scene, as in the screenshot you posted, I assume.
Chronos and Dead and Buried were so fun to work on, that I enjoyed playing them even after I was done working on them. Having the fixed Resident Evil-style camera that locked the player in place and allowed them to look around was a really great element. Our environment artists and level designers used it to great effect to make some awesome views and perspectives. Dead and Buried was just plain fun
Currently working on a VR title. First day of work, I was told I cannot use alpha. Yeah… alpha… let that sink in.
Thank God for Houdini
I think the hardest part is the constant on and off of the headset during the day. I think we all like to get into a groove with some music or a podcast, but it’s hard to do when you need to put the headset on to test often.
No alpha as in no alpha blending, no alpha in textures, or both? Did they say no alpha testing too? Because that's way more expensive than alpha blending. And if you can do additive effects, it makes even less sense, since on modern GPUs (desktop, console, and mobile) alpha blending costs the same as additive.
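For reference, the two blend modes differ only in the destination factor; both read the framebuffer and write one result, which is why the hardware blend unit does the same amount of work for each. A minimal sketch of the per-pixel math in plain Python (just simulating the equations, not any real graphics API):

```python
# Per-pixel math for traditional alpha blending vs additive blending.
# Alpha blend:  SrcAlpha * src + (1 - SrcAlpha) * dst
# Additive:     One * src + One * dst
# Same number of reads, multiplies, and writes per pixel.

def alpha_blend(src, dst, src_alpha):
    return src * src_alpha + dst * (1.0 - src_alpha)

def additive_blend(src, dst):
    return src + dst

# A white source at 50% alpha over a dark background:
print(alpha_blend(1.0, 0.2, 0.5))   # brightens toward the source
print(additive_blend(0.5, 0.2))     # just sums the two
```

The cost difference people remember is from old fixed-function mobile GPUs; on anything current, the blend equation itself is not where the cost lies.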
Been doing VR effects for 5 years now across the full range of devices, from first-gen Gear VR through Daydream, Quest, PSVR, and high-end PC, and I use alpha on all of them.
It could be several tricks, some not related to VFX at all: sound, like a tick-tick-tick before the explosion, or someone saying "OMG, look at that!"
I assume that's not the sexiest way, but at least it's the easiest, or the cheapest…
But it's a team effort to build the environment a certain way, and the level design too.
From personal experience, in one case I worked on an explosion. After a couple of tests, I added a quick light with a huge radius to the timing, in order to warn people that something had just occurred right behind them and they should take a look at what's going on! As people are quite curious and aware of their environment, they usually don't miss it. And the most interesting part of an explosion is usually the big mushroom afterwards, not really the bright flash!
In a more general way, I guess anticipation is the most important part of forcing the look somewhere. For me it was just a light, but you can add some movement across the scene to show a direction, like when you're at the beach and see the sea pull back before a wave crashes on the sand: the movement warns you about the force that's about to follow (I don't really know if my example is clear).
Hope I answered some of your questions, but I guess every project can have its own way to influence the player's view, and you have to be smart to find your own way to do it!
Using sounds is obviously a nice touch, but they belong to the sfx realm.
I worked in VFX for some VR projects, but most of them were archviz and educational, so most of the time the elements that needed those VFX were onscreen.
And yes, certainly, we need to outsmart those technical limitations! VR is evolving pretty fast, and so are we.
No alpha masks at all, so no fading in or out and no alpha tests. It was an engineering decision made in order to have higher-resolution textures. It's a somewhat open-world game on the Quest. I used vertex animation for everything to get around it.
That makes no sense, btw. On Quest you should be using ASTC, where having alpha in the textures has no impact on file size; in fact, you can get better compression ratios than even 2 bpp RGB ETC by using a larger block size. 2 bpp RGB ETC is a 12:1 ratio (24-bit BMP vs ETC), whereas with ASTC you can get the same 2 bpp using an 8x8 block size, which usually still looks miles better than ETC, and you can have alpha (though that does reduce the quality for blocks containing both alpha and color down to similar-to-ETC levels). RGBA 8x8 ASTC has a compression ratio of 16:1. RGBA 12x12 ASTC gets you up to 36:1! 12x12 blocks are a little harder to make useful, but still valid.
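Those ratios all fall out of one property: an ASTC block is always 128 bits, whatever its pixel footprint, so bits per pixel and the compression ratio follow from the block size alone. A quick sanity check of the numbers above in Python (arithmetic only, no real encoder involved):

```python
# ASTC stores exactly 128 bits per block regardless of footprint,
# so bpp = 128 / (block_w * block_h), and the compression ratio is
# uncompressed bpp divided by that.

ASTC_BLOCK_BITS = 128

def astc_bpp(block_w, block_h):
    return ASTC_BLOCK_BITS / (block_w * block_h)

def compression_ratio(uncompressed_bpp, block_w, block_h):
    return uncompressed_bpp / astc_bpp(block_w, block_h)

print(astc_bpp(8, 8))                  # 2.0 bpp, same rate as 2 bpp ETC
print(compression_ratio(32, 8, 8))     # 16.0 : 1 for 32-bit RGBA at 8x8
print(compression_ratio(32, 12, 12))   # 36.0 : 1 for 32-bit RGBA at 12x12
print(24 / 2)                          # 12.0 : 1 for 24-bit RGB vs 2 bpp ETC
```

Note this is storage math only; whether a given block size looks acceptable still depends on the content.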
Even if you were stuck with ETC, you’d usually just resort to channel packing. I.E.: storing the alpha in the blue channel. People do that on PC and console too to avoid DXT5.
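Channel packing here just means parking the alpha mask in a color channel you weren't otherwise using. A minimal sketch in pure Python, with an (r, g, b) tuple standing in for a texel (the channel layout is hypothetical, purely for illustration; in a real shader you'd sample the packed channel, e.g. `tex.b`, instead of `tex.a`):

```python
# Pack a standalone alpha mask into the blue channel of an RGB texture,
# assuming blue is free (say, grayscale intensity lives in R and some
# secondary mask in G). Hypothetical layout, not an engine convention.

def pack_texel(intensity, mask, alpha):
    # R = intensity, G = secondary mask, B = alpha in disguise
    return (intensity, mask, alpha)

def sample_alpha(texel):
    # Shader side: read .b where you'd normally read .a
    return texel[2]

texel = pack_texel(200, 40, 128)
print(sample_alpha(texel))  # 128
```

The trade-off is that block-compressed channels cross-contaminate a little, so the packed alpha won't be as clean as a dedicated alpha channel, but it avoids paying for a second texture or a costlier format.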
Now, if the request was made because they wanted to render at a higher resolution and were looking to reduce fill rate, that makes a little more sense, but then your engineer should have been giving you vertex-count limits too, as tiny polygons will kill performance just as fast as alpha, if not faster. Fill rate is an issue on Quest, but it's a myth that overdraw isn't a problem on mobile; MSAA means it's still a big issue.