Realtime VFX Dictionary Project

Discussion about VFX Outsourcing:

Hey all,

Curious if anyone has experience with outsourcing FX work, since more and more of game development is going this way, and at least in our studio, I think VFX is the only discipline that is not outsourcing anything.
Outsourcing actual particle effect work seems to be more headache than it's worth, so I was thinking about perhaps outsourcing textures, or animated texture work to be more precise; that would offload at least some work from our tight schedule. If anyone has any experience with such, it would be interesting to hear. - Nadab Göksu

We recently started to go this route. It's a little different, though, since the guys doing the work are 100 miles from the studio. They drove down for a few days to train and have been doing some work since. It's working out ok… but I'm spending more time answering questions over email and whatnot than I would like to be. - Jeremy Kendall

We outsourced some texture generation and emitter behaviors for NFS: The Run. Decent results, but we had to set them up with and support our entire pipeline and runtime remotely. I was and still am not a fan due to the complexity of describing the dynamic nature of effects - it takes more time than just doing the work in house. I tend to make lots of noises and use my hands and props like matchbox cars to describe and prototype with art direction and gameplay designers… so we had outsourcing focus on ambient and distant environmental effects so we could iterate in house on gameplay and cinematics effects. - Andy Klein

I’m not a fan of this either. But some things are out of my control. - Jeremy Kendall

We started using FXVille for Injustice at my last job. I didn't interact with them, but my lead seemed to be happy overall with their performance and results. I think a couple of factors led to using outsourcers as a viable solution (and these are completely my own opinions for anyone on here from FXVille, not the words of Gilmore or any management):

  1. They handled background elements in the levels, which typically involved destruction and debris for transitions. This needed very little art direction from any lead.

  2. Since the IT department has a good VPN setup, they are able to easily run our version of UDK and check out packages from Perforce (all done via web browser).

  3. FXVille already had lots of experience not only with Cascade, but also with Midway; as I understand it, some of them are ex-Midway people (I may be wrong on that one).

  4. It can get tricky when it comes to outsourcers checking out files from other departments (mostly level). You have to make sure that your outsourcers are not hogging files overnight and that they are always readily available to check things in.

I can also offer some other personal experience from the past year, as I've really been pushing myself as a freelancer for side work. I think using a proprietary engine would make outsourcing a nightmare. My successful freelance clients all use UDK, and it's as easy as downloading a specific release for compatibility. Some clients get picky about what you're allowed to see, which also makes things very difficult on the FX side. When people ask for things like cigar smoke or bullet impacts, I think everyone here can agree that seeing your effect running in the game is a critical step in the process of making everything fit as well as possible.

I also can see where Jeremy is coming from, getting new people to learn the ropes over email can be very time consuming.

Hope this all helps! - Bill Kladis

Thanks Bill Kladis. What else did they do for Injustice? When you say destruction, does this mean simulating the animation and doing the particle FX as well? How well can they use offline particle systems? I visited their website but they don't show or tell anything there. - Nadab Göksu

So none of the destruction was ever baked out as a simulation; everything is just particle systems triggered through Matinee & Kismet events. So lots of smoke and debris meshes and such.

Could you define what you mean by "offline particle systems"? Since they could download the entire game content folder and use our special version of UDK, they could preview any level. They also have 360/PS3 test kits so they can run the game as well for testing. Let me know if that answers your question. - Bill Kladis

I should have written fluids, sorry for confusing you; just curious to know what type of applications they master besides UDK. - Nadab Göksu

The company I am working for from home came to me about doing their FX. It's been a really good relationship between us. The biggest problem is not being able to test my work in their version of Unity. It's something to do with the way their server is integrated with their Unity, I was told.

For example, all the muzzle flashes (remember me asking the muzzle question ^^) had to be tested by me pressing the play button: on, off, on, off, etc. Setting the particle system to loop did not help, since it would depend on the gun and its fire rate. So they have been tweaking things on their end. - Lee Amarakoon

====================================================================================

Discussion about looping footage:

I'm trying to get some pre-rendered footage looping within After Effects. I'm duplicating the footage, splitting it in half, and overlaying the first half with the last half, adjusting the opacity from 100 to 0 for the last half, but it still doesn't loop perfectly; the first frame is constantly causing a pop-in. Any tips or tricks you guys use that would help me out? - Gareth Richards

It's a little hard to say without knowing the source material, but I'll assume it's not suited for ping-ponging. What I generally do is duplicate the entire clip and offset the copy by 50% in both directions. I guess I'd convey it textually like this:
original track <f0----|---- f100>
copy track |f50–><-- f49|
and then fade both out/in at their respective frames f0 and f100 (or whatever your last frame is). - Creath Carter
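
For anyone who wants to see the offset-and-crossfade idea concretely, here is a minimal Python sketch of it, done with NumPy and Pillow instead of After Effects. The file pattern and frame count are hypothetical; it blends the clip with a copy offset by half its length, so the first and last source frames never show at full opacity and the result is a seamless half-length loop.

```python
# Minimal sketch of the 50%-offset crossfade loop (hypothetical file names).
import numpy as np
from PIL import Image

def load_frames(pattern, count):
    return [np.asarray(Image.open(pattern % i), dtype=np.float32)
            for i in range(count)]

def crossfade_loop(frames):
    """Blend the clip with a copy of itself offset by 50%."""
    n = len(frames)
    half = n // 2
    out = []
    for i in range(half):
        t = i / float(half)           # crossfade weight, 0 -> ~1
        tail = frames[half + i]       # second half of the clip, fading out
        head = frames[i]              # first half of the clip, fading in
        out.append(((1.0 - t) * tail + t * head).astype(np.uint8))
    return out                        # seamless loop of n/2 frames

frames = load_frames("fire_%04d.png", 60)
for i, frame in enumerate(crossfade_loop(frames)):
    Image.fromarray(frame).save("fire_loop_%04d.png" % i)
```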

Creath, it's just fire footage that's been rendered out from Fume, 60 .png frames imported as footage. The method you describe is what I'm currently doing, so I suppose it's good to know I'm using the correct method; it just isn't doing as it should, lol! - Gareth Richards

Yeah, that's strange, but I can say that I've used that exact method with prerendered flame effects before (coming from Maya fluids). The key I found was to make certain that neither the start nor the end frames ever displayed at 100% (ideally not at all, as they should be faded completely into the offset footage by that point).
The only drawback with the method I posted above is that you end up needing about 2x the amount of footage, because blending the entire thing over itself cuts down the noticeably looped frames by half.
Come to think of it, the last time I did that it was by hand, frame by frame in Photoshop. Good luck! - Creath Carter

Frame… by… frame? Ouch! I usually use 3ds Max's Video Post for it since it's stupid easy to set up and fast to render out (mostly because I haven't had After Effects on hand until recently) - Peter Åsberg

After Effects doesn't always play back seamlessly in loop mode; it can hiccup when it goes from your last frame to the first. It might be that your loop works, but you need to render it out and check it in-game or using a simpler video player like fcheck or the RAM Player in Max. - Matt Vitalone

Here's a little trick that I like to use for looping fire footage: rather than just fading the whole clip's opacity over the loop (which can leave some sections seeming to fade), you can animate a mask. Copy your clip, position it so it overlaps a reasonable amount, then mask it. Put a few points over the width of the flames and animate them sweeping up, trying to match the speed of the flames as they rise. Don't animate the points you added evenly, but sort of sweep them in an S shape, and feather the mask edge a reasonable amount. If you've matched the speed of the fire really well, it should look like the flames are rising normally, and you get a minimal switch over the loop - Oliver McDonald

Is the hitch in game use or in the video player? I haven’t found a video player that doesn’t hiccup on start. To test it, you can duplicate your clip a couple times and render that. If the clip is clean when you chain it like that…it should be good to go. - William Mauritzen

Looks like it was an issue with After Effects, as I rendered it out and it works better (needs more tweaking)! Cheers for the help, all; good to learn there are some new techniques available - Gareth Richards

I learned a long time back that it’s kind of against the natural feel of things like fire to make them loop. Always faster/easier to just work with spawning multiple sprites that live varied lifetimes at varied trajectories/scales and fade out, then spawn new sprites. - Andy Klein

I did the smoke grenade for MW3 via a rendered FumeFX sequence, then brought it into After Effects. For the 64-frame sequence, I rendered out some extra frames (90ish total). I duped the layer, pulled frame 65 of the above layer back in time to frame 1 at 100% opacity, and animated it fading out over 9 or so frames. Worked pretty well. If I were to do it again, I'd add in Oliver McDonald's suggestion and give it an animated matte growing radially from the center, rather than just a crossfade.

A helpful trick in After Effects, if you need to loop the alpha and RGB separately but similarly, after you have done a bunch of cleanup to one: first, start off with two raw comps, one with just the matte and one with just the RGB; call them "base_comp_rgb" and "base_comp_a". In a new comp, called "timeloop_a", pull the alpha base comp in and do whatever you've got to do to loop it. For example: crop the active time range to a usable chunk, double the layer, time offset 50% so that the break point is in the middle, then crossfade one back into the other with its tail. For wonky areas: add another layer, time offset, mask an area, feather, etc.

Here's the trick: duplicate the "timeloop_a" composition and name it "timeloop_rgb", then select all the layers in it (which should be a bunch of "base_comp_a"). In the project view, alt-click/drag "base_comp_rgb" onto one of those "base_comp_a" layers. The reference will swap, and all of the edits that you did to the alpha are now replicated for the RGB. Tip o' the day - David Johnson

====================================================================================

Discussion about lighting clouds/smoke:

How do you guys generate lighting maps for clouds? I was going to try just turning these greyscale images into normal maps, but I'm afraid that's going to look a bit wonky. -William Mauritzen

Try searching the web for "rgb light rig". I found a script for 3ds Max that works pretty well (it's not 100% accurate, but you get nice normal maps) -Francisco García-Obledo Ordóñez

That sounds great. So here's what I worked out for synthetic but quite nice (though slightly expensive) fog: bump-offset, normal-mapped, subsurface-scattered lit translucencies (how's that for a set of buzzwords?!). The transmission mask is super blurred. I'll try to get an image okayed for release. Panning with a coordinate offset really makes lumps of fog look like they are moving around stuff… like a flow map but without an actual flow. -William Mauritzen

I've found that very soft smoke or mist just needs a good spherical normal, and that does 90% of the work. Basically what you see below. I made a radial gradient in Photoshop going from white in the center to black towards the edges, then took that into CrazyBump as a heightmap. http://i.imgur.com/mUL7STh.jpg
Not sure what engine you're using, but if you DOT this against your lighting information, you get smooth shading across your particles.
It's helpful to look at the color channels individually inside of Photoshop, and it'll start to make more sense. Red is light coming from X, Green = Y, and Blue = Z/straight down. I explain this with some examples in the 1st chapter of the displacement tutorial on my site (it's free). So you don't have to dig around, it's this part: http://youtu.be/OwTqVrAQLuk?t=3m7s - Bill Kladis
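
A sphere-style normal like the one Bill describes can also be generated directly, without round-tripping a gradient through CrazyBump. A minimal NumPy sketch, assuming the usual packing of XYZ into RGB with Z pointing at the viewer:

```python
# Generate a hemisphere "spherical normal" map for soft particle shading.
# Red = X, Green = Y, Blue = Z (toward the camera), packed into 0-255.
import numpy as np
from PIL import Image

size = 256
ys, xs = np.mgrid[0:size, 0:size]
x = (xs / (size - 1)) * 2.0 - 1.0         # remap to [-1, 1]
y = 1.0 - (ys / (size - 1)) * 2.0         # flip so green = up
r2 = x * x + y * y
inside = r2 <= 1.0                        # the disc the sprite occupies
z = np.sqrt(np.clip(1.0 - r2, 0.0, 1.0))  # hemisphere bulging toward viewer

# Pack [-1, 1] into [0, 1]; outside the disc, point straight at the camera.
nx = np.where(inside, x, 0.0) * 0.5 + 0.5
ny = np.where(inside, y, 0.0) * 0.5 + 0.5
nz = np.where(inside, z, 1.0) * 0.5 + 0.5
normal = (np.dstack([nx, ny, nz]) * 255).astype(np.uint8)
Image.fromarray(normal).save("sphere_normal.png")
```

In the shader you would unpack this back to [-1, 1] and DOT it against the light direction, exactly as Bill describes.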

What engine do you use? See if there is any support for "bent normals"; you might not need a normal map at all. - Edwin Bakker

I'm doing some skybox effects as well, and I could never quite get good lighting on some particle sprite clouds; maybe I messed up the math with the normals. But then I tried another solution that actually gave me some really great results. It's a bit of a hack, but you get software like Terragen, which can do some pretty realistic clouds, then you render out the same cloud in several passes: one with sunset/sunrise (light from below), one with the sun behind the cloud, one with light above the cloud, and of course an alpha pass. Then you place them all in separate channels of the same texture and blend between them using your light information. Works really well. -Jose Teixeira
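
A minimal sketch of the shader-side blend for this channel-packing trick, written as CPU-side Python. The three passes (lit from below / backlit / lit from above) are assumed to sit in R, G, and B, and the tent-style weighting over sun elevation is an assumption, not Jose's exact math:

```python
import numpy as np

def shade_cloud(packed_rgb, sun_elevation):
    """packed_rgb: HxWx3 array in [0, 1]; R = lit-from-below pass,
    G = backlit pass, B = lit-from-above pass.
    sun_elevation: -1 (sun below horizon) .. +1 (sun at zenith)."""
    w_below = np.clip(-sun_elevation, 0.0, 1.0)
    w_above = np.clip(sun_elevation, 0.0, 1.0)
    w_back = 1.0 - w_below - w_above   # peaks when the sun sits on the horizon
    return (packed_rgb[..., 0] * w_below +
            packed_rgb[..., 1] * w_back +
            packed_rgb[..., 2] * w_above)
```

The same weights work per pixel in a particle shader; the alpha pass rides along unchanged.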

Yeah that’s a great trick Jose! I’ve done something similar with clouds rendered out of Houdini - just having several lights from a few directions and then placing those renders in the red/green/blue channels of an image. -Matt Vainio

Thanks guys! I'm now pretty much re-doing the entire skybox and weather system on The Witcher 3, and of course the art lead wants to have this gorgeous, constantly changing sky all the time, and gives me these impossible references; it's insane! I have spent months trying all kinds of tricks, everything I could think of, and this channels thing is the closest I've gotten to some good results. If I miraculously pull this shit off, I already told them I wanna make a GDC presentation on it next year! - Jose Teixeira

I’m trying to get this cloud normal to shade right but I’m dyslexic with my transforms. The particles are rotating and the normals seem to be rotating at a right angle. Because the normal is basically a texture, I thought it needed to be transformed from tangent to world…just like the “fake light” mesh shader. That doesn’t seem to be working. Green is up, red is right. My head always spins when I try to get my transforms aligned. https://www.youtube.com/watch?v=NvVwBIRXaKw -William Mauritzen

You can create quasi-normal maps from fluids/volumetrics with a simple light rig. Light is additive, so 4 lights at 0.25 intensity at 45-degree angles from each other per axis will create an intensity of 1 pointing in the correct direction. However, the effects are nominal; CrazyBump's honestly looks about the same when it's lit. It's at the end of the video: Vimeo -Matt Radford
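
For the offline variant, one way to assemble lit renders into a quasi-normal map is to difference opposing light directions per axis. A minimal sketch, assuming five hypothetical renders of the volume lit from +X/-X, +Y/-Y, and from the camera (+Z):

```python
# Assemble a light-rig style normal map from offline lit renders.
import numpy as np
from PIL import Image

def load(name):
    return np.asarray(Image.open(name).convert("L"), dtype=np.float32) / 255.0

px, nx = load("lit_posx.png"), load("lit_negx.png")
py, ny = load("lit_posy.png"), load("lit_negy.png")
pz = load("lit_posz.png")

# The difference of opposing lit renders approximates each normal component.
n = np.dstack([px - nx, py - ny, pz])
n /= np.maximum(np.linalg.norm(n, axis=2, keepdims=True), 1e-5)
packed = ((n * 0.5 + 0.5) * 255).astype(np.uint8)   # pack [-1,1] -> [0,255]
Image.fromarray(packed).save("cloud_normal.png")
```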

So, the "rendering a normal map for clouds" approach failed exactly as Jordan Walker suggested it might. The weird thing is that from carefully composed angles I was able to fool myself into thinking it was doing a good job. The range of functional angles was about 120 degrees. Additionally, the normal maps seem to break down as soon as I set the particles rotating. -William Mauritzen

You can render some clouds with AfterBurn or Fume and light them from the cardinal directions (up, down, left, right, front, back) offline, then lerp between the images in the shader. I have used this technique for other things (a blurry lit version of the player's face reflected on the inside of a helmet; you can't blur a dot product), but it will probably work for clouds too, or anything with complex light scattering properties. -Michael Voeller
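
A minimal sketch of that cardinal-direction lerp, assuming six luminance renders ordered to match the axis list; the clamped-dot weighting is an assumption about how the blend would be driven:

```python
import numpy as np

# Light directions the six offline renders were lit from.
AXES = np.array([[1, 0, 0], [-1, 0, 0],
                 [0, 1, 0], [0, -1, 0],
                 [0, 0, 1], [0, 0, -1]], dtype=np.float32)

def blend_lit_renders(renders, light_dir):
    """renders: six HxW luminance arrays, ordered to match AXES.
    light_dir: normalized 3-vector pointing toward the light."""
    w = np.clip(AXES @ light_dir, 0.0, None)   # clamped dot per direction
    w /= w.sum()                               # weights sum to 1
    return sum(wi * r for wi, r in zip(w, renders))
```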

That diffuse wrap trick Matt suggested is working really well at softening things up. My main issues right now are a lack of animation in the normal map and incorrect edge transparency due to vertex offset. -William Mauritzen

====================================================================================

Motion Vector Blending

Hey everyone,
I've been encouraged to do a little write-up of my Unreal 4 implementation of the frame blending technique inspired by Guerrilla Games.
Here’s an example of it in action:
http://www.klemenlozar.com/wp-conten...VExample_1.mp4
Full breakdown:

Hope you find this useful, if anything isn’t clear enough please let me know, thanks!

-Klemen Lozar
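
Klemen's breakdown is a UE4 material, but the core idea can be sketched on the CPU: warp the current flipbook frame forward along its motion vectors, warp the next frame backward, and crossfade the two. This sketch assumes grayscale frames, vectors encoded in RG around 0.5, and a hypothetical strength scale:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp(img, flow, t):
    """Shift img along flow * t (flow is HxWx2, in pixels), bilinear."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    coords = [ys - flow[..., 1] * t, xs - flow[..., 0] * t]
    return map_coordinates(img, coords, order=1, mode="nearest")

def motion_vector_blend(frame_a, frame_b, mv_a, mv_b, frac, strength=8.0):
    """frame_a/b: consecutive flipbook frames (HxW, grayscale, float).
    mv_a/b: their motion vector textures, RG channels in [0, 1].
    frac: 0..1 position between the two frames."""
    flow_a = (mv_a - 0.5) * strength       # decode to pixel offsets
    flow_b = (mv_b - 0.5) * strength
    a = warp(frame_a, flow_a, frac)        # push A forward in time
    b = warp(frame_b, flow_b, frac - 1.0)  # pull B backward in time
    return (1.0 - frac) * a + frac * b     # crossfade the warped frames
```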

So we're doing this without an additional motion vector pass. You calculate a crude optical flow of the image by getting a gradient of the pixels of the current frame and the next frame, and multiplying it by the difference between frames. It works really well, and I made it as simple as ticking on a feature, "oflow flipbook". You then have to morph between the two images. We tested this for doing 360-degree pan-around style flipbooks, and we got it working with as few as 16 frames with morphing. There are a few artifacts when the images are discontinuous, but it's sooooo much better than frame blending. One thing we found helped a lot is a control for mipping the vectors (blurring them). Usually I get a lot fewer of those stepping artifacts from using a blurrier version of the texture (in my case I just mip the input texture to get blurrier data). -Matt Radford
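
A CPU-side sketch of the kind of crude, gradient-based flow Matt describes (essentially a one-step "normal flow" solution of the optical flow constraint); the eps and scale constants are assumptions. The resulting field can be fed straight into a warp-and-blend like the one above, and pre-blurring the frames stands in for Matt's vector-mipping trick:

```python
import numpy as np

def crude_flow(frame_a, frame_b, scale=1.0, eps=1e-3):
    """frame_a/b: HxW grayscale flipbook frames in [0, 1]."""
    gy, gx = np.gradient(frame_a)    # spatial gradient of the current frame
    diff = frame_b - frame_a         # temporal difference between frames
    mag2 = gx * gx + gy * gy + eps   # eps avoids dividing by zero in flats
    # Move along the gradient just far enough to explain the difference.
    u = -diff * gx / mag2 * scale
    v = -diff * gy / mag2 * scale
    return np.dstack([u, v])         # HxWx2 flow field, in pixels
```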

Matt Radford: Regarding your first comment - is this an offline process or are you saying you do this in the shader? -Jeremy Mitchell

It's all done in the shader. Luckily at ND it's easy to write custom shaders; there's just a little library now called OFlow that every particle shader in the game can access. I built it so every flipbook can easily have this feature enabled or disabled (it doesn't come completely for free, so if you're not using it, turn it off). -Matt Radford

Impressive! Looks like you removed any mysteries left concerning the mythical cross-blending flowmaps.
No excuse to not use this now!
How far could you scale down the flow map and still get decent results? The only thing that could be added to the shader would be splitting frames between the RG and BA channels to reduce memory a bit more. Very cool! - Edwin Bakker

Thanks everyone! Matt, I am blurring the motion vectors; like you suggested, this helps smooth out the stepping, but if you go too far you start losing fidelity in the motion. There will always be some stepping, at least in my experience, but unless you can really inspect it like in the example I posted, I think it would be mostly unnoticeable during gameplay. - Klemen Lozar

I'm concerned about the amount of space taken up by an uncompressed motion vector map, and what happens if you compress it? Does it give you bad data? I have a super restrictive budget, but I'm going to give this a shot. -Mark Brego

You really do not want to compress distortion/flowmaps; you get better results by just scaling the texture size down instead. - Edwin Bakker

You start getting a lot more errors; you can imagine what compression does to the vectors. I found I could easily take it down to an eighth of the base texture size. -Klemen Lozar

Another thing you guys could try out is to normalize the flowmaps. In other words: pack whatever range they actually use so that they take advantage of the whole histogram, then "de-normalize" them in the shader with a multiply and an offset.
To do the normalization you can create a Python script that goes through the uncompressed raw flowmap, finds the max and min values, then scales and offsets the data so that it uses the whole 0 to 1 range.
This should reduce stepping considerably, as you are packing more data into an 8-bit image. You will most likely be able to use compression at this point as well. -Norman Schaar
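
A minimal sketch of that normalization pass, with hypothetical file names; a real tool might normalize per channel rather than globally:

```python
# Stretch a flowmap's used range over the full 0-1 histogram offline,
# and print the constants the shader needs to undo it per sample.
import numpy as np
from PIL import Image

flow = np.asarray(Image.open("flowmap_raw.png"), dtype=np.float32) / 255.0
lo, hi = float(flow.min()), float(flow.max())
normalized = (flow - lo) / max(hi - lo, 1e-5)   # now spans the full 0-1 range
Image.fromarray((normalized * 255).astype(np.uint8)).save("flowmap_packed.png")

# Store (scale, offset) with the asset; the shader de-normalizes each sample:
#   flow = sample * scale + offset
print("shader constants: scale =", hi - lo, "offset =", lo)
```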

====================================================================================

Impressive collection of reference footage for effects: http://ref-fx.com/

====================================================================================
