Realtime VFX Dictionary Project

Hey! Many moons ago I started a thread on tech-artists.org where I wanted to collect useful realtime vfx knowledge in one place. It started out as a dictionary, but then I started collecting reference dumps, discussions from the Facebook group and so on. I was approached by some of the fine people on this site asking if we should revive this project here in our shiny new forum. I am 100% on board! I’ll start by bringing the old thread over. Then you can fill in what else you think should be in here. I’ll keep updating this post with your suggestions and edits. It might look like an unsorted mess to start with, but with time I think this could turn into something great! We just need to distill all the delicious information found below into something a bit more digestible. Any volunteers? Looking at you @Keith and @Weili.Wonga. Somewhere down the line, perhaps this could be turned into a wiki and/or free book that could help new vfx artists out! wipes tear from eye. Anyway, here goes!

Edit: It seems I hit the word limit. I’ll have to split the post and grab a few of the following posts so I can update down the line.

====================================================================================

Alpha Erosion
Description: Scaling the levels of the alpha texture to give the appearance of the alpha eroding, or being eaten away.
Other names: Black Point Fade, Alpha Dissolve, Alpha Min/Max Scale.
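
Example: a minimal HLSL sketch of the idea (function and parameter names are illustrative, not from any particular engine):
```
// Alpha erosion sketch: raising the alpha's black point over the particle's
// life makes the darkest values drop out first, so the shape appears to erode.
float ErodeAlpha(float alphaSample, float erosion, float softness)
{
    // erosion goes 0 -> 1 over the particle's lifetime; softness keeps a soft edge.
    return saturate((alphaSample - erosion) / max(softness, 0.0001));
}
```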

Bent sprite
Description: A method of improving the lighting on a sprite. Achieved by bending the normals away from center.
Other names: Benticle

Billboard
Description: A billboard is similar to a sprite, but it’s not simulated and is usually constrained to only rotate around the up axis.
Other names: VAP (ViewAlignedPoly)
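
Example: a rough HLSL sketch of the up-axis constraint, assuming the quad is expanded in a vertex shader from a world-space center plus a per-corner offset (all names illustrative):
```
// Cylindrical billboard: faces the camera horizontally but never tilts,
// because the up axis is locked to world up.
float3 BillboardVertexWS(float3 center, float2 corner, float2 size, float3 cameraPos)
{
    float3 up      = float3(0, 1, 0);            // locked rotation axis
    float3 toCam   = cameraPos - center;
    toCam.y        = 0;                          // ignore the vertical component
    float3 forward = normalize(toCam + 1e-5);    // epsilon avoids a zero-length vector
    float3 right   = normalize(cross(up, forward));
    return center + right * (corner.x * size.x) + up * (corner.y * size.y);
}
```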

Camera Parented Effects
Description: A particle system that has been parented under the camera so it follows along with the player.
Other names: Camera proximity Effects, Screen attached effects

Depth Fading Particles
Description: The particle’s pixel shader performs a depth test against the scene and fades the alpha accordingly. This gives the particle a soft look when intersecting with objects.
Other names: Soft particles, Z-Feathered particles, Depthtest particles.
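
Example: a minimal sketch of the depth test and fade, assuming linear scene depth and the particle pixel’s own view depth are available (names illustrative):
```
// Soft particle fade: 0 where the particle touches scene geometry,
// 1 once it is fadeDistance in front of it.
float DepthFade(float sceneDepth, float particleDepth, float fadeDistance)
{
    return saturate((sceneDepth - particleDepth) / fadeDistance);
}
// Typical use: finalAlpha = textureAlpha * DepthFade(sceneDepth, particleDepth, 0.5);
```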

Distance Culling
Description: A way to stop the particles from simulating/rendering after a specified distance.
Other names: Cullradius

Distance Fade
Description: A way of fading the alpha of particles based on their distance from the camera. This could be used to stop particles from clipping the screen.
Other names:

Emitter
Description: The source of particles. In most engines this is what contains all the data that determines how the particles simulate. What data is present at this level varies. Some engines have all of the settings here. Some engines have an effect container above this with additional settings. The effect container usually allows you to combine several emitters to create the effect. Example: An effect container would contain a fire emitter, a smoke emitter and a spark emitter. The spark emitter would contain data like texture, colour, initial velocity and so on.
Other names: Layer

Flipbook
Description: A sequence of textures compiled into one image. The pixel shader on the sprite moves the UVs to different sections of the image, thus displaying different frames of the sequence.
Other names: Animated Texture Atlas, Spritesheet, SubUV Texture.
Common Naming Convention:
texname_atlas = Non-related frames used for random texture selection.
texname_anim = An animation that has a start and end.
texname_loop = An animation that repeats indefinitely.
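
Example: a minimal HLSL sketch of the sub-UV remap (names illustrative; assumes frame 0 is the top-left cell):
```
// Remaps the sprite's 0-1 UVs into one cell of a cols x rows flipbook sheet.
float2 FlipbookUV(float2 uv, float2 gridSize, float frame)
{
    float frameIndex = floor(frame);
    float column     = fmod(frameIndex, gridSize.x);
    float row        = floor(frameIndex / gridSize.x);
    return (uv + float2(column, row)) / gridSize;
}
// Frame blending: sample FlipbookUV(uv, grid, frame) and FlipbookUV(uv, grid, frame + 1),
// then lerp the two samples by frac(frame).
```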

Force
Description: A force affects the velocity of a particle after it’s been born. This can be used to simulate wind, turbulence and attraction.
Other names: Attractor (Basic version of force)
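
Example: a tiny sketch of how a force typically enters the particle update (illustrative only; real systems integrate on the CPU or in a compute pass):
```
// Forces accumulate into acceleration, which integrates into velocity, then position.
void IntegrateParticle(inout float3 velocity, inout float3 position,
                       float3 force, float mass, float dt)
{
    velocity += (force / mass) * dt;   // wind, turbulence and attractors all add in here
    position += velocity * dt;
}
```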

Mesh Particle
Description: A polygonal mesh instanced onto the particles.
Other names: Entity particles

On Screen Effects
Description: An effect that plays on the player’s screen. This could be particle or shader driven. An example would be blood splatter on the camera.
Other names: Camera space effects, Fullscreen effects

Particlesystem
Description: A collection of emitters making up an effect.
Other names: Effect Container

Particle trail
Description: A particle rendering method where a quad strip is constructed from the particle data.
Other names: Trail, Geotrail, Ribbon, Swoosh, Billboardchain

Preroll
Description: A method of running a particle system for a set amount of time “behind the scenes” so when it’s first made visible, it’s already in place.
Other names: Prewarm

Sprite
Description: The most basic tool of a vfx artist. In its simplest form it’s a camera-facing quad with a texture applied to it. There are countless variations on this with different alignment options, but then one could argue that it’s a quad rather than a sprite.
Other names:

UV Distortion
Description: A method of distorting the UV coordinates to generate extra detail in an effect. Can be extended to be driven by a Motion Vector mask to blend smoothly between frames.
Other names: UV Warping
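
Example: a minimal sketch, assuming a scrolling distortion texture whose RG channels hold UV offsets (names illustrative):
```
// Offsets the main texture's UVs by a scrolling distortion/noise texture.
float4 SampleDistorted(Texture2D mainTex, Texture2D distortTex, SamplerState samp,
                       float2 uv, float2 scrollSpeed, float strength, float time)
{
    // Distortion texture stores offsets in 0-1; recenter to -0.5..0.5 before use.
    float2 offset = distortTex.Sample(samp, uv + scrollSpeed * time).rg - 0.5;
    return mainTex.Sample(samp, uv + offset * strength);
}
```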

====================================================================================

Discussion about creating Rain:

Fill rate is your biggest enemy. We had a programmer build a culling system to keep the particle count fixed. Building a single post process would be very challenging. - William Mauritzen

We use a combo of particles, screen effects, and at base, a stateless screen attached, curve animated system of what I’ll call planes (very programmatically created and optimized) to form the largest quantity of raindrops.
I think most of this depends on what you want to do with it. Putting multiple drops in a single card will cut back your particle counts, probably increase your overdraw, but will also allow you to create a heavy storm much more easily than otherwise.
Is there a run-time interactive component like Watch Dogs slow-mo? If so, that changes things completely too, etc etc etc. - Keith Guerette

You can look at the tricks of Seb Lagarde here: http://www.fxguide.com/featured/game...onments-partb/ -Marz Marzu

We don’t use “real” particles for rain but instead have a box around the camera that is filled with drops through shader trickery. It reacts to camera movement so the faster you move the more they streak and pretty much all the settings are editable (size of rain drops, size of camera volume, alpha, brightness, camera-movement-influence and so on). For specific areas where you’re under a roof looking out we place drip effects to simulate water dripping off the edge. It’s not the best way to do it but it works well enough. In the future we want to get better distant rain, maybe using sheets with scrolling textures to get the dense, heavy rain look. Would also be nice to have splashes on the ground but they’ve been too costly in the past (and you don’t really see them when you’re driving at 120mph)… - Peter Åsberg

We do it with particles parented to the camera. Single drops closest and multiple per plane further away. Then we have some particles playing on the camera. - Andreas Glad

====================================================================================

Discussion about color remapping:

Trying to get color remapping working outside of FX, and the aliasing artifacts it introduces are making it unusable. Has anyone used the technique to make color maps for lit objects without breaking the performance bank to smooth out the artifacts? Has anyone heard of the technique being used for non-alpha surfaces other than the L4D2 horde? I don’t necessarily need a solution, just trying to sort out if this is even worth chasing. - Mark Teare

Try using uncompressed grayscale maps for your remap textures if possible. DDS is pretty brutal with the gradient maps for color remapping. - Matt Vitalone

I’m a little unclear what you are doing. Are you using the brightness as a UV for a gradient texture? If so, I find it helps to use a small texture. It seems counterintuitive, but if you have a large texture you will get stepping due to the fact that it is only 8 bits per channel. If your gradient crosses enough texels, you will end up with more than one pixel in a row with the same value (producing a stair step). If you have a small texture, each texel will have a different value and the interpolation will create a smooth transition. - Eben Cook

Right now what I am doing is taking a gray scale version of an artist’s color map and recoloring it with a gradient texture. So there is the possibility for artifacts from the gradient compression, but the real ugly stuff is coming from the fact that the pixel is recolored after all of the filtering our engine is doing to get the onscreen color value (I am paraphrasing my graphics programmer). If you put the original color texture next to the remapped gray scale one in game and move around you get all kinds of noise, especially when moving between mip levels. - Mark Teare

If you’re in Unreal you can try various compression settings. Make sure your grayscale isn’t SRGB. Accept full dynamic range. Choose a different compression setting. There’s also a way to stop it from mipping. Choose the other filter setting and you’ll see more pixelation but less weird lerping. - William Mauritzen

Are you guys using this technique on a lot of lit surfaces? - Mark Teare

You’ll also want to make sure your LOD group isn’t smaller than your imported size.
I did a color remap that allowed for twenty slots of customization. We couldn’t have any non-adjacent colors and we had to hide seams. - William Mauritzen

We tried using it on a project and we were able to minimize the aliasing by using really smooth color ramps (not too many colors and no sudden changes) but the colors still smeared into each other a bit too much when viewed at a distance. That didn’t produce any flickering, but we would lose a lot of detail when objects were far away - Arthur Gould

Do you have an example of the artifacts you are getting for us, Mark? - Edwin Bakker

Made a generic example illustrating my problem (details in my last post). I tried all of the suggestions above to lessen the problem (all of which were good tips BTW thanks) but in the end there are still fairly bad artifacts stomping/swimming all over the anisotropic filtering when the texture is recolored using the gradient. I can try to get programming to fix this but at first glance the programmer was afraid a fix would be too costly performance wise and still not look as good. Has anyone solved this particular problem? Cheaply? - Mark Teare

Hmm… So did you try and clamp the color ramp U?
I assume you do not want the light grey areas, which I think come from the filtering between the pixels on U coordinates 0 and 1. The sample errors will get worse as the mipmap texture gets smaller. - Edwin Bakker

Well this image is just bogus Photoshop stuff so don’t read too much into it. Yeah we clamp the U. As I understand it the issue is when sampling from a colored texture the filtering has the color information but when remapping the filtering is done on the gray scale image and then the coloring is added negating the quality of the filtering.
Edwin have you shipped remapping in a scenario like this? Flat-ish tiling surface? -Mark Teare

I am afraid I have not, what happens if you use an uncompressed greyscale instead? - Edwin Bakker

What hardware/shader language is this for? It’s possible that you’re not getting good gradients for the color lookup. General gist: you get good gradient info for the greyscale texture lookup, since it’s using UVs the normal way (the amount the UV changes between each pixel is the gradient, when it’s smooth you get nicely filtered/aniso/etc. the magnitude of the change between pixels tells the gpu which mip to use). But then, think about the rate of change of the next texture lookup: it depends on greyscale between adjacent pixels, which can vary widely and change the mip you are using widely between each group of 2x2 pixels, producing inconsistent seeming results. There are related problems when doing texture fetches inside of branches (I googled and found this example: http://hacksoflife.blogspot.com/2011…l-texture.html).
So two suggestions for you: if you look at the un-remapped image, do the artifacts show up wherever there are high-contrast edges in the greyscale (this would suggest it might be what I rambled about)? You could try to use a small color lookup texture with no mips (or maybe 1 mip) to see if you can mitigate it. But like always, running with no mips can cause its own headaches and shimmery problems. - Peter Demoreuille

Ah we got an expert on the field, I’ll resort back to drinking coffee. - Edwin Bakker
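
For reference, here is a minimal sketch of the greyscale-to-gradient remap being discussed, with the mip-related workaround Peter suggests (forcing a single mip level on a small, clamped, uncompressed ramp). Texture and parameter names are illustrative:
```
// Uses the greyscale value as the U coordinate into a 1D color ramp (e.g. 256x1).
float3 GradientRemap(Texture2D greyTex, Texture2D rampTex, SamplerState samp,
                     SamplerState rampSampClamp, float2 uv)
{
    float grey = greyTex.Sample(samp, uv).r;
    // SampleLevel forces mip 0 on the ramp, sidestepping the unstable mip selection
    // caused by greyscale values jumping between neighbouring pixels.
    return rampTex.SampleLevel(rampSampClamp, float2(grey, 0.5), 0).rgb;
}
```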

====================================================================================

Discussion about what shader knowledge VFX artists need:

I am brand new to the scene… well… actually I am still a student, however VFX is my home as far as I have found on the world of game creation. I have been researching quite heavily on how to get into VFX and I am a bit confused on a certain topic. Do VFX artists also program shaders (HLSL, CG, etc)? Or do they mostly just work off node based work within UDK/CryTek? This might be too broad of a topic to answer simply, however the underlying concept is should I be attempting to learn shader programming right now or should I just focus on learning VFX in one program (the aforementioned). - Anthony Davis

VFX artists are a diverse bunch; some are highly technical and write their own HLSL shaders, others are less technical and either use node-based shaders, or even use pre-existing shaders others have authored. I would look at HLSL as another tool in your belt - fantastic to have and definitely worth the time to learn, but not 100% necessary.

Knowing how to write shaders will definitely give you an edge up though! At the very least I would learn how shaders and realtime rendering engines work, because that knowledge will help you communicate with engineers, create optimized content, and debug your work. - Chandra Foxglove

Thank you! This is what I have been doing and there is one concept that I have boiled everything down to… which has kinda taken over my thought process lately. It goes a little like this: everything is math. Everything. So, with this concept in mind I have been doing some looking into not the understanding of the math per se, but the concept of thinking in math. I love languages and everyone who is fluent in more than one language has told me the same thing. “If I want to be fluent in a language I have to literally think in that language… don’t translate in your head then speak.” So I am putting this together and trying to find a way to think in math. This is a bit abstract possibly. But it is my goal to push towards. I believe it will work me towards the best understanding of my craft. - Anthony Davis

When I write shaders, I often sit with my technical director or graphics programmer at the end of it to optimise the shaders and be selective about how we’re going to balance our bottlenecks to get the best visuals for the lowest cost. This is all after I’ve got working prototype shaders in the first place. (We use Unity/hlsl.)

The point here is that it’s important to know what you want the final result to look like. I think the best course would be to do whatever you can to train your eyes to know what your goal effect should look like the fastest that you can (free from technical constraints, at first). This comes from researching and reading books like Elemental Magic; from drawing; from studying video footage; from playing with the particle editors and node-based editors available to you in your game engine of choice.

The node-based editors will also generally get you thinking about what kind of technical things are available to you, without having to hunt down syntax and debug stuff and referring to a cg/hlsl appendix the whole time; and optimizing them with maths and programming tricks is relatively easy to learn (if you’ve got someone around who’s good at that stuff who can walk through them with you).

I feel that this way, you can get beautiful results (even though they’re 2D hand-painted flipbooks or something, and not technically challenging), and it’s enormously encouraging as an artist to have beautiful work to show, and to know what you’re aiming for. It’s also the thing that I feel most of us struggle with the most. The tech is easy enough to grasp with a few weeks of beating over the head by a more senior vfx artist or graphics programmer. But training your eyes and being an artist is a lifetime of effort!

At the end of the day, a great portfolio is a result of a great artist. And as a vfx artist, while optimization is arguably more important to us than other art fields, and the tech much more interesting, the art is still absolutely vital! - Jonathan Hau-Yoon

It should be mentioned that most companies will not be using UDK or Crytek. Many companies have in house proprietary software for doing FX work, and/or use less robust tools built in Maya or Max to export to an engine.

Knowing that, the best thing you can do is expose yourself to many different engines and try to find out their strengths and limitations. Often knowing what’s possible is a huge advantage - it allows you to think about how to achieve similar results using different sets of limitations.

All of that said, the best thing you can do is focus on the art. Think about the 12 principles of animation, treat your FX like characters. Think about composition and context. Make sure you have large and small detail. Understanding the mathematics of color helps quite a bit too.

The FX field is deep enough that no two artists will have the same skill set, but the more things you can do, the better off you and your team will be - Adam Kugler

I appreciate your input. I do understand that most companies will have their own proprietary software to pull in their FX work, however no matter how many times I ask Keith, I severely doubt he would just let me play with Naughty Dog’s software, so I have to use what I can get my hands on. I haven’t heard anyone actually talk much about the 12 principles within FX. It may have just been implied to understand that everything with motion must use them to get a sense of realism, generally people just say work it until it feels right. Next month I get my first real taste of real time VFX within UDK. We haven’t been told what we have to do but either way I am on the Effects crew for it. We have a month to make a level of some sort (pirate, castle, sci-fi, etc.). It changes every month, so I have been watching Eat3D and ImbueFX’s information to learn about things that I can do to test and just get used to things before I delve way too deep. I’ve also found 3dMotive and UDN for some information as well. Is there anywhere else you all would direct me to? Thank you everyone for answering my questions, it’s amazing how much the art community bands together for each other. - Anthony Davis

I learnt UDK, used it on the job for a year and a half, then moved onto another project which does not have a node based shader editor… and I wanted some custom shaders, so I just started coding them (CgFX, duplicating existing ones, and editing them to do what I want). Having spent a long time experimenting in node-based shaders, I’ve developed a good understanding of shader logic so I mostly know what I want to write before I start making it. -Kris Dogett

I’ll touch briefly on the 12 principles and how they specifically apply to fx.

Probably the most important are anticipation, timing, and follow through. You need to prepare your audience with something that grabs their eye, and follow through with overlapping action. Successful effects have a rhythm, like a drum solo. Think of a Michael Bay explosion - invariably it’s actually 3 explosions, not 1. Or the Bird of Prey explosion in the old Star Trek movies.

Secondary action also applies to levels of detail. Make sure your FX have the main huge hero parts, but also have detail at smaller scale that assists the overall feel. The world is granular, and every phenomenon has multiple scales of detail.

Slow in and slow out applies to many things like smoke or fire. You don’t want effects like smoke to pop off. There are cases where you want fx to pop on, but most of the time you want to prepare the eye. Also punchy and popping on are different things. FX can be fast without starting at frame 7 of an explosion, etc.

The rest are fairly straightforward. You can even see things like squash and stretch in buffs for fantasy games. Start looking for it and you may be surprised by how many of the 12 principles you see in FX that appeal to you. - Adam Kugler

====================================================================================

Discussion about hiring practices with regard to film vfx vs realtime vfx:

Got a question that gives me a lot of itch, and I’m thinking of asking everybody here for their point of view and opinion.

So today I was applying for a vacancy via a company website. The requirements are pretty typical:
5-10 years’ experience in videogame production. Responsible for overseeing as well as creating and implementing the in-game VFX. Deep understanding of real-time VFX techniques. Wants to push real-time VFX work to the next level. Expertise in Unreal is a plus.

After my submission, I got a reply from the staffing consultant at the company, informing me that they are focusing on finding people who have feature film experience and the related toolsets they use (Houdini, for example). I was confused, so I decided to consult and discuss the event with Bill Kladis and a couple of game industry veterans. One of my friends, who is an art director, claims that both realtime FX and post-production FX are the same thing, which encouraged me to bring the topic here.

Is it a common belief that you can hire a post-production FX artist to do in-game FX assets, assuming that what they did in Hollywood can be brought into game production?

I sincerely apologize in case someone finds the following topic not to their liking.
I hope that your opinions are able to educate me and bring some light to this mentality. - Weili Huang

For us at Naughty Dog, this is not even remotely true. We have a very difficult time finding worthy candidates from our post-FX applicants - the tools, workflows, and techniques are so completely different that it either ends up frustrating the employee making the transition “down” into games, or they come in with an ego expecting to “teach us how it’s done in film.”

So much of our work is about forming an illusion around the various obstacles that arise (optimization, engine, render pipeline, implementation, gameplay). For someone to ignore the value of experience from this sometimes actually offends me. I wouldn’t by any means consider myself a valid applicant for ILM because I made effects in the Uncharted games.

This is, of course, not to say that someone from film experience would not bring useful skills and background with them, as I’d like to think that if I made the jump over to film, I would bring a different sort of perspective too. But in my mind, they are as similar to each other as sculpting a face vs. painting a face. - Keith Guerette

Comparing realtime VFX to pre-rendered VFX is difficult. My opinion and experience tells me that doing pre-rendered work benefits you, because you train your artistic eye by not having to work under the “technical limitations” a game engine confronts you with. I like to compare it to animation… let’s say you have animated characters for a DS or mobile game with limitations of, say, 8 frames for a walk cycle. And let’s say you have done that work for the past years and have not done any other animation job, but you are really good at delivering 8-frame walk cycles. I can guarantee that you will have problems animating a character for a cinematic. Cinematics are asking for more than just a walk cycle… subtle movements, giving the character a character - making it believable. Or at least that was my experience (back 15 years ago). Going back to VFX. Let’s keep in mind that vfx is art and not only a technical job. You have to train your artistic eye, e.g. timing, colors. The best way to do that (in my opinion) is to learn tools, methods etc. which allow you to work and deliver a vfx that comes the closest to reality… then you can (if you want) bring that knowledge to the “limited” field of realtime vfx. …maybe another comparison is: scaling up an image results in losing detail… scaling down an image gives you more flexibility (because you’ve got the detail and you can decide what to keep and what to lose). - Andras Kavalecz

There is a big difference between real time and post production FX.

I would say there is an overlap in the creation of our source materials, but that is where it ends.
Having knowledge of fluid dynamics or being able to work with Houdini is definitely a plus. But for a real time VFX artist that is not where it ends. We have milliseconds to approximate what post production VFX artists do with render times of hours. I would say a lot of extra knowledge is required to make an effect work in real-time. - Edwin Bakker

Coming from prerendered, I can say that a lot of the knowledge is heavily applicable, especially for simulation, compositing, procedural texture generation, etc. I’m not entirely sure the staffing agent knows that, but I do know that companies like Squaresoft do use Houdini as their primary 3D package for creating fx.

If the game is super focused on realism, maybe they want film quality explosions as sequences (not that they are strictly correct in seeking that), and a sprite artist isn’t necessarily going to understand maya fluids or nDynamics.

I think in the end, the staffing agent possibly may not know what an fx artist for a game really does (which is common), or they may be looking for highly realistic examples of work rather than fantasy, which to them means film experience is better than fantasy rpg experience. - Adam Kugler

It’s possible that the position they were hiring for was in game cinematic and not actually real time. Recruiters are not typically knowledgeable. Depending on the company, asking for film experience but expecting realtime performance could be a red flag. - William Mauritzen

Keith took the words right out of my mouth. I would like to add that it may be a common misconception by a recruiter that the techniques are exactly the same. I feel that this posting is very misleading since it put so much emphasis on having skills related to realtime production asset creation and implementation. If you don’t appear to know what you’re talking about when recruiting you don’t only lose out on potentially great talent but you make your studio/company look like a less than desirable place to work, damaging its credibility. That’s just my opinion. - Doug Holder

Having not worked in Post FX, I could be wrong… but I imagine there are a lot of ‘principles’ (simulation, timing etc) that can be brought from Post FX to real-time… but the work itself will be vastly different in the procedures used. Having talked to a few friends who work in Post FX, it seems that you’re pretty much only limited by the time it takes to render the scene… whereas in real-time, you’re limited by your runtime and engine constraints. - Pete Clark

Houdini is something that comes in handy for a lot of things, but I think there are a ton of other skills that are a lot more important, like understanding the concept of overdraw, drawcalls and performance in general. There are more possibilities opening up with next-gen consoles, but the fact that realtime rendering needs to be much more flexible is never going to change. It’s a completely different approach. - Hanno Hinkelbein

Keith, Andras, Edwin, Adam Peter, Adam, William, Doug, Peter and Hanno. Thanks so much for your opinions, I humbly learned a lot from you all. Kudos!

@ Adam Peter - I am sort of getting used to the trolling not only from recruiters and HR but seasoned art directors as well! I emailed the staffing consultant about the job description. He did reply, apologizing that the job description is somewhat misleading, but they are in fact looking for what he had described in his earlier email!

I used to do post-production decades ago, so I understand the difference between these two disciplines, sometimes trying to educate fellow game developers who keep on arguing about why sprites face the camera all the time.

Like all of you mentioned, we can use some of the post-production skills to create sprite sheets or simulations that can be exported into game engines as source materials. From then on, that’s where the other part of our skills kicks in and makes it run smoothly in real-time.

@ Keith - I can feel the frustration when the post FX artist uses their ego and tries to convince you to do things their way. It brought back memories of training our shader artist, who is extremely good at maths, to do visual effects. He was unconvinced by the way we did our job and tried to convince us with his maths formula for how gravity works!!!
By the way Keith, I will definitely use your Sculpting Face Vs Painting Face Metaphor, it is awesome. (need to tattoo it on myself somewhere…)

@ Andras - I understand your 8-frame walk cycle concept relating to FX. I am a trained animator as well. Even now I am still worried whether I will be able to go back to animating cartoons. It’s even harder since I spent a decade doing VFX. Having worked in post-production does train our artistic instinct, which we can then translate into real-time via the game engine. But that is the most rewarding part. - Weili Huang

We’ve done some hiring lately for both game vfx people and film vfx artists, with great results. I think a well rounded team will have people from both. It’s helpful to play to people’s strengths. I recommend not trying to train a film guy to right away behave like a constraint-minded game artist and crank out lots of little efficient effects. Keep them in their comfort zone, have them render textures, do destruction, use the tools they are familiar with, blow out big moments. That has worked well for us.

So, to answer your question directly… yes, we’ve hired film guys to do in game vfx… but we’re not putting them on a level like a standard game fx guy. We’re giving them specific tasks that they can excel at. - David Johnson

Sony London: We’ve hired from television and film. Candidates require some training to get up to speed with the team structures and workflows (the basic tools but also Perforce, revision control, dependencies from your assets to the rest of the game etc etc). Techniques aren’t directly applicable but the principles definitely are.
Oh yes - and the idea of an effects performance takes some getting used to for them - Ivan Pedersen

====================================================================================

Discussion about Noise Map Generation:

Hey guys. Simple question. What do you guys like to use to generate different noise maps? - Rod Cano

Filter forge is pretty awesome, it works as a standalone and photoshop plugin. It’s a node based image processor that also has nodes for generating various noises. - Jordan Walker (http://www.filterforge.com/)

I use combinations of max procedural noises a lot because they are automatically seamless on characters. For tiling textures I often jump into world machine. Mostly because I have it and know how to get what I want out of it (and it is node based). - Mark Teare

Honestly, you can get just about anything with Maya’s base procedurals; you can also get a lot out of the 2D/3D fluid containers. Substances in Maya 2012 are also pretty robust, especially with the bonus pack http://www.allegorithmic.com/… - Adam Kugler

I use Maya and Max texture nodes for the creation of these. After Effects has some pretty cool noise map generators as well. - Martin La Land Romero

After Effects’ filters are pretty cool for textures. This brings to mind a question though, didn’t someone mention an easy way to make them tile, outside of photochopping them? - Leo Braz

If you just need a quick tiling noise to layer in, P-shop’s clouds will tile. Interested to hear the real answer to the tiling question though. I usually resort to tiling them on my own. - Mark Teare

http://www.youtube.com/watch?v=SX4rSBBR8gU built into substances, both your own images and ones that already exist. Another fun trick to make tiling noise textures in a 3D package is to make a 3D procedural on a NURBS torus and bake out the texture (since it’s automatically in 0-1 space). You get a bit of stretching, but depending on what you need, it works very well. This is a great resource for the technically inclined http://www.gamedev.net/blog/33/entry...eamless-noise/ - Adam Kugler

Additional link posted by Gaxx: http://www.neilblevins.com/cg_education/procedural_noise/procedural_noise.html

====================================================================================

Awesome Reference Collection by Denis Girard:

http://6packofc4.tumblr.com/ (Explosions)
http://heatdistortion.tumblr.com/ (Fire)
http://shadesofsmoke.tumblr.com/ (Smoke)
http://watersip.tumblr.com/ (Liquids)
http://billionvolts.tumblr.com/ (Electricity)
http://scifivfx.tumblr.com/ (Sci-Fi)

====================================================================================

Discussion about VFX Outsourcing:

Hey all,

Curious if anyone has any experience with outsourcing FX work, since more and more of game development is going this way, and at least in our studio, I think VFX is the only discipline that is not outsourcing anything.
Outsourcing actual particle effect work seems to be more headache than it’s worth, so I was thinking about perhaps outsourcing textures, or animated texture work to be more precise; that would offload at least some work from our tight schedule. If anyone has any experience with such, it would be interesting to hear. - Nadab Göksu

We recently started to go this route. It’s a little different though, since the guys doing the work are 100 miles from the studio. They drove down for a few days to train and then have been doing some work. It’s working out OK… but I’m spending a lot of time answering questions in email and whatnot, more than I would like to. - Jeremy Kendall

We outsourced some texture generation and emitter behaviors for NFS:TheRun. Decent results, but had to set them up with and support our entire pipeline and runtime remotely. I was and still am not a fan due to the complexity of describing the dynamic nature of effects - it takes more time than just doing the work in house. I tend to make lots of noises and use my hands and props like matchbox cars to describe and prototype with art direction and gameplay designers… so, we had outsourcing focus on ambient and distant environmental effects so we could iterate in house on gameplay and cinematics effects. - Andy Klein

I’m not a fan of this either. But some things are out of my control. - Jeremy Kendall

We started using FXVille for Injustice at my last job, I didn’t interact with them but my lead seemed to be happy overall with their performance and results. I think a couple factors led to using outsourcers as a viable solution (and these are completely my opinions for anyone on here from FXVille, not the words of Gilmore or any management):

  1. They handled background elements in the levels, which typically involved destruction and debris for transitions. This needed very little art direction from any lead.

  2. Since the IT department has a good VPN setup, they are able to easily run our version of UDK and check out packages from Perforce (all done via web browser).

  3. FXville already had lots of experience not only with Cascade, but with Midway as I understand some of them are ex-Midway people (maybe wrong on that one).

  4. It can get tricky when it comes to outsourcers checking out files from other departments (mostly level). You have to make sure that your outsourcers are not hogging files overnight and that they are always readily available to check things in.

I can also offer some other personal experience in the past year as I’ve really been pushing myself as freelancer for side work. I think using a proprietary engine would make outsourcing a nightmare. My successful freelance clients all use UDK and it’s just as easy as downloading a specific release for compatibility. Some clients get picky about what you’re allowed to see, which also makes things very difficult on the FX side. When people ask for things like cigar smoke, bullet impacts, I think everyone here can agree that seeing your effect running in the game is a critical step in the process of making everything fit as best as possible.

I also can see where Jeremy is coming from, getting new people to learn the ropes over email can be very time consuming.

Hope this all helps! - Bill Kladis

Thanks Bill Kladis. What more did they do for Injustice? When you say destruction, does this mean simulating the animation and doing particle FX as well? How well can they use offline particle systems? Visited their website but they don’t show or tell anything there. - Nadab Göksu

So none of the destruction was ever baked out as a simulation, everything is just particle systems triggered through matinee & kismet events. So lots of smoke and debris meshes and such.

Could you define what you mean by “offline particles systems”? Since they could download the entire game content folder and use our special version of UDK, they could preview any level. They also have 360/ps3 test kits so they can run the game as well for testing. Let me know if that answers your question. - Bill Kladis

Should have written fluids, sorry for confusing you, just curious to know what type of applications they master besides UDK. - Nadab Göksu

The company I am working for from home came to me about doing their FX. It’s been a really good relationship between us. The biggest problem is not being able to test my work in their version of Unity. It’s something to do with the way their server is integrated with their Unity, I was told.

For example all the muzzle flashes (remember me asking the muzzle question ^^) had to be tested by me pressing the play button, on, off, on, off, etc. Setting the particle system to loop did not help, since it would depend on the gun and its fire rate. So they have been tweaking things on their end. - Lee Amarakoon

====================================================================================

Discussion about looping footage:

I’m trying to get some pre-rendered footage looping within After Effects. I am duplicating the footage, splitting it in half and overlaying the first half with the last half, and adjusting the opacity from 100-0 for the last half, but it still doesn’t loop perfectly; the first frame is constantly causing a pop. Any tips or tricks you guys use that would help me out? - Gareth Richards

It’s a little hard to say without knowing the source material, but I’ll assume it’s not suited for ping-ponging. What I generally do is duplicate the entire clip and offset the copy by 50% in both directions. I guess I’d convey it textually like this:
original track <f0----|---- f100>
copy track |f50–><-- f49|
and then fade both out/in at their respective frames f0 and f100 (or whatever your last frame is). - Creath Carter

Creath, it’s just fire footage that’s been rendered out from Fume, 60 .png frames imported as footage. The method you describe is what I’m currently doing; suppose it’s good to know I’m using the correct method, it just isn’t doing as it should lol! - Gareth Richards

Yeah that’s strange, but I can say that I’ve used that exact method with prerendered flame effects before (coming from Maya fluids). The key I found was to make certain that neither the start nor end frames ever displayed at 100% (ideally not at all, as they should be faded completely into the offset footage by that point).
The only drawback with the method I posted above is that you end up needing about 2x the amount of footage, because blending the entire thing over itself cuts down the noticeably looped frames by half.
Come to think of it, the last time I did that it was by hand, frame by frame in photoshop. good luck! - Creath Carter

Frame… by… frame? Ouch! I usually use 3ds Max’s Video Post for it since it’s stupid easy to set up and fast to render out (mostly because I haven’t had After Effects on hand until recently) - Peter Åsberg

After Effects doesn’t always play back seamlessly in loop mode; it can hiccup when it goes from your last frame to the first. It might be that your loop works but you need to render it out and check it in game or using a simpler video player like fcheck or the Ram Player in Max. - Matt Vitalone

Here’s a little trick that I like to do for looping fire footage: rather than just fading the whole clip’s opacity over the loop (which can leave some sections seeming to fade), you can animate a mask. Copy your clip, put it so it overlaps a reasonable amount, then mask it. Put a few points over the width of the flames, and animate them sweeping up, but try to match the speed of the flames as they rise. Don’t animate the few points you added evenly, but sort of sweep them in an S shape, and feather the mask edge a reasonable amount. If you’ve matched the speed of the fire really well, it should look like the flames are rising normally, and you get minimal switch over the loop - Oliver McDonald

Is the hitch in game use or in the video player? I haven’t found a video player that doesn’t hiccup on start. To test it, you can duplicate your clip a couple times and render that. If the clip is clean when you chain it like that…it should be good to go. - William Mauritzen

Looks like it was an issue with After Effects, as I rendered it out and it works better (needs more tweaking)! Cheers for the help all, good to learn there are some new techniques available - Gareth Richards

I learned a long time back that it’s kind of against the natural feel of things like fire to make them loop. Always faster/easier to just work with spawning multiple sprites that live varied lifetimes at varied trajectories/scales and fade out, then spawn new sprites. - Andy Klein

I did the smoke grenade for MW3 via a rendered FumeFx sequence, then brought it into After Effects. For the 64 frame sequence, I rendered out some extra frames (90ish total). I duped the layer, and pulled frame 65 of the above layer back in time to frame 1 with 100% opacity and animated it fading out over 9 or so frames. Worked pretty well. If I were to do it again, I’d add in Oliver McDonald’s suggestion, and give it an animated matte growing radially from the center, rather than just a crossfade.

A helpful trick in After Effects… if you need to loop the alpha and rgb separately, but similarly… after you have done a bunch of cleanup to one… First, start off with two raw comps, one with just the matte, one with just the rgb… call them “base_comp_rgb” and “base_comp_a”. In a new comp, called “timeloop_a”, pull the alpha base comp in and do whatever you’ve got to do to loop it. For example, crop the active time range to a usable chunk… double the layer, time offset 50% so that the break point is in the middle, then crossfade 1 back into the other with its tail. Wonky areas, add another layer, time offset, mask an area, feather, etc…

Here’s the trick… Duplicate the “timeloop_a” composition and name it “timeloop_rgb”, select all the layers in it (which should be a bunch of “base_comp_a”). Then in the project view, alt click/drag “base_comp_rgb” onto one of those “base_comp_a” layers. The reference will swap, and all of the edits that you did to the alpha are now replicated for the rgb. Tip o’ the day - David Johnson

====================================================================================

Discussion about lighting clouds/smoke:

How do you guys generate lighting maps for clouds? I was going to try just turning these greyscale images into normal maps but I’m afraid that’s going to look a bit wonk. -William Mauritzen

Try to search the web “rgb light rig”. I found a script for 3dsmax that works pretty well (it’s not 100% accurate but you get nice normal maps) -Francisco García-Obledo Ordóñez

That sounds great. So here’s what I worked out for synthetic but quite nice (but slightly expensive) fog. Bump offset normal mapped and subsurface scattered lit translucencies. (How’s that for a set of buzzwords?!) The transmission mask is super blurred. I’ll try to get an image okayed for release. Panning with a coordinate offset really makes lumps of fog look like they are moving around stuff… like a flow map but without an actual flow. -William Mauritzen

I’ve found that very soft smoke or mist just needs a good spherical normal and that does 90% of the work. Basically what you see below. I made a radial gradient in photoshop going from white in the center to black towards the edges. Then took that into crazybump as a heightmap. http://i.imgur.com/mUL7STh.jpg
Not sure what engine you’re using, but if you DOT this against your lighting information, you get smooth shading against your particles.
It’s helpful to look at the color channels individually inside of photoshop, and it’ll start to make more sense. Red is light coming from X, Green = Y, and Blue = Z/straight down. I explain this with some examples in the 1st chapter of the displacement tutorial on my site (it’s free). http://youtu.be/OwTqVrAQLuk?t=3m7s So you don’t have to dig around, it’s this part - Bill Kladis
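
A minimal sketch of that DOT, assuming the light direction has already been transformed into the sprite’s tangent space (names illustrative):
```
// Shades a sprite with a baked "puffy" normal map dotted against the light direction.
float3 ShadeSoftSprite(float3 normalSample01, float3 lightDirTS, float3 lightColor, float3 ambient)
{
    float3 n   = normalize(normalSample01 * 2.0 - 1.0);   // unpack 0-1 normal map
    float  ndl = saturate(dot(n, normalize(lightDirTS)));
    return ambient + lightColor * ndl;
}
```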

What engine do you use? See if there is any support for “bent normals”, you might not need a normal map at all. - Edwin Bakker

I’m doing some skybox effects as well, and could never quite get good lighting on some particle sprite clouds. Maybe I messed up the math with the normals, but then I tried another solution that actually gave me some really great results. It’s a bit of a hack, but - you get software like Terragen, which is able to do some pretty realistic clouds. Then you render out the same cloud in several passes - one with sunset/sunrise (light from below), one with the sun behind the cloud, and one with light above the cloud, and of course an alpha pass. Then you place them all in separate channels of the same texture and blend between them using your light information. Works really well. -Jose Teixeira

Yeah that’s a great trick Jose! I’ve done something similar with clouds rendered out of Houdini - just having several lights from a few directions and then placing those renders in the red/green/blue channels of an image. -Matt Vainio
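
A sketch of the channel-blend idea from the two posts above, assuming three lighting passes baked into the R, G and B channels and blend weights derived from the current light direction (all names illustrative):
```
// bakedRGB: cloud rendered with three different light directions, one per channel.
// weights:  how much each baked direction matches the current sun direction.
float3 ShadeBakedCloud(float3 bakedRGB, float3 weights, float3 sunColor)
{
    weights /= max(weights.x + weights.y + weights.z, 1e-4);   // keep total intensity stable
    return sunColor * dot(bakedRGB, weights);
}
```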

Thanks guys! I’m now pretty much re-doing the entire skybox and weather system on The Witcher 3, and of course the art lead wants to have this gorgeous, constantly changing sky all the time, gives me these impossible references, it’s insane! I have spent months trying all kinds of tricks, everything I could think of, and this channels thing is the closest I’ve gotten to some good results. If I miraculously pull this shit off, I already told them I wanna make a GDC presentation on it next year! - Jose Teixeira

I’m trying to get this cloud normal to shade right but I’m dyslexic with my transforms. The particles are rotating and the normals seem to be rotating at a right angle. Because the normal is basically a texture, I thought it needed to be transformed from tangent to world…just like the “fake light” mesh shader. That doesn’t seem to be working. Green is up, red is right. My head always spins when I try to get my transforms aligned. https://www.youtube.com/watch?v=NvVwBIRXaKw -William Mauritzen

You can create quasi normal maps from fluids/volumetrics with a simple light rig. Light is additive, so 4 lights at .25 intensity at 45-degree angles from each other per axis will create an intensity of 1 pointing in the correct direction. However… the effects are nominal. Crazy Bump’s looks honestly about the same when it’s lit. It’s at the end of the video: Vimeo -Matt Radford

So, the “rendering a normal map for clouds” failed exactly as Jordan Walker suggested it might. The weird thing being that from carefully composed angles I was able to fool myself into thinking it was doing a good job. The range of functional angles was about 120 degrees. Additionally, the normal maps seem to break down as soon as I set the particles to rotating. -William Mauritzen

You can render some clouds with afterburn or fume and light them from the cardinal directions ( up, down, left, right, front, back ) offline. Then lerp between the images in the shader. I have used this technique for other things ( a blurry lit version of the player’s face reflected on the inside of a helmet, you can’t blur a dot product ) but it will probably work for clouds too, anything with complex light scattering properties. -Michael Voeller

That diffuse wrap trick Matt suggested is working really well at softening things up. My main issues right now are a lack of animation in the normal map and incorrect edge transparency due to vertex offset. -William Mauritzen

====================================================================================

Motion Vector Blending

Hey everyone,
I’ve been encouraged to do a little write up of my Unreal 4 implementation of the frame blending technique inspired by Guerrilla Games.
Here’s an example of it in action:
http://www.klemenlozar.com/wp-conten...VExample_1.mp4
Full breakdown:

Hope you find this useful, if anything isn’t clear enough please let me know, thanks!

-Klemen Lozar

So we’re doing this without an additional motion vector pass. You calculate a crude optical flow of the image by getting a gradient of the pixel of the current frame and the next frame, and multiply it by the difference between frames. It works really well and I made it as simple as ticking on a feature “oflow flipbook”. You then have to morph between the two images. We tested this for doing 360 degree pan-around style flipbooks and we got it working with as few as 16 frames with morphing. There are a few artifacts when the images are discontinuous, but it’s sooooo much better than frame blending. One thing we found helped A LOT is making a control for mipping the vectors (blurring them). Usually I get a lot less of those stepping artifacts from using a blurrier version of the texture (in my case I just mip the input texture to get blurrier data). -Matt Radford

Matt Radford: Regarding your first comment - is this an offline process or are you saying you do this in the shader? -Jeremy Mitchell

It’s all done in the shader. Luckily at ND it’s easy to write custom shaders; there’s just a little library now called OFlow that every particle shader in the game can access. I built it so every flipbook can easily have this feature enabled or disabled (it doesn’t come completely for free, so if you’re not using it, turn it off). –Matt Radford

Impressive! Looks like you removed any mysteries left concerning the mythical cross-blending flowmaps.
No excuse to not use this now!
How far could you scale down the flow map and still get decent results? The only thing that could be added to the shader would be splitting frames between RG and BA channels to reduce memory a bit more. Very cool! - -Edwin Bakker

Thanks everyone! Matt, I am blurring the motion vectors; like you suggested, this helps smooth out the stepping, but if you go too far you start losing fidelity in the motion. There will always be some stepping, at least in my experience, but unless you can really inspect it like in the example I posted I think it would be mostly unnoticeable during gameplay. - -Klemen Lozar
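
For reference, a rough sketch of the motion-vector flipbook blend being discussed, assuming a colour flipbook plus a matching flipbook of per-frame motion vectors packed into 0-1. The names and the sign convention are illustrative and depend on how the vectors were baked (FlipbookUV is the helper sketched in the glossary above):
```
float4 MotionVectorBlend(Texture2D colorTex, Texture2D motionTex, SamplerState samp,
                         float2 uv, float2 gridSize, float frame, float strength)
{
    float  t   = frac(frame);                      // blend factor between frame and frame + 1
    float2 uvA = FlipbookUV(uv, gridSize, frame);
    float2 uvB = FlipbookUV(uv, gridSize, frame + 1);

    // Unpack vectors from 0-1 to roughly -1..1 and scale them into one cell's UV range.
    float2 mvA = (motionTex.Sample(samp, uvA).rg * 2 - 1) * strength / gridSize;
    float2 mvB = (motionTex.Sample(samp, uvB).rg * 2 - 1) * strength / gridSize;

    // Warp each frame toward the other, then cross-fade.
    float4 a = colorTex.Sample(samp, uvA - mvA * t);
    float4 b = colorTex.Sample(samp, uvB + mvB * (1 - t));
    return lerp(a, b, t);
}
```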

I’m concerned about the amount of space taken up by an uncompressed motion vector map, and what happens if you compress it - does it give you bad data? I have a super restrictive budget, but I’m going to give this a shot. –Mark Brego

You really do not want to compress distortion/flowmaps, you get better results with just scaling the texture size down instead. - -Edwin Bakker

You start getting a lot more errors, you can imagine what compression does to the vectors. I found I could easily compress it to an eighth of the base texture. -Klemen Lozar

Another thing you guys could try out is to normalize the flowmaps. In other words: pack whatever range they are using so that they take advantage of the whole histogram. Then in the shader “de-normalize it” with a multiply and an offset.
To do the normalization you can create a Python script that goes through the uncompressed raw flowmap and takes the max and min values, then scales and offsets them so that they use the whole 0 to 1 range.
This should remove stepping considerably as you are packing more data into an 8-bit image. You will most likely be able to use compression at this point as well. -Norman Schaar
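
The shader side of that de-normalisation is just the multiply and offset Norman mentions; a sketch, assuming the offline script exports the original min/max range as shader constants:
```
// packed01 is the 8-bit flowmap sample; minVal/maxVal are the range it was packed from.
float2 UnpackNormalizedFlow(float2 packed01, float2 minVal, float2 maxVal)
{
    return packed01 * (maxVal - minVal) + minVal;
}
```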

====================================================================================

Impressive collection of reference footage for effects: http://ref-fx.com/

====================================================================================

More info here in the future.

Saving this space also.

Woah! That’s a lot of text jam packed with useful info! Let’s find a good way to digest this!

Hey - I thought that I might share this link about noise generation, very useful for me - http://www.neilblevins.com/cg_education/procedural_noise/procedural_noise.html

thanks for sharing, really helpful

woah, tons of info, thank you for your work! :slight_smile:

This is great - Would you mind sharing that as a separate topic so it can be found more easily?

Hey @ChezStephaneNepton! Welcome! But I think you’ve posted a reply to the wrong topic - I’m guessing you wanted the Introduce Yourself topic

I’d love to see visual examples of these terms.

I think Alpha Erosion gets used quite a bit in fluids so the image on the sprite gets thinner as they fly away from each other. I’ve heard it called “Alpha Threshold”.

Bent sprite is a neat term. I’ve always heard them called “shaded sprites” since they take on shadows like the 3D stuff in the scene does.

On Screen Effects were called “screen space” when I used them for water splashing on the screen in JAK: X. They could also be used to work with the UI like making life bars be on fire and things like that.

Particlesystem is tricky. I usually call them a “particle group” because it is a group of particle systems. Sometimes the particle system is called a “part” because it is a PARTicle system and is PART of the group.

Preroll is my chosen term as well, but it’s often expressed as “negative time offset” because whereas a positive time offset makes an emitter wait a certain amount of time to begin emitting, a negative time offset means the system would have already been simulating for that amount of time when you tell it to spawn.

I would like to come up with a unified term for specifying a value in a channel (usually the alpha channel) and only having that value be visible on the sprite. We used “alpha slice” for that at Riot. It allows for very smooth animations and is nice for smooth electric or plasma-ey looking effects.
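
A rough guess at what an “alpha slice” could look like in a shader: keep only the pixels whose alpha sits within a narrow band around an animated slice value, so sweeping the value reveals different iso-lines of the texture over time (purely illustrative, not the actual Riot implementation):
```
float AlphaSlice(float alphaSample, float sliceValue, float bandWidth)
{
    // 1 at the slice value, falling to 0 once alpha is more than bandWidth away.
    return 1.0 - saturate(abs(alphaSample - sliceValue) / max(bandWidth, 1e-4));
}
```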

That’s super cool. Any idea how to reproduce this in other engines?

This seems awfully interesting, gonna take my time to delve through it!

Hey! I’ve rebooted this project and shifted the focus over to become THE realtimevfx glossary. It’s going to be a huge undertaking, but I’ve started now at least!

So far I’ve made the list of terms I want to cover. The ones I think we as VFX artists need to know. I’d love if you could have a read through it and let me know if I’ve missed any important terms. I haven’t added any of the descriptions yet, so the list isn’t super useful just yet. Let me know what I should add or if you want to contribute to the list.

The glossary project is moving forward! It’s jumped from google sheets to its very own Wiki! It’s early days still and my first goal is to get a basic description in for every term in the glossary.

Feel free to check it out already, and if you see something you know about, don’t hesitate to add that info in there!

I don’t know how I feel about the fact that the most visited page is about Curvemeshes. Just because there’s so much debate about the name :stuck_out_tongue:

OH SNAP! AWESOME! I’m ecstatic this is finally going to be a (only slightly overdue :wink: ) thing!

On that note, I’ve never wiki’d before. Is there… is there a permissions thing or… How does this work? I would love to put in the data I have collated in my notes.

What is your idea or “vision” for this wiki @Partikel? Since Realtime VFX can touch on just about every other discipline there’s bound to be some overlap. Is this wiki going to be a free for all in data as long as its even possibly useful for Realtime VFX, or is there a point where things get clamped?

Also - Reminder that there is dated content in the wiki at Polycount for Realtime VFX & Tech Art.

One more thing. In reference to your VFX Glossary, there should be the names for the distinct types of noise. Things like Boolean, Cloud, Cell, and Voronoi. Gerstner Waves too, though admittedly that is bit more niche.

Thank you! :slight_smile:

Everyone is allowed to contribute, however I am curating it so I don’t allow anonymous edits.
My vision for it is to be a practical and concise guide. No unnecessary fluff.

The VFX wiki on Polycount you linked to is the whole thing. One page. I also think it’d be strange to build it on a wiki that belongs to a different forum. If this one had a better wiki system I would have been happy to do it here, but instead I opted for a freestanding one. Keeping it focused on realtime VFX and not art in general lets us make some assumptions and make it less generic.

I’ll add the noises.



lol.