Mobile games VFX

Hey guys!

I’m totally new to realtime VFX / game VFX. I have a lot of experience working with 3ds Max, FumeFX and Thinking Particles, but I want to learn game VFX. I would like to learn how to make effects for Android or iOS (mobile games). Can you guys give me some suggestions on where to start? Should I use Unity or UE4? I’m also interested in learning Houdini, if it’s necessary for game VFX.

Hello and welcome to the forum.

In realtime VFX you are mostly working with textures, shaders and meshes. For mobile games the particles have to be cheaper on the hardware; you can’t throw in as many particles as you would on a PC. You have to find a balance between good looks and a playable framerate (this also applies to PC and console games).

First you need textures for anything that isn’t special. For something like burning wood you can run a fire simulation in FumeFX, render it out and turn the frames into a spritesheet. In the engine of your choice you then use a particle system to make it move. There are many tutorials for this everywhere.
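If you ever want to script that spritesheet step instead of assembling it by hand, here is a minimal sketch using Pillow. The frame names, frame count and 8x8 grid are just assumptions for illustration:

```python
# Minimal spritesheet/flipbook packer - assumes 64 rendered frames named
# frame_0000.png ... frame_0063.png, all the same size, packed into an 8x8 grid.
from PIL import Image

FRAME_COUNT = 64      # hypothetical number of rendered sim frames
COLUMNS = ROWS = 8    # 8x8 grid keeps the final atlas square

frames = [Image.open(f"frame_{i:04d}.png") for i in range(FRAME_COUNT)]
fw, fh = frames[0].size

sheet = Image.new("RGBA", (fw * COLUMNS, fh * ROWS))
for i, frame in enumerate(frames):
    col, row = i % COLUMNS, i // COLUMNS
    sheet.paste(frame, (col * fw, row * fh))

sheet.save("fire_flipbook.png")
```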

For more special effects like a shockwave or a force field you may need to combine meshes and textures with shaders.
For an Unreal Engine example, check the ImbueFX AoE ground attack tutorial on YouTube (link below). It’s Unreal Engine 3 (UDK), but the particle and material editors are mostly the same in UE4, and he also uses Maya to create the mesh.

For a Unity and 3ds Max example, check the Ink Brush effect tutorial. The Shader Forge Unity plugin is used along with After Effects and 3ds Max. You can find it under the Resources category here on RTVFX.

You should download both Unity and UE4 to see which one you like more. Particle systems are mostly the same thing in every engine. I’m using UE4 because it has a material editor already integrated; in Unity you have to write shaders yourself or buy a node editor like Shader Forge to edit them.

Houdini has a free Apprentice version, so you can use it to learn the software. You can achieve some effects much faster in it, but it has a steep learning curve. On the other hand it offers a freedom I haven’t found in any other big 3D package like Maya or 3ds Max, and that’s why I’m using it. However, it’s not necessary for realtime VFX production. You will probably create some effects much faster in software you are already comfortable working with.

Here are some great resources:
http://wiki.polycount.com/wiki/Special_Effects

The Gnomon Workshop is paid, but I think this tutorial is good. Other paid sites like Pluralsight or CG Cookie also have some tutorials for this.

For Houdini I can recommend these tutorials from the same author, @Partikel:
https://www.pluralsight.com/courses/houdini-vfx-games

Even if you don’t use Houdini, these tutorials can help you: for example, modifying a smoke texture’s alpha channel in UE4 to create fire, creating a simple spark texture using a radial gradient node, and much more. There is also a PowerPoint presentation from Crytek on the Polycount wiki (link above) which shows how to create a similar splash using 3ds Max.
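If you would rather script that kind of radial-gradient spark texture than build it in Houdini, here is a rough numpy/Pillow sketch of the same idea (purely my own illustration, not from the tutorials; the size and falloff exponent are arbitrary):

```python
# Rough sketch: a soft radial-gradient "spark" / glow texture, 128x128 greyscale.
import numpy as np
from PIL import Image

SIZE = 128
y, x = np.mgrid[0:SIZE, 0:SIZE]
center = (SIZE - 1) / 2.0

# Distance from the center, normalised so the texture edge is ~1.0
dist = np.sqrt((x - center) ** 2 + (y - center) ** 2) / center
falloff = np.clip(1.0 - dist, 0.0, 1.0) ** 2.5   # bright core, soft edge

Image.fromarray((falloff * 255).astype(np.uint8), mode="L").save("spark.png")
```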

Also check the Resources category here on RTVFX.
These are also good:

Check every tutorial, podcast and interview you can find, even if it’s not about the exact effect you are trying to achieve. You will find a lot of useful tips and tricks hidden everywhere.

Use Google a lot when learning a new engine (or anything new to you). You will find answers quickly, since a lot of people have hit the same problems while learning. If you find yourself in a situation where Google can’t help, just ask here.

Good Luck and Have a Nice Day.

EDIT: Added more info.


Thanks a lot! So should I first take some courses on UE4, or should I just watch VFX tutorials?

I recommend first watching some tutorials on the kind of thing you want to create (magic, realism) and then trying to recreate it. Just experiment a lot and change different values to see how they affect your effect. Create and recreate your favorite effects from games. If you want to create effects in Unreal 4 then you need to learn Cascade, which is the particle editor. There is also the material editor, which you will need, and Blueprint visual scripting, which is helpful if you know how to use it. Find the way that works best for you.


Particle effects in offline rendering tools vs. realtime VFX aren’t extremely different in how they work at a high level. The main difference is that with the tools you’re used to you might make an effect with tens or hundreds of thousands of particles; for something like mobile VFX, try to think of how to reproduce that with, say, 10 particles. Not 10 thousand, just 10. Your overall particle budgets aren’t likely to be that limited, but you’re also rarely doing just one effect at a time. On modern high-end mobile devices you might be able to get away with a few hundred particles, maybe even the low thousands depending on the hardware and use case, but they’re going to be much “dumber” particles, with limited to no collision or interaction with each other or with other objects.

Houdini has been getting popular lately for realtime VFX just because of the breadth of tools it includes, but it’s certainly not required. For realistic effects you’re likely going to be rendering out a short sequence of something from Max or FumeFX, maybe 16-64 frames, and putting that into a single large texture atlas, sometimes called a flipbook. Sometimes you can get away with just using one or two frames from something like an explosion render: one from when the explosion is big and red, and a second from when the explosion is just the smoke as it’s dissipating. You spawn the first image on one large particle sprite and scale it up quickly, then fade it out while at the same time fading in a handful of sprite particles of the dissipating smoke. Maybe you add a handful of small, bright dots flying out from the explosion as sparks. There you go, you now have the exact same setup as probably 99% of explosion effects out there. Want something that’s not realistic? Replace those two renders with some hand-drawn images.
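For what it’s worth, the flipbook playback itself is just a per-frame UV offset. You’d normally set this up in the engine’s particle or material editor rather than in code, but here is a tiny sketch of the math (my own illustration):

```python
# Sketch of flipbook UV math: given a frame index and the atlas layout,
# return the UV offset and scale of that sub-frame (0..1 UV space, top-left origin).
def flipbook_uv(frame_index, columns, rows):
    frame_index %= columns * rows           # loop the animation
    col = frame_index % columns
    row = frame_index // columns
    scale = (1.0 / columns, 1.0 / rows)     # size of one sub-frame in UV space
    offset = (col * scale[0], row * scale[1])
    return offset, scale

# e.g. frame 13 of an 8x8 explosion flipbook
print(flipbook_uv(13, 8, 8))   # ((0.625, 0.125), (0.125, 0.125))
```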


But do I need to learn the engine? If I search for “Unreal Engine 4 VFX beginner” I usually get tutorials like “how to create your first game”. Do I need to watch those? Because I’m not into making games, I just want to do effects, destruction… Should I just watch ImbueFX? I can’t find any “beginner video” there.

What do you mean by learning the engine? If you mean coding or programming, then no, you don’t need to learn that. You do need to learn the particle system, which is called Cascade in Unreal.

This official Unreal 4 tutorial should help you:

Also check the Cascade documentation on the Epic Games site.


Learn Unity. The chance that you will get your hands on an Unreal mobile game is quite slim.
Plus, there is much more content aimed directly at learning Unity game VFX and how to approach it for mobile.

IMO the most efficient way to learn it fast is to check Vu Duc’s Unity tutorials. They should be a good starting point for learning the Shuriken system and animation techniques in Unity.

Also a decent start for Unity beginners:


I’m actually not new to VFX or UE4, but I am new to mobile.

I will read up on what has already been posted at a later date (busy busy), but I do have a question I’d like to ask right now.
Let’s say I have a djinn character on mobile that has smoke/gas/whatever magical stuff around him. What would be the best approach? Make part of it part of the character mesh and use particles for details, or something else?

I’d appreciate thoughts and suggestions :slight_smile:

Look at WoW from around 2005, like Ragnaros, a water elemental or their djinns.
Check the first 10 seconds here.

That’s how I’d approach it to start with for mobile - just go easy on particle counts - you can add atmosphere with about 2-3 boards:
a scroll-x2 texture shader on ~2-3 meshes, depending on how full-screen it gets (overdraw and two texture lookups can get expensive) - a rough sketch of that scroll-combine is at the end of this post
about 40-50 ember billboards
a few trails/ribbons would be nice too

Throne of the Four Winds @ 3:25
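For the “scroll x2” trick mentioned above: the idea is just to sample one tiling smoke/noise texture twice at different scroll speeds and scales and multiply the results so the layers break each other up. This is normally a material/shader graph, but here’s a hedged CPU-side sketch of the math (texture, speeds and scales are all placeholder values):

```python
# Sketch of the "scroll x2" combine: two panned samples of the same texture, multiplied.
import numpy as np

rng = np.random.default_rng(0)
tex = rng.random((64, 64))   # stand-in for a tiling smoke/noise texture

def sample(texture, u, v):
    """Nearest-neighbour sample with wrapping (tiling) UVs."""
    h, w = texture.shape
    return texture[int(v * h) % h, int(u * w) % w]

def scroll_x2(u, v, time):
    a = sample(tex, u + time * 0.05, v + time * 0.10)                # slow, large layer
    b = sample(tex, u * 2.0 - time * 0.07, v * 2.0 + time * 0.03)    # faster, smaller layer
    return a * b   # multiplying makes the layers visually break each other up

print(scroll_x2(0.25, 0.5, time=1.5))
```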


Hi Temzy,

If you’re doing something for mobile or tablets, there are some technical limitations which are important to know.
I’m speaking here from my own production experience!!!

  • Only 3 UV sets are possible.

  • Small texture sizes, mostly a maximum of 512x512, so it’s hard to use them for sprite animations.
    Especially if you need sharp and clear shapes, you will have fewer frames.

  • Textures have to be power of two (standard in realtime games)!
    For example 8x8, 16x16, 32x32, 64x64, 128x128, 256x256, 512x512, 1024x1024, 2048x2048, 4096x4096, etc.
    They can also be non-square, like 1024x256 or 2048x512.
    We used square sizes, but this can change from engine to engine or from production to production.
    What matters is that width and height are each a power of two.
    I would use power-of-two sizes for pre-rendered stuff as well;
    the GPU handles these sizes better for memory and performance than arbitrary ones (see the sketch after this list).

  • Skeleton/bone rigs cost a lot of performance! Use skeletal/bone meshes as little as possible.
  • No post-processing available! Bloom works on most high- and mid-range devices, but only through
    specially written scripts or plugins for the engine. Be careful with those scripts; they can cost too much
    performance.

  • Draw calls are limited. Every material is one draw call, so don’t use more materials than necessary.
    The good thing with effects is that they mostly just pop up for a second or a few seconds.
    Depending on how long the effect is visible in the game, you have some freedom to go a bit crazier than usual.
    Just be careful! :slight_smile:

  • Watch particle count and mesh count. Don’t use hundreds of particles on a single emitter, and don’t use thousands of polygons for a mesh particle!

  • No vertex animation manipulated per pixel (only at higher shader/material feature levels like ES 3.1, Vulkan or Metal).

  • Reflection/refraction costs too much performance. Try not to use it! It may be usable on high-end mobile/tablet devices.

  • No blob meshes until a higher feature level, like ES 3.1, Vulkan or Metal.

  • The highest shader models on mobile/tablets are Vulkan (Android) and Metal (iOS).
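As a small aside on the power-of-two point above, here is a hedged sketch of a helper that checks texture dimensions and rounds them up to the next power of two (my own example, not tied to any engine):

```python
# Sketch: check whether texture dimensions are powers of two, and round up if not.
def is_power_of_two(n):
    return n > 0 and (n & (n - 1)) == 0

def next_power_of_two(n):
    p = 1
    while p < n:
        p *= 2
    return p

def suggest_size(width, height):
    if is_power_of_two(width) and is_power_of_two(height):
        return width, height
    return next_power_of_two(width), next_power_of_two(height)

print(suggest_size(512, 512))    # (512, 512)   - already fine
print(suggest_size(1000, 600))   # (1024, 1024) - rounded up to powers of two
```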

If you’ve never done realtime VFX, play with the engines first and learn to make your effects, and afterwards try to keep these points in mind. Have fun!!! :slight_smile:

I hope I didn’t scare you!!! :smiley:

Cheers
Emre


“could be also 1024x256 or 2048x512.”

Not sure if it’s a DX or UE4 thing, but on the GPU those take up the memory of a 1024x1024 and a 2048x2048 respectively, as they get squared. So in those cases it’s sometimes best to add other content to the texture so it becomes a square power-of-2 texture that you offset in the shader.

Just a rando suggestion.

I’ve never heard of non-square textures using a square’s worth of GPU memory before for either DX or UE4. I know “NPOT” (non-power of 2) textures often use the next largest power of 2 dimension of GPU memory, especially if they are using mip maps, but not that they have to be square. I’m curious if you have a link to information on that somewhere. It’s absolutely not the case for OpenGL 2.0 or OpenGL ES 2.0 to the best of my knowledge; non square textures use less GPU memory, and as long as they’re not NPOT there’s no performance hit for using them.

However this thread is on mobile, and for mobile things can get a little “fun”.

For iOS, PVRTC textures must be square, along with having power of two dimensions; the texture format doesn’t support anything else. The hardware in iOS devices supports DXTC / BC formats, but iOS does not. ASTC and PVRTC2 don’t have this limitation and are perfectly fine being non-square or even non-power-of-two dimensions. Unfortunately ASTC is a relatively new addition to iOS devices, starting with the iPhone 6; even though it is technically a requirement of OpenGL ES 3.0, some “gles 3.0” iOS devices (like the iPhone 5S and iPad Mini 2 & 3) don’t support it! PVRTC2 has been supported since the iPhone 5, though Unity still doesn’t offer it. :cry: It gets extra confusing as some engines will list a “PVRTC2”, but are actually referring to PVRTC’s 2bpp mode. Using an uncompressed texture on iOS lets you do non-square and NPOT all you want, but there can be a decent performance hit for using them.

For Android GLES 2.0 devices there’s no such limitation on square vs. non-square, but NPOT can come with a significant performance hit for the same reason as on iOS: most compressed formats don’t support it, so you would have to go uncompressed.

But, because of that fun issue with a ton of iOS devices out there really only supporting square power of two textures, it’s probably best to keep to that when you can. Plus, a 34x33 uncompressed texture is going to use more memory than a compressed 64x64 texture no matter which texture compression is used! Every common texture compression format is going to get at least 4:1 compression, and some get better, so just use a compressed texture.
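To put rough numbers on that last point (my own back-of-the-envelope arithmetic, assuming 32-bit RGBA for the uncompressed texture, a 4 bits-per-pixel compressed format like PVRTC 4bpp or DXT1, and no mip maps):

```python
# Back-of-the-envelope texture memory comparison (no mip maps).
def uncompressed_bytes(w, h, bits_per_pixel=32):   # e.g. RGBA8
    return w * h * bits_per_pixel // 8

def compressed_bytes(w, h, bits_per_pixel=4):      # e.g. PVRTC 4bpp / DXT1-class
    return w * h * bits_per_pixel // 8

print(uncompressed_bytes(34, 33))   # 4488 bytes for the small odd-sized uncompressed texture
print(compressed_bytes(64, 64))     # 2048 bytes for the larger square compressed one
```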

If you’re targeting purely gles 3.0 and use ASTC there’s no requirement for power of 2, or square, and no performance impact for using them. The texture format was explicitly built to support this and still be efficient on hardware.