A legit request about vfx software used?

Hi, I am in the VFX industry as a community builder, working in social media with a working knowledge of some software. I wonder if Real-Time VFX could have a space where VFX artists can mention the software they use, since we cannot really separate the FX from the software. In my role I do a lot of insight and new-talent interviews with 3dsMax/TP artists, and they share valuable information. It can be tough for those of us not in production to know what tools artists use, and it would help a lot if artists could also mention the software. :slight_smile:

1 Like

A majority of game engines have a built-in system for VFX, which will be the primary way artists author effects. So Unity/Unreal/Frostbite/whatever all count as VFX software - in a sense.

In terms of creating the textures and meshes that effects are composed of, this is more a matter of personal preference. FumeFX, Houdini, and Maya are popular choices for creating photorealistic textures of smoke and fire, while Photoshop and Substance Designer are commonly used to create different types of patterns and masks.

That aside, my personal opinion is that migrating between different software isn’t too hard and something most artists are accustomed to doing. Being proficient in a certain software is useful, but it’s more important to possess a solid understanding of animation, composition, and the technical aspects of realtime graphics!

2 Likes

Thanks Chris. Any 3dsMax? :slight_smile:

I personally mainly use Inkscape, Photoshop, and Max for my VFX work.
Then again, I don’t really do realistic-style stuff.

1 Like

3dsMax/Maya + most of the Adobe suite (Photoshop, After Effects, Premiere, Animate/Flash) for me.

Generally: a 3D package of your preference, a drawing/editing package, and then further software that’s more specific to your stylistic goals.

1 Like

I use everything that does anything with graphics. If you can think of it, I’ve used it at some point. So, just to name a few:


Maya + a slew of plugins
3ds Max + a slew of plugins
Softimage
Blender
Cinema 4D
Mudbox
ZBrush
Substance Designer
Photoshop
After Effects
Premiere
Houdini
PopcornFX
Unity3D
Lumberyard
UDK / UE3 / UE4
Fragmentarium
Bryce
An embarrassingly large number of proprietary tools / engines.
SlimDX
ShaderToy
RenderMonkey

etc


When it comes to RT VFX we’re all just trying to make the best of what we can afford in performance. To that end, any software you come across can become an arrow in your quiver, a tool in your Swiss Army knife. If anything, it’ll teach you some tricks you can find a use for later on in your career.

RT VFX artists come from VERY varied backgrounds and areas of expertise. Some came from the movie industry, some from motion graphics, some from illustration, some from more traditional 3D modeling. On all the projects I worked on that had more than just me as a VFX artist, each member of the team had different expertise and could teach the others a bunch of things. So what software ends up being used is most often a combination of what the individual knows / has mastery of + the game engine you happen to work with (or what licences your office can afford / chooses to pay for).

4 Likes

Luos, and the rest: I see this forum is about Real-Time VFX, so can someone tell me what exactly the difference is between realtime VFX and everyday film post-production VFX? And is this forum only for game VFX, or does it include films as well? :slight_smile:

Thanks Sarah, so mostly 2D materials for you and not so much 3D?

Thanks Mederic. Wow! You must have invested years in learning all of these software packages/plugins, although I see you specifically don’t know TP :slight_smile: And you are proficient in both 2D graphics and 3D Max/Maya/Blender/ZBrush environments? That’s a long list; hopefully there’s a project in the near future which will introduce you to TP as well, or finalRender. :slight_smile:

To me, if the effect can run in realtime, it’s… well… realtime VFX.
If it takes time to render, simulate, etc., it’s not realtime.
You can obviously render out or simulate stuff and turn it into Alembic/flipbooks/vertex animations so that it runs in realtime.
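
To make that concrete, here is a minimal sketch (not from the thread; the function name, atlas layout, and numbers are illustrative) of how playback of a baked flipbook works at runtime - the realtime cost is just a UV offset into an atlas of pre-rendered frames:

```python
# Illustrative sketch: a sim is rendered offline into a cols x rows atlas of
# frames ("flipbook"); at runtime the shader only picks a cell and shifts UVs.

def flipbook_uv(u, v, time_sec, fps=30, cols=8, rows=8):
    """Map a quad's 0-1 UVs into one cell of a cols x rows flipbook atlas."""
    total_frames = cols * rows
    frame = int(time_sec * fps) % total_frames   # which pre-rendered frame to show
    col = frame % cols
    row = frame // cols
    cell_w, cell_h = 1.0 / cols, 1.0 / rows
    return (col * cell_w + u * cell_w,           # shift + scale into the cell
            row * cell_h + v * cell_h)

# Example: 1.2 seconds into playback, centre of the quad
print(flipbook_uv(0.5, 0.5, 1.2))
```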

1 Like

Realtime VFX (another term for it would be game VFX).

If it takes more than 33ms to simulate and render, it’s not realtime. :slight_smile:

1 Like

As was said a bit above, realtime means it’s happening as you see it. Think about rendering time plus simulation time; add them together and you get a result we measure in milliseconds. The road to get there is paved with many pieces of software that we use to extract parts of data, which we combine at a later point so that the final VFX can run in a viewport rather than in a video. So it’s not just game VFX, but any VFX whose “render time” is less than the frame time (less than 33ms at 30 frames per second, for example). That way we can rotate the view around the effects and look at them from any angle, because they compute in less than a frame and there is no need to encode them into a static video. If they are not video renders, then they are rendered in “real time”: computed as you see them.
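
As a quick illustration of that arithmetic (illustrative numbers, not anyone’s production budget): the per-frame budget is simply 1000 ms divided by the target frame rate, and simulation plus rendering has to fit inside it.

```python
# Rough sketch of the frame-budget arithmetic being described: everything -
# simulation, rendering, post - has to fit inside one refresh of the display.

def frame_budget_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 90):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 90 fps (common for VR) -> 11.1 ms
```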

1 Like

This was a question I nearly posted as its own thread. As hardware and software progress, the millisecond budget and the things you can create in RT change. Just the advent of the smartphone and its progress changes that envelope dramatically.

I do hand-drawn effects that are simply flipbooks on a billboard and are “magically” real-time (?). I’ve actually held off on a portfolio dump due to the work being largely 2D.
They are rendered work IMO, even though I love seeing it here on RTVFX.

So where is that line?
(Sort of a rhetorical question in many ways, as I can’t define it.)

1 Like

@Mederic, thank you for the elaboration. Would realtime VFX (game or film) fall under ‘post-production’ or ‘production’? I am still trying to figure out how there isn’t any need to render something before releasing it as final. So basically, realtime or not-realtime is just categorized based on ‘rendering time’: anything more than 33ms (most large productions) will not be realtime. This would mean that only small projects are seen as realtime. So, realtime = WIP (production) and anything that is not post-production (since we are talking viewport sims)?

So Torbach, if you use something like finalToon (toon lines and cartoons): cebas.com/finalToon
Would that be possible in realtime, or would such additional line styles slow everything down on mobile?

For reference, see games R&D with finalToon: https://youtu.be/xTpnPPA6PxA
Is this kind of tutorial useful to people in this forum at all? :slight_smile:

Ced

Adding outlines to 3D work in realtime is possible - a rendering engineer/tech artist facilitates this.

I think rendering techniques like finalToon are only relevant when implemented as a game engine shader, not in a rendering package like Max.

1 Like

Things like that are implemented as a post process in games.
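
For anyone wondering what “implemented as a post process” means in practice, here is a hedged sketch of one common approach - detecting edges from depth discontinuities. This is CPU-side NumPy standing in for what a fullscreen shader would do; the function and threshold are illustrative, not a specific engine’s or finalToon’s implementation:

```python
# Sketch of a depth-based outline post process: compare each pixel's depth to
# its neighbours and mark an outline wherever the jump is large.
import numpy as np

def outline_mask(depth, threshold=0.05):
    """depth: 2D array of normalized scene depth; returns 1.0 where an outline goes."""
    dx = np.abs(np.diff(depth, axis=1, prepend=depth[:, :1]))  # horizontal depth jump
    dy = np.abs(np.diff(depth, axis=0, prepend=depth[:1, :]))  # vertical depth jump
    return ((dx + dy) > threshold).astype(np.float32)

# Toy depth buffer: a near "object" (0.2) in front of a far background (0.9)
depth = np.full((8, 8), 0.9)
depth[2:6, 2:6] = 0.2
print(outline_mask(depth))  # 1s trace the silhouette of the near object
```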

1 Like

It’s still rendering, it’s still released as final… and there is still a battle over who’s doing it best. I think where you fail to understand, and I say this because of your choice of words, is what you consider “final”. You’re trying to put separations and boxes where there aren’t any.

It’s not just about films or games. It’s also about anything that needs representation in 3D / VR / on screen. So “production” or “post-production” really depends on the project. Like… what’s “post-production” for a medical scan device? Realtime 3D imagery of medical scans: you can view the 3D scan in VR or on screen directly as the camera/scanner is moving inside someone’s body. Responsiveness is really important, so you need to take the scanning device data, build the 3D representation, and present it in 33ms (for 30 fps) or less, so that the model is responsive and the tool useful.

At the end of the day, all it is is that the image is rendered by a renderer, and it’s rendered in less time than a single frame takes at your view’s refresh rate.

Technically, Windows renders the very window you opened to browse this forum. Do you need to wait for render time to move that window around? No, because it’s rendered in realtime, aka faster than your monitor’s refresh rate. Is it final? Yes, very much so.

Usually a software renderer like V-Ray will never reach realtime; it’s just not fast enough. Game engines render using hardware rendering interfaces like OpenGL or DirectX, which come with their own technical challenges and ways of working that differ greatly from software rendering pipelines.

It seems to me that you are either unaware of, or ignoring, the hardware rendering side entirely.

2 Likes