Considerations for Real-Time VFX with LED-Walls/Volumes

For my master's thesis, I've spent the last year looking into the performance and visual fidelity considerations involved in making real-time VFX for in-camera VFX. Since I had a hard time finding information on this, I was hoping to share a summary of some general considerations to help when starting out creating VFX for an LED-wall/volume.

Using real-time rain VFX, I recorded the impact on FPS as more particles were added, and through interviews with a couple of professionals the following considerations were formulated.
The goal of these considerations is to aid in the beginning stages of making real-time VFX by offering points to think about, experiment with, and test before finalizing the VFX, not necessarily to guide the creation itself. Since I've only been able to test with rain VFX as an example, there will likely be more considerations necessary depending on the intended effect.

The full research, with additional information and context (like the performance test results, the videos used for the interviews, and the scene), as well as a document version of this post, can be found in my link tree.

Technical Considerations

  • Performance Target
    The performance target for the virtual scene on the LED-wall/volume should at minimum match the camera's FPS, though a higher FPS improves smoothness.

  • Hardware Differences
    Hardware differences between a developer's device and the LED-wall/volume need to be accounted for when targeting the required FPS on a developer's PC. By testing this difference, a target FPS can be formulated (a small sketch follows after this list):
    Target FPS = Camera FPS + nDisplay overhead + Buffer

  • GPU over CPU
    If an effect can be faked on the GPU instead of simulated as a CPU VFX, this will likely leave more room for performance.

  • Moiré Effect
    The moiré effect causes artifacts when recording a screen with a camera. To avoid this, the LED-wall is often kept slightly out of focus, which can blur smaller details.
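
To illustrate the Hardware Differences point, here is a minimal sketch of how the target-FPS formula could be used. It is only an assumption-driven example: the numbers are hypothetical placeholders, and the nDisplay overhead would come from measuring the same scene on both the developer PC and the LED-wall/volume.

```python
# Minimal sketch of the target-FPS idea; all numbers are hypothetical placeholders.
# Measure your own camera FPS and the dev-PC-vs-wall (nDisplay) difference,
# then pick a buffer that fits your production.

def target_fps(camera_fps: float, ndisplay_overhead: float, buffer: float) -> float:
    """Rough FPS to aim for on the developer PC so the wall still keeps up."""
    return camera_fps + ndisplay_overhead + buffer

if __name__ == "__main__":
    camera_fps = 24          # e.g. shooting at 24 fps (assumption)
    ndisplay_overhead = 12   # measured dev-PC vs. LED-wall difference (assumption)
    buffer = 10              # headroom for on-set changes and spikes (assumption)
    print(f"Develop against roughly {target_fps(camera_fps, ndisplay_overhead, buffer):.0f} fps")
```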

Visual Considerations

  • Exaggeration
    Slightly exaggerating aspects of the VFX can help enhance visibility and improve recognizability. Having the effect work in the context of the scene can be more important than high visual fidelity or a realistic look.

  • Randomness
    Having some randomness in the effect, for example by using noise textures or by breaking up patterns in the particles, can make the scene feel more natural and avoid a too-clean look.

  • Layering
    The layering and amount of the effects should be carefully balanced so they don't overwhelm the scene. Avoid uniform coverage and excessive overlap, as this can quickly become too much once the effect is also replicated on the physical set.

  • Building Blocks
    Breaking the VFX up into its components (building blocks) can increase readability and make it easier to art-direct the VFX when it is implemented in the scene.

  • Lighting
    Interaction between lighting and other scene elements enhances the integration and realism of the VFX.

  • Materials
    Materials that reflect realistic properties can add a lot to the feel of the VFX, but might eat into the performance budget. For an effect close to the camera this might be needed, but at a greater distance a simpler material can do the trick.

  • Blending
    Blending the VFX with the physical set means the VFX should be adjusted to match the practical solution used in the foreground on set to replicate the real-world effect. That practical solution might not fully replicate the effect as it would occur naturally.

  • Scale
    Depending on the VFX, the actual scale of the real-world effect might not work well with the scale necessary for the LED-wall. Increasing the scale can improve readability and help ensure the effect reads clearly and isn't mistaken for another, similar effect.

  • Interaction within the scene
    Having the appearance of interaction between elements in the scene and the VFX can help integrate the VFX and adds to its believability. Note that actual interaction, simulated through collision detection for example, is not necessarily needed (a small sketch follows after this list).
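
To illustrate the interaction point, here is a minimal, engine-agnostic sketch of faking rain interaction without collision detection, assuming the rain lands on a flat ground plane at a known height. The function names and values are hypothetical; in practice this logic would live in the particle system as a kill/spawn condition.

```python
# Fake rain "interaction": spawn a splash at an assumed flat ground plane
# instead of running real collision detection against the set geometry.

import random

GROUND_Z = 0.0  # assumed known floor height of the virtual set

def update_raindrop(position, velocity, dt):
    """Move a drop; when it passes the assumed ground plane, fake a splash there."""
    x, y, z = position
    z += velocity * dt                  # velocity is negative (falling)
    if z <= GROUND_Z:
        spawn_splash((x, y, GROUND_Z))  # splash at the plane, not at a real hit point
        return None                     # kill the drop instead of colliding it
    return (x, y, z)

def spawn_splash(position):
    # Stand-in for spawning a small ripple/splash sprite; slight random scale
    # keeps repeated splashes from looking identical.
    scale = random.uniform(0.8, 1.2)
    print(f"splash at {position} with scale {scale:.2f}")

if __name__ == "__main__":
    drop = (1.0, 2.0, 5.0)
    while drop is not None:
        drop = update_raindrop(drop, velocity=-10.0, dt=0.1)
```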

As mentioned before, I gathered these considerations through testing with real-time rain VFX, so I don't think I'm covering the full range. If you have more perspectives or experiences to add to these considerations, I think that would be very valuable.
