Today we revealed the Visual Effect Graph, the new entry point for next-gen visual effects in Unity. It targets modern compute-capable hardware and renders with the HD Render Pipeline on PC and consoles. It is currently released as a preview, starting with Unity 2018.3.
It relies on standalone effect assets containing multiple effect components (particle systems, meshes), a parameter interface, and events that trigger spawn behaviors. The graph uses a top-to-bottom flow for logic, coupled with a left-to-right flow for composing the math expressions that feed into the simulation blocks. Under the hood it uses compute shaders and CPU bytecode to keep memory footprint, computation, and rendering cost to a minimum.
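To give an idea of what the parameter interface and events look like from the scripting side, here is a minimal sketch driving a graph through the VisualEffect component. In the 2018.3 preview the namespace is UnityEngine.Experimental.VFX, and the parameter and event names below ("SpawnRate", "BaseColor", "OnBurst") are made-up examples that would have to match your own graph:

```csharp
using UnityEngine;
using UnityEngine.Experimental.VFX; // UnityEngine.VFX in later versions

// Drives a graph's exposed parameters and sends an event from gameplay code.
// "SpawnRate", "BaseColor" and "OnBurst" are hypothetical names defined in the graph.
[RequireComponent(typeof(VisualEffect))]
public class EffectDriver : MonoBehaviour
{
    public float spawnRate = 32f;
    public Color baseColor = Color.cyan;

    VisualEffect vfx;

    void Start()
    {
        vfx = GetComponent<VisualEffect>();
        // Exposed parameters are addressed by the name given in the graph.
        vfx.SetFloat("SpawnRate", spawnRate);
        vfx.SetVector4("BaseColor", baseColor);
    }

    void Update()
    {
        // Events feed the spawn contexts defined in the graph.
        if (Input.GetKeyDown(KeyCode.Space))
            vfx.SendEvent("OnBurst");
    }
}
```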
GitHub method: you can access the editor right now as a preview package by cloning the GitHub repository of the Scriptable Render Pipeline. For now this is a temporary, rough step, but in the coming weeks you should be able to use the package directly from the Package Manager in the editor. When the time comes we will update the instructions with easier steps.
Thank you in advance for helping us make this happen. If you create stunning stuff, please share it with the hashtag #VisualEffectGraph
How do I install it exactly? I pasted the whole VFXGraph folder into my project's Assets folder and I'm getting tons of errors such as:
Shader error in 'Temp_compute_b_update_Runtime.compute': failed to open source file: 'Packages/com.unity.visualeffectgraph/Shaders/Common/VFXCommonCompute.cginc' at kernel CSMain at System.vfx(21) (on d3d11)
The question is, can I work with my particles directly in C#? Custom nodes, etc.?
The demo looks pretty impressive. I really like the whole hologram stuff.
If you need to write custom nodes, there's an API you can extend (VFXBlock, VFXOperator), currently by placing your own classes in the com.unity.visualeffectgraph/Editor hierarchy (all standard library nodes and blocks are located in the Models subfolder).
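For reference, a custom operator dropped into that Editor hierarchy looks roughly like the sketch below, modeled on the built-in operators in the Models subfolder. The API is internal and still moving, so treat the attribute, base class, and method signatures here as indicative rather than final:

```csharp
using UnityEditor.VFX;

// Rough sketch of a custom operator, modeled on the built-in ones in Models/.
// The internal API is subject to change; names and signatures are indicative only.
[VFXInfo(category = "Math")]
class VFXOperatorSquare : VFXOperator
{
    // Input/output slots are declared through nested property classes.
    public class InputProperties
    {
        public float x = 1.0f;
    }

    public class OutputProperties
    {
        public float o;
    }

    public override string name { get { return "Square (custom)"; } }

    protected override VFXExpression[] BuildExpression(VFXExpression[] inputExpression)
    {
        // Expressions are composed here and baked into the graph's bytecode / shaders.
        return new VFXExpression[] { new VFXExpressionMul(inputExpression[0], inputExpression[0]) };
    }
}
```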
Later on we shall provide a zero-code solution so artists can do the same, without the hassle of using C#.
Thanks for the kind words about the demo! It was really fun making these holographic effects.
Oh, got it. This will be a bit chaotic as I'm testing while I'm writing, so sorry about that!
UI is quite messy at this point - it’s hard to understand what’s what and it needs a lot of visual polish (but I guess that’s to be expected at this stage)
Exposed parameters have no UI representation once in the graph. I would expect a Texture param to show a texture icon on its node, and a Color param to be tinted with the defined color.
The node selection window is hard to read, and it's hard to find the nodes I need.
Are Texture2D objects old-school DX9 tex2D things? How do I go about using separate samplers and texture objects?
Hi Gaxx, a couple of small clarifications about your instructions.
If you only grab the visualeffectgraph package, you will probably end up with a lot of broken rendering features.
Right now, the best bet is to also use the HD Render Pipeline code that's in sync in the same repository, so you can benefit from Lit particles. So for now you need to include SRP Core, HD Render Pipeline, and ShaderGraph as well.
Where is GradientHDR? I can have HDR Color but not HDR Gradient as a param?
Dragging nodes around is hard. I have to click the border of the node, otherwise it tries to draw a connector. I think nodes should be draggable by clicking and holding any part of the node; the connector should only be accessible by clicking on the circular output pin.
The current AnimationCurve UI is a step back compared to Shuriken: it has no bounds, so it's hard to work in 0-1 space.
It would be nice to have an option to drag a node together with all the input nodes connected to it. Right now I have to move my spaghetti around by first selecting all the connected nodes manually.
I would expect parameter categories to be reflected in the graph's inspector. Right now I have a wall of parameters with no organization. Update: it actually works, but I had to re-open the Inspector window.
How do I get the Depth Buffer to set it in the Collider (Depth) module?
Where do I access all the debug info, such as memory allocation cost, update cost in ms, etc.? Do I have to run the Profiler to check these numbers?
Thanks for all your feedback. We are indeed in a preview phase, so the UX and UI are not final at the moment; we are trying to improve them as much as we can to smooth out the experience, so any feedback is appreciated.
For the node selection window, we advise using the search field to discover more, as it surfaces many variants of what you need. For instance, typing "Scale col" will bring up Scale Color, Scale Color from gradient (over life), and Scale Color from Map. Still, this menu is meant to be improved.
Congrats on the release, guys! At last!
The spaceship demo kicks serious ass!
I also really like the butterflies & misc effects in the other videos.
It would be nice to have an option to drag a node together with all the input nodes connected to it. Right now I have to move my spaghetti around by first selecting all the connected nodes manually.
Big +1 on that one. It’s something we had in the very old PopcornFX v0.1 nodegraph, and it proved SUPER useful.
If I can suggest something: make it not just an option to move the nodes, but to actually select them.
That way you can move them, but also copy/cut/delete them. Very useful.
From what I remember it was done with a Shift+Click, so you could also Ctrl+Shift+Click to append other unrelated sub-branches to your selection. But maybe that’s not so important in the context of Unity’s VFX graph.
Hi, I would like to know how to create vector fields in Unity. There are no tutorials on the Internet at all. Could you at least give a small breakdown of how you created the vector field for the Unity logo?
Hi guys, I think it uses a volume texture to create the vector fields, and I found this tool to make volume textures, but I don't know how to use it… I have tried a lot of times but still fail to create a usable volume texture…
If someone wants to try, click here: GitHub - mattatz/unity-volume-rendering: Volume rendering by object space raymarching for Unity
And if you manage to create a usable volume texture to control the vector fields, please tell me how it works!
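In the meantime, if you just want something to experiment with, a Texture3D generated from an editor script can stand in as a vector field, since the vector-field blocks take a Texture3D input. This is a minimal sketch that bakes a simple swirl field; the resolution, format, asset path, and field function are arbitrary example choices, not anything official:

```csharp
using UnityEngine;
using UnityEditor;

// Bakes a simple procedural swirl field into a Texture3D asset that can be
// assigned to a Texture3D / vector-field slot in a VFX Graph.
// Resolution, format and the field function are arbitrary example choices.
public static class VectorFieldBaker
{
    [MenuItem("Tools/Bake Example Vector Field")]
    static void Bake()
    {
        const int size = 32;
        var tex = new Texture3D(size, size, size, TextureFormat.RGBAHalf, false);
        var colors = new Color[size * size * size];

        for (int z = 0; z < size; z++)
        for (int y = 0; y < size; y++)
        for (int x = 0; x < size; x++)
        {
            // Normalized position in [-1, 1].
            Vector3 p = new Vector3(x, y, z) / (size - 1) * 2f - Vector3.one;
            // Simple swirl around the Y axis; directions are remapped to [0, 1] for storage.
            Vector3 v = new Vector3(-p.z, 0.2f, p.x).normalized;
            colors[x + y * size + z * size * size] =
                new Color(v.x * 0.5f + 0.5f, v.y * 0.5f + 0.5f, v.z * 0.5f + 0.5f, 1f);
        }

        tex.SetPixels(colors);
        tex.Apply();
        AssetDatabase.CreateAsset(tex, "Assets/ExampleVectorField.asset");
    }
}
```

Depending on how the vector-field block interprets the texture, you may need signed values instead of the [0, 1] remap used here.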
It's pretty much what you described (it's a SOP, actually). We're pretty close to shipping these exporters (final steps of wrapping things up so they're easier to install and use).
Hello guys. I've been testing the VFX Graph for an hour and I haven't found a way to emit from a mesh; do you know if it's possible? I've also been testing Project on Depth (position) using a depth render texture, and the result hasn't been what I expected. The VFX Graph looks awesome anyway; I need to play with it more.