I'm pretty new to shaders and effects, and I'm having a hard time finding resources to learn from.
I have a two-part question.
I'm using Amplify Shader for Unity, and most of the nodes are not very well documented. I would like to know if there are some books or tutorials that could be helpful for a newcomer to node-based shader editing. I'm looking for use cases for different nodes and explanations of different terms.
In what tool would somebody be able to make some of the realistic flame, fluid, and smoke animations found in asset packs like these: Unity Asset Store - The Best Assets for Game Making? I'm mostly using After Effects to make sprite sheets, but I see that some of the effects look too high-res for a small sprite sheet.
I would really appreciate any help I can get. Thank you in advance.
Don't have an answer for your first question, but for question 2, Houdini is a popular one. I personally haven't gotten very far into sims, but when I have, I used Maya and FumeFX. Actually, taking another look at the video you posted (it wasn't loading before), a lot of that stuff can be achieved within the shader/material (minus the ink effect at the end, which is obviously a sim). I'm not a Unity user, but there seems to be a lot of cool work being done using ShaderForge in Unity. @Sirhaian released a lot of cool work that might prove useful to study: Releasing my League VFXs Fan-Arts for Study Purposes - #21 by CellarPhantom
Didn't really answer your questions, but hope this helps.
Question 1: Amplify Shader's nodes are very similar to Unreal Engine's, so you can search the official Amplify Shader Nodes documentation to find out what each one means.
Question 2: You can ask the maker, kripto289, what he uses. Usually Houdini or FumeFX is used to simulate fluids, smoke, or clouds.
PS: ShaderForge is also a shader plugin for Unity.
Thank you. I have read the documentation for Amplify Shader, but most of the nodes don't mean much to me because I don't understand the terms they stand for. I have yet to find a really good tutorial that explains what most nodes do by themselves. Maybe you don't get what I mean. For example, if I drag out the World Normal node, its description states: “Per pixel world normal vector”, and I don't understand which pixel. Where exactly is the pixel pointing its vector from?
I lack the fundamental knowledge and would like to have a good starting point to learn from.
Node-based shader editors help with some parts of writing shaders, primarily when fiddling with values to get a specific look rather than trying to do something specific mathematically. When you start getting into things like “per pixel” and “normal vectors”, you're approaching the point of needing a more complete knowledge of what shaders are and how they work … and some basic vector math to boot.
To answer this specific question of what “Per pixel world normal vector” means:
The pixel in question here is the screen pixel of the object being rendered. Usually in graphics terminology, when someone uses the term “pixel” they're referring to a pixel on the screen, or perhaps a pixel in the texture being rendered to, as in the case of render textures. If someone is talking about pixels from a texture, we use the term “texel”, aka “texture pixel”.
Why per pixel vs something else? This gets into how normals are calculated for meshes. The most basic method is linearly interpolated vertex normals. Each vertex stores the direction it's facing, and for each pixel drawn of a triangle being rendered, a normal is calculated as a blend of the three normals recorded in that triangle's vertices. If you want to understand that more, look up barycentric interpolation … then shake your head, as almost all of the documents you find will be super math-heavy for something that's actually fairly easy to understand.
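The idea is simple enough to sketch in a few lines. This is plain Python rather than shader code, and the vertex normals and weights here are made up for illustration; the GPU does the same blend automatically for every pixel it rasterizes:

```python
import math

def normalize(v):
    # Scale a 3D vector to unit length.
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def interpolated_normal(n0, n1, n2, w0, w1, w2):
    """Blend three vertex normals by barycentric weights (w0 + w1 + w2 == 1)."""
    blended = tuple(w0 * a + w1 * b + w2 * c for a, b, c in zip(n0, n1, n2))
    # Renormalize, since a linear blend of unit vectors is usually shorter than 1.
    return normalize(blended)

# Example: a pixel in the exact middle of a triangle gets equal weights,
# so its normal is an even blend of all three vertex normals.
n = interpolated_normal((1, 0, 0), (0, 1, 0), (0, 0, 1), 1/3, 1/3, 1/3)
```

The weights come from where the pixel sits inside the triangle: a pixel right on a vertex gets that vertex's normal unchanged, and everything in between is a smooth blend.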
That’s a triangle with each vertex a different solid RGB color, the colors in between are the result of barycentric interpolation.
Now, I just said this is calculated for each pixel, but this is not necessarily what we're talking about when we say per-pixel normal; this is just the interpolated vertex normal, though the two can be the same. The real difference appears when you're using normal maps. Since normal maps modify the normal using a texture, the modification can't be done at each vertex and interpolated to get the result you expect. So it has to be calculated per pixel … and thus is the per-pixel normal.
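To make that concrete, here's a plain-Python sketch of the standard unpack step a shader performs per pixel. The `(128, 128, 255)` value is the classic light-blue “flat” normal-map color; the helper name is made up, but the remap from the [0, 255] color range back to a [-1, 1] direction is the standard convention for 8-bit normal maps:

```python
def unpack_normal(rgb):
    """Convert an 8-bit normal-map texel back into a [-1, 1] direction."""
    return tuple(c / 255.0 * 2.0 - 1.0 for c in rgb)

# The classic light-blue "flat" color decodes to roughly (0, 0, 1),
# i.e. a normal pointing straight out of the surface.
flat = unpack_normal((128, 128, 255))
```

Because this lookup happens at every shaded pixel, the surface can appear far more detailed than its triangle count would suggest.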
World normal here is also relevant because the normals on the vertices of the mesh are in object/mesh space. If you rotate the mesh, or scale it non-uniformly, they'll no longer match world space, so they have to be transformed from object space to world space. Normal maps are also not (usually) in world space; they're in tangent space, which means relative to the surface normal (aka the interpolated vertex normal) and the orientation of the texture UVs. If you want to try to wrap your head around that, there are plenty of places to start, but I suggest you start someplace a little simpler.
I suggest starting with this tutorial, which goes into both the basics of shaders and some Unity specifics. It'll cover surface shaders and vertex/fragment shaders, and lots of stuff that node-based editors abstract away or straight-up hide from the user, but it's still useful to read.