No GDC for me this year, too much going on - so this is as much as I could do for the moment. Hopefully it's interesting to y'all, and if there are any questions let me know and I'll see what I can answer.
Read it a couple of days ago and it blew my mind in so many areas. The leaves system and bending foliage using particles were really cool to see. Our studio is about to start R&D for the next project pretty soon and I'm super excited to try out similar techniques.
Awesome, glad people found it helpful. It was a full 180-degree turn from Second Son, so the types of things we were asked to do were quite different. Second Son was like an 80/20 character/env split; Ghost pretty much flipped that on its head, with a small exception for a bit of Legends mode work. There's so much amazing work out there (and a lot of new super-talented people on here since I last visited!) I'm happy to still feel relevant, haha. Ghost was positively massive relative to Second Son, and for much of the dev time I was the sole VFX artist, so I was trying to come up with ways to create self-maintaining / stable content that looked good and had a ton of re-use.
Some awesome systems there and some really nicely executed effects!
I was wondering, are you creating a global force setup where you write the fire updraft into a global field, or do the particle systems pass the force information between them? If the particle systems pass data, is that done automatically (for instance, based on distance) or do you need to manually specify which particle systems communicate with each other?
Also, for driving particles based on the river flow, do you actually read data from the river mesh (like flow data on the vertices) or do you generate a vector field for the rivers so you can read that data?
@jameswatt There is a global set of data for the wind fields that you can write into. Particle emitters ignore this field by default, so you need to opt in to read it. Particle systems can read a bunch of data from the surrounding game world, but it's not a user-defined set from the VFX end - it's always data that I've asked the code team to provide. Things like terrain or water position, world wetness from rain, etc. Emitters themselves don't communicate with each other outside of wind or displacement - it's something we might pursue on our next project if the need arises.
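To make the opt-in idea concrete, here's a minimal sketch of a shared wind field that effects write into and emitters only read if flagged. All names, the grid layout, and the 2D simplification are my own guesses for illustration, not Sucker Punch's actual system.

```python
class WindField:
    """A coarse world-aligned grid of 2D wind vectors that effects write into."""
    def __init__(self, origin, cell_size, cells):
        self.origin = origin        # world-space (x, z) of the grid corner
        self.cell_size = cell_size  # metres per cell
        self.grid = [[(0.0, 0.0) for _ in range(cells)] for _ in range(cells)]

    def _cell(self, pos):
        # Clamp the world position into valid grid indices.
        cx = int((pos[0] - self.origin[0]) / self.cell_size)
        cz = int((pos[1] - self.origin[1]) / self.cell_size)
        n = len(self.grid) - 1
        return max(0, min(cx, n)), max(0, min(cz, n))

    def add_wind(self, pos, vel):
        # e.g. a fire updraft or a wind emitter writing into the field
        cx, cz = self._cell(pos)
        old = self.grid[cx][cz]
        self.grid[cx][cz] = (old[0] + vel[0], old[1] + vel[1])

    def sample(self, pos):
        cx, cz = self._cell(pos)
        return self.grid[cx][cz]


class Emitter:
    def __init__(self, reads_wind=False):
        self.reads_wind = reads_wind  # opt-in flag: default is to ignore the field

    def wind_force(self, field, pos):
        return field.sample(pos) if self.reads_wind else (0.0, 0.0)


field = WindField(origin=(0.0, 0.0), cell_size=4.0, cells=16)
field.add_wind((10.0, 10.0), (3.0, 0.0))  # something pushing wind at this spot

leaves = Emitter(reads_wind=True)  # opted in: feels the wind
smoke = Emitter()                  # default: ignores the field entirely
```

The useful property is that writers (fire, hero wind emitters) and readers (leaves, birds) never reference each other directly - the field is the only shared channel.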
We used wind emitters around the hero/NPCs to scare away particle animals, as well as for things like the fire example. The wind was used in conjunction with our event system, which can do things like set animation state on a bird (idle, takeoff, flying) or add turbulence to leaves based on wind speed. Because of the way our expression system works, it's really easy to make a threshold test with on/off switches, map values so the response is gradual, or create a keyframe graph that controls a particle input after the condition is met.
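The threshold/remap pattern described above can be sketched in a few lines. These function names and ranges are hypothetical stand-ins for whatever the in-house expression nodes are called:

```python
def step(value, threshold):
    """Hard on/off switch once a condition is met."""
    return 1.0 if value >= threshold else 0.0

def remap(value, in_min, in_max, out_min, out_max):
    """Map a value into a new range, clamped, so the response is gradual."""
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))
    return out_min + t * (out_max - out_min)

wind_speed = 6.0
# Hard switch: above 5 m/s, flip the bird into its "takeoff" state.
bird_takeoff = step(wind_speed, 5.0)
# Gradual response: scale leaf turbulence smoothly with wind speed.
leaf_turbulence = remap(wind_speed, 0.0, 10.0, 0.0, 2.0)
```

A keyframe graph is just the same idea generalised: instead of a linear remap, the condition drives a lookup into an artist-authored curve.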
In terms of the river flow, we're reading from a set of flow data baked to textures. It's not the highest resolution and there are still plenty of cases where a particle will flow through rock/ground, but it's good in most cases. I like to call 80% functionality "good enough for massive open world" - on balance, it's better to ship it than to skip it over a minority of small visual bugs. We're not using any realtime SDF data or anything here, though that's the direction I'd like to pursue for the next project so we can get more accurate rock deflection. Potentially a mix of artist-painted flow + SDF would give a good combo using current tech.
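The core of sampling baked flow data is just mapping a particle's world position into the texture's UV space. Here's a toy sketch of that mapping with nearest-texel lookup; the region layout and names are assumptions, and a real version would bilinearly filter:

```python
def world_to_texel(pos, region_min, region_size, tex_size):
    """Map a world-space (x, z) position into integer texel coords
    for the square world region the flow texture was baked over."""
    u = (pos[0] - region_min[0]) / region_size
    v = (pos[1] - region_min[1]) / region_size
    x = max(0, min(int(u * tex_size), tex_size - 1))
    y = max(0, min(int(v * tex_size), tex_size - 1))
    return x, y

def sample_flow(flow_tex, pos, region_min, region_size):
    """Return the baked (vx, vz) flow vector under a particle."""
    tex_size = len(flow_tex)
    x, y = world_to_texel(pos, region_min, region_size, tex_size)
    return flow_tex[y][x]

# Tiny 4x4 "texture": a river flowing +x across the top two rows, still water below.
flow_tex = [
    [(1.0, 0.0)] * 4,
    [(1.0, 0.0)] * 4,
    [(0.0, 0.0)] * 4,
    [(0.0, 0.0)] * 4,
]
vel = sample_flow(flow_tex, (10.0, 10.0), region_min=(0.0, 0.0), region_size=100.0)
```

Each frame the sampled vector would be added into the particle's velocity, which is also why the low bake resolution can push particles through rocks - the texels simply don't know the rocks are there.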
Cheers for the insight, and a beautiful looking game!
Mind if I ask real quick - the blood in earlier videos looked very much like it was prebaked VAT or ABC style caches… blobs and streaks… but the blood in water demo video seems like a different blood, less streaky and is of course realtime? … just wondering how the streaking blood was done and is that realtime?!
Cheers!
If I understand correctly, you're comparing the demo video in the blog with content we've shown in trailers. The blood from katana hits is fairly straightforward in Ghost: it's a combination of deferred ribbons and deferred billboards, no meshes. We generate runtime normals based on the edges of the erosion so the normal edge follows the eroding shape. Some blood comes from the target being hit and some comes off the sword as it travels - it's a mix of both sources.
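For anyone unfamiliar with the erosion-edge normal trick: the idea is to take the gradient of the eroding alpha mask and treat it as a normal, so shading tracks the dissolving edge. This is a rough CPU illustration of the concept (a shader would use screen-space derivatives instead of the finite differences below, and everything here is my reconstruction, not the actual implementation):

```python
def edge_normal(mask, x, y, strength=1.0):
    """Derive a tangent-space normal from the finite-difference
    gradient of a 2D alpha mask, so lighting follows the edge."""
    dx = (mask[y][x + 1] - mask[y][x - 1]) * 0.5
    dy = (mask[y + 1][x] - mask[y - 1][x]) * 0.5
    nx, ny, nz = -dx * strength, -dy * strength, 1.0
    length = (nx * nx + ny * ny + nz * nz) ** 0.5
    return (nx / length, ny / length, nz / length)

# Alpha ramps left-to-right, like an edge eroding along +x:
# the derived normal tilts against the ramp while staying mostly "up".
mask = [[c / 4.0 for c in range(5)] for _ in range(5)]
n = edge_normal(mask, 2, 2)
```

As the erosion threshold animates, the mask (and hence the gradient) changes every frame, which is what makes the normals feel like they belong to the eroding shape rather than a static texture.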
The main reason the blog blood looks different is that the effect was designed to work with the velocity of the katana, so to get a demo working I’m actually faking a velocity in a direction and I ended up disabling the ribbon sections. The streaky bits you’re referring to are probably the ribbons that got disabled, they show up in normal gameplay. Hopefully that helps explain it!
@mattv Hey, thanks for the detailed reply! That's really interesting info.
I really like the idea of a dynamic windfield for a game, but I've never used one yet. It really adds a layer of believability when particles react to forces from other sources. I suppose you just need to sample the windfield data around the player's position, so you can work with a sub-set of the global windfield data.
And for the river, are you reading from the river flowmap texture or some lower-resolution global texture? I suppose the main thing to implement is reading the right part of the texture based on the particle's location in world space. Will have to have a think about that one, since it should be doable in Niagara currently.
Well, I think it's great to see so much refinement done to environment effects. Often those types of effects are secondary to all the other effects in a game, but this shows that putting in the time on environment effects can really bring a world to life.
If your project has global distance fields enabled, you can add the distance field gradient to your particles to push them away from objects, and thus the terrain.
Niagara particles can also raycast so you could do it using that.
Or you can render out the heightfield of the terrain, sample it at the particle's position, and check how close the particle's height is to the heightmap. Too close? Move it up a bit.
Might not be able to do exactly what they did, but there’s still plenty of options.
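The heightfield option above is simple enough to sketch end to end. This is a toy version under assumed names (nearest-texel lookup into a baked heightmap, then a clearance push), not Niagara module code:

```python
def sample_height(heightfield, pos, region_min, cell_size):
    """Nearest-texel lookup into a baked terrain heightmap,
    indexed by the particle's world-space XZ position."""
    x = int((pos[0] - region_min[0]) / cell_size)
    z = int((pos[2] - region_min[1]) / cell_size)
    x = max(0, min(x, len(heightfield[0]) - 1))
    z = max(0, min(z, len(heightfield) - 1))
    return heightfield[z][x]

def deflect(particle_pos, heightfield, region_min, cell_size, clearance=0.5):
    """Too close to the ground? Push the particle up to keep its clearance."""
    ground = sample_height(heightfield, particle_pos, region_min, cell_size)
    x, y, z = particle_pos
    if y - ground < clearance:
        y = ground + clearance
    return (x, y, z)

# 2x2 heightmap with a little slope: the particle at (1.5, 1.2, 1.5)
# sits below the 2.0 m ground there, so it gets pushed up above it.
heightfield = [[0.0, 1.0],
               [0.0, 2.0]]
p = deflect((1.5, 1.2, 1.5), heightfield, region_min=(0.0, 0.0), cell_size=1.0)
```

In practice you'd blend the correction over a few frames rather than snapping, so deflected particles don't visibly pop.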
Sadly the Gfycat site is no more, so the lovely gifs in the article are lost to the void. Did anyone save them, and if so, could you repost them? @mattv any chance you still have them lying around and can put them somewhere?