Motion vectors (for animated textures): how do you generate them?

Can’t seem to get good motion vector maps out of Fume or Houdini. How do you guys generate them? What’s your super secret technique or plugin? Our current main issue is lack of motion data from one frame to the next as the shape expands. Shader part’s fine! XD

4 Likes

http://facedownfx.com/

could be a great solution, when it’s done.

4 Likes

That looks promising, especially because of the stylized results you can get, if it's true :open_mouth:

Biggest issue is… it's not out yet haha!! But yeah, that'd be exactly what I'd like. In the meantime, I know some of you guys have existing workflows :smile:

Do you have an example of how the motion vectors are not meeting expectations, or a more detailed description? Does “lack of motion data” mean a complete lack of motion vectors, or just insufficient ones?

One thing you could try, off the top of my head, would be getting the motion vector from the closest hit when rendering the volume. If you're averaging all the motion samples along one viewing ray, that can dilute the resulting vector, depending on the density and motion.
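
Something like this, as a rough HLSL-style sketch of the idea (DensityVolume, VelocityVolume and VolumeSampler are hypothetical inputs, not anything from a specific renderer):

// Sketch only: take the velocity at the first sample whose density passes a
// threshold, instead of accumulating velocity along the whole ray.
float3 ClosestHitVelocity(float3 rayOrigin, float3 rayDir, float stepSize, int numSteps,
                          Texture3D DensityVolume, Texture3D VelocityVolume,
                          SamplerState VolumeSampler)
{
    const float densityThreshold = 0.01;
    float3 p = rayOrigin;
    for (int i = 0; i < numSteps; i++)
    {
        float density = DensityVolume.SampleLevel(VolumeSampler, p, 0).r;
        if (density > densityThreshold)
        {
            // First voxel the ray "sees": use its velocity directly, rather than
            // letting later (possibly opposing) samples average it towards zero.
            return VelocityVolume.SampleLevel(VolumeSampler, p, 0).xyz;
        }
        p += rayDir * stepSize;
    }
    return float3(0, 0, 0); // ray never hit anything dense enough
}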

1 Like

I’ve run into this. Your problem, coming out of Houdini, is that you have the velocity field as motion vectors. While cool, that’s not what you need for the main part of a motion vector setup. It can be used for additional detail, but that’s less important.

What you need is the optical flow of your sequence: a 2D analysis of the difference between frames. That’s what will allow you to blend between them.

To generate that, you can either use the Twixtor Pro plugin for After Effects or the NukeX version of OFlow. Or you can wait for Slate, from facedownfx. I have only used the flipbook part of that software, so I can’t speak for the motion vector quality.

If you want to read more about this, check the Realtime VFX Dictionary Project. There’s a discussion about this very thing in there, started by Klemen Lozar (whose tutorial I’m guessing you are following).

5 Likes

This has actually been enlightening about Houdini’s velocity pass. I’ve been experimenting at work, trying to see if I can comp a pass that works, and while I’m very close I haven’t quite been able to solve some of the jittering issues motion vectors can cause. You can definitely feel the blending smoothness, though.

I’ll try to get a comp to present here; maybe we can collaborate to get something working. @eetu, I’d love your input as well.

-A

1 Like

Sure!

I just tried the motion vector generator in Facedown’s Slate and it works like a charm. I don’t have a shader set up on this rig to test it right now, but it’s in the right range, with colours pointing the right way and in the right position. Beaut of a program, that is.

1 Like

Is Slate out!? Or is this internal testing? :slight_smile:

1 Like

Ehm… I have friends…

I’m not affiliated with it, but I managed to get my paws on an early version. It’s not out yet but it’s Coming Soon. Facedown is going to make a lot of vfx artists very happy.

4 Likes

@mattorialist and Co. over at Naughty Dog came up with a really cool way to do it just based on density.

In that dictionary thread, the conversation they had about it talks a good bit about what they do.
I worked to convert that over for Klemen into a material function for UE4 that we used a bit, but not a lot, because you trade texture memory for instruction count. We opted for texture memory in practice.

Here is a custom node that does all of that, although it is super finicky (that magic-number optical flow scale is something I never really got the hang of just knowing). Also, I wouldn’t be surprised to hear that ours isn’t as good as what the Uncharted folks turned out; as I said, we mostly ended up just letting Klemen create the velocity manually himself instead of relying on this realtime generated velocity, so we may have actually missed something important. :stuck_out_tongue:

/////////////////////////////////////////////////////////////////
/////OPTICAL FLOW

//Realtime Optical Flow
//Based on Naughty Dog Siggraph 2015 Presentation:
//Chasing Film in 5ms
//Matthew Radford

//Requires 4 Inputs
//Tex - FlipbookTextureSample
//SubUVSampler - TextureSamplerSubUV
//o_FlowScale - Amount to scale vectors. "Magic" number.
//            - This number is the hardest part. It is the percentage of the texture that is the maximum delta between the previous frame and the next.
//            - Too little, and there will be a lot of stepping, too much and you get pulsing.
//Index - Channel to base the o_Flow on. 

//This is made for RGB packed flipbooks.
float g_sampleDistance = 0.01;       //      - width of the gradient sample step, in UV space.
float g_sampleOffset   = 0.001;      //      - small epsilon added to the gradient magnitude below to avoid a divide by zero.

Index = floor(Index) % 3;
                                       


float2 pFrameUV = Parameters.Particle.SubUVCoords[0].xy;
float2 cFrameUV = Parameters.Particle.SubUVCoords[1].xy;
float3 frameDiff = (Parameters.Particle.SubUVLerp).xxx;
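// (The three lines above read the flipbook frame data: SubUVCoords[0]/[1] are
// the sub-frame UV offsets of the two frames being blended, and SubUVLerp is
// the 0-1 blend factor between them. They are only populated because a SubUV
// texture sample exists elsewhere in the material - see the note after the
// node network.)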

// calculate gradient of time... if we know the flipbook scale this sample distance could be proportional.
float offset = g_sampleDistance;
float2 offsetX = float2(offset,0.0);
float2 offsetY = float2(0.0,offset);

float3 gradientX = Texture2DSampleLevel(Tex, TexSampler, cFrameUV + offsetX, 4) -
                   Texture2DSampleLevel(Tex, TexSampler, cFrameUV - offsetX, 4) +
                   Texture2DSampleLevel(Tex, TexSampler, pFrameUV + offsetX, 4) - 
                   Texture2DSampleLevel(Tex, TexSampler, pFrameUV - offsetX, 4);
                                                                                                            
float3 gradientY = Texture2DSampleLevel(Tex, TexSampler, cFrameUV + offsetY, 4) -
                   Texture2DSampleLevel(Tex, TexSampler, cFrameUV - offsetY, 4) +
                   Texture2DSampleLevel(Tex, TexSampler, pFrameUV + offsetY, 4) -
                   Texture2DSampleLevel(Tex, TexSampler, pFrameUV - offsetY, 4);  
                   
float3 gradientMag = sqrt((gradientX*gradientX) +(gradientY*gradientY) ) + g_sampleOffset;

//convert gradient into motion vector
// flowscale needs to be hand tweaked to get the correct distortion
float3 velocityX_p = (frameDiff) * (gradientX / gradientMag) * o_FlowScale;
float3 velocityY_p = (frameDiff) * (gradientY / gradientMag) * o_FlowScale;
float3 velocityX_c = (1-frameDiff) * (gradientX / gradientMag) * o_FlowScale;
float3 velocityY_c = (1-frameDiff) * (gradientY / gradientMag) * o_FlowScale;


float2 pFrameUV_flow = pFrameUV + float2(velocityX_p[Index], velocityY_p[Index]);
float2 cFrameUV_flow = cFrameUV - float2(velocityX_c[Index], velocityY_c[Index]);
float3 color1 = Texture2DSample(Tex, TexSampler, pFrameUV_flow);
float3 color2 = Texture2DSample(Tex, TexSampler, cFrameUV_flow);
float3 color = lerp( color1, color2, frameDiff.x );
return color;
/////////////////////////////////////////////////////////////////

and the copy pasta’d node network. Sorry, you’ll have to get your own flipbook. :stuck_out_tongue:

Begin Object Class=MaterialGraphNode Name="MaterialGraphNode_297"
   Begin Object Class=EdGraphPin Name="EdGraphPin_390930"
   End Object
   Begin Object Class=EdGraphPin Name="EdGraphPin_390929"
   End Object
   Begin Object Class=EdGraphPin Name="EdGraphPin_390928"
   End Object
   Begin Object Class=EdGraphPin Name="EdGraphPin_390927"
   End Object
   Begin Object Class=EdGraphPin Name="EdGraphPin_390926"
   End Object
   Begin Object Class=MaterialExpressionCustom Name="MaterialExpressionCustom_32"
   End Object
   Begin Object Name="EdGraphPin_390930"
      PinName="Output"
      PinFriendlyName=" "
      Direction=EGPD_Output
      LinkedTo(0)=EdGraphPin'MaterialGraphNode_305.EdGraphPin_390951'
      LinkedTo(1)=EdGraphPin'MaterialGraphNode_307.EdGraphPin_390955'
   End Object
   Begin Object Name="EdGraphPin_390929"
      PinName="Index"
      PinType=(PinCategory="required")
      LinkedTo(0)=EdGraphPin'MaterialGraphNode_300.EdGraphPin_390940'
   End Object
   Begin Object Name="EdGraphPin_390928"
      PinName="o_FlowScale"
      PinType=(PinCategory="required")
      LinkedTo(0)=EdGraphPin'MaterialGraphNode_299.EdGraphPin_390939'
   End Object
   Begin Object Name="EdGraphPin_390927"
      PinName="SubUVSampler"
      PinType=(PinCategory="required")
      LinkedTo(0)=EdGraphPin'MaterialGraphNode_298.EdGraphPin_390934'
   End Object
   Begin Object Name="EdGraphPin_390926"
      PinName="Tex"
      PinType=(PinCategory="required")
      LinkedTo(0)=EdGraphPin'MaterialGraphNode_306.EdGraphPin_390954'
   End Object
   Begin Object Name="MaterialExpressionCustom_32"
      Code="    /////////////////////////////////////////////////////////////////\r\n    /////OPTICAL FLOW\r\n\r\n    //Realtime Optical Flow\r\n    //Based on Naughty Dog Siggraph 2015 Presentation:\r\n    //Chasing Film in 5ms\r\n    //Matthew Radford\r\n\r\n    //Requires 4 Inputs\r\n    //Tex - FlipbookTextureSample\r\n    //SubUVSampler - TextureSamplerSubUV\r\n    //o_FlowScale - Amount to scale vectors. \"Magic\" number.\r\n    //Index - Channel to base the o_Flow on.\r\n    \r\n    //This is made for RGB packed flipbooks.\r\n    float g_sampleDistance = 0.01;       //      - width of sample.\r\n    float g_sampleOffset   = 0.001;      //      - ... not sure what this is for. Maybe to avoid a divide by zero?\r\n    \r\n    //Index = floor(Index) % 3;\r\n    //float o_FlowScale      = 0.006;      //      - Amount to scale vectors. \"Magic\" number.\r\n    \r\n    \r\n    float2 pFrameUV = Parameters.Particle.SubUVCoords[0].xy;\r\n    float2 cFrameUV = Parameters.Particle.SubUVCoords[1].xy;\r\n    float3 frameDiff = (Parameters.Particle.SubUVLerp).xxx;\r\n\r\n    // calculate gradient of time... if we know the flipbook scale this sample distance could be proportional.\r\n    float offset = g_sampleDistance;\r\n    float2 offsetX = float2(offset,0.0);\r\n    float2 offsetY = float2(0.0,offset);\r\n\r\n    float3 gradientX = Texture2DSampleLevel(Tex, TexSampler, cFrameUV + offsetX, 4) -\r\n                       Texture2DSampleLevel(Tex, TexSampler, cFrameUV - offsetX, 4) +\r\n                       Texture2DSampleLevel(Tex, TexSampler, pFrameUV + offsetX, 4) - \r\n                       Texture2DSampleLevel(Tex, TexSampler, pFrameUV - offsetX, 4);\r\n                                                                                                                \r\n    float3 gradientY = Texture2DSampleLevel(Tex, TexSampler, cFrameUV + offsetY, 4) -\r\n                       Texture2DSampleLevel(Tex, TexSampler, cFrameUV - offsetY, 4) +\r\n                       Texture2DSampleLevel(Tex, TexSampler, pFrameUV + offsetY, 4) -\r\n                       Texture2DSampleLevel(Tex, TexSampler, pFrameUV - offsetY, 4);  \r\n                       \r\n    float3 gradientMag = sqrt((gradientX*gradientX) +(gradientY*gradientY) ) + g_sampleOffset;\r\n\r\n    //convert gradient into motion vector\r\n    // flowscale needs to be hand tweaked to get the correct distortion\r\n    float3 velocityX_p = (frameDiff) * (gradientX / gradientMag) * o_FlowScale;\r\n    float3 velocityY_p = (frameDiff) * (gradientY / gradientMag) * o_FlowScale;\r\n    float3 velocityX_c = (1-frameDiff) * (gradientX / gradientMag) * o_FlowScale;\r\n    float3 velocityY_c = (1-frameDiff) * (gradientY / gradientMag) * o_FlowScale;\r\n    \r\n    \r\n    float2 pFrameUV_flow = pFrameUV + float2(velocityX_p[Index], velocityY_p[Index]);\r\n    float2 cFrameUV_flow = cFrameUV - float2(velocityX_c[Index], velocityY_c[Index]);\r\n    float3 color1 = Texture2DSample(Tex, TexSampler, pFrameUV_flow);\r\n    float3 color2 = Texture2DSample(Tex, TexSampler, cFrameUV_flow);\r\n    float3 color = lerp( color1, color2, frameDiff.x );\r\n    return color;\r\n    /////////////////////////////////////////////////////////////////"
      Inputs(0)=(InputName="Tex",Input=(Expression=MaterialExpressionTextureObjectParameter'MaterialGraphNode_306.MaterialExpressionTextureObjectParameter_4'))
      Inputs(1)=(InputName="SubUVSampler",Input=(Expression=MaterialExpressionTextureSampleParameterSubUV'MaterialGraphNode_298.MaterialExpressionTextureSampleParameterSubUV_13',Mask=1,MaskR=1,MaskG=1,MaskB=1))
      Inputs(2)=(InputName="o_FlowScale",Input=(Expression=MaterialExpressionScalarParameter'MaterialGraphNode_299.MaterialExpressionScalarParameter_24'))
      Inputs(3)=(InputName="Index",Input=(Expression=MaterialExpressionScalarParameter'MaterialGraphNode_300.MaterialExpressionScalarParameter_25'))
      MaterialExpressionEditorX=80
      MaterialExpressionEditorY=96
      MaterialExpressionGuid=391A367747B1D0C9577738A4D7E18C29
      Material=PreviewMaterial'/Engine/Transient.O_Flow'
   End Object
   MaterialExpression=MaterialExpressionCustom'MaterialExpressionCustom_32'
   Pins(0)=EdGraphPin'EdGraphPin_390926'
   Pins(1)=EdGraphPin'EdGraphPin_390927'
   Pins(2)=EdGraphPin'EdGraphPin_390928'
   Pins(3)=EdGraphPin'EdGraphPin_390929'
   Pins(4)=EdGraphPin'EdGraphPin_390930'
   NodePosX=80
   NodePosY=96
   NodeGuid=C9647EFB4F21552E8F1B499A3A1A3477
End Object
Begin Object Class=MaterialGraphNode Name="MaterialGraphNode_298"
   Begin Object Class=EdGraphPin Name="EdGraphPin_390938"
   End Object
   Begin Object Class=EdGraphPin Name="EdGraphPin_390937"
   End Object
   Begin Object Class=EdGraphPin Name="EdGraphPin_390936"
   End Object
   Begin Object Class=EdGraphPin Name="EdGraphPin_390935"
   End Object
   Begin Object Class=EdGraphPin Name="EdGraphPin_390934"
   End Object
   Begin Object Class=EdGraphPin Name="EdGraphPin_390933"
   End Object
   Begin Object Class=EdGraphPin Name="EdGraphPin_390932"
   End Object
   Begin Object Class=EdGraphPin Name="EdGraphPin_390931"
   End Object
   Begin Object Class=MaterialExpressionTextureSampleParameterSubUV Name="MaterialExpressionTextureSampleParameterSubUV_13"
   End Object
   Begin Object Name="EdGraphPin_390938"
      PinName="Output5"
      PinFriendlyName=" "
      Direction=EGPD_Output
      PinType=(PinCategory="mask",PinSubCategory="alpha")
   End Object
   Begin Object Name="EdGraphPin_390937"
      PinName="Output4"
      PinFriendlyName=" "
      Direction=EGPD_Output
      PinType=(PinCategory="mask",PinSubCategory="blue")
   End Object
   Begin Object Name="EdGraphPin_390936"
      PinName="Output3"
      PinFriendlyName=" "
      Direction=EGPD_Output
      PinType=(PinCategory="mask",PinSubCategory="green")
   End Object
   Begin Object Name="EdGraphPin_390935"
      PinName="Output2"
      PinFriendlyName=" "
      Direction=EGPD_Output
      PinType=(PinCategory="mask",PinSubCategory="red")
   End Object
   Begin Object Name="EdGraphPin_390934"
      PinName="Output"
      PinFriendlyName=" "
      Direction=EGPD_Output
      PinType=(PinCategory="mask")
      LinkedTo(0)=EdGraphPin'MaterialGraphNode_297.EdGraphPin_390927'
   End Object
   Begin Object Name="EdGraphPin_390933"
      PinName="Level"
      PinType=(PinCategory="optional")
   End Object
   Begin Object Name="EdGraphPin_390932"
      PinName="Tex"
      PinType=(PinCategory="optional")
   End Object
   Begin Object Name="EdGraphPin_390931"
      PinName="UVs"
      PinType=(PinCategory="optional")
   End Object
   Begin Object Name="MaterialExpressionTextureSampleParameterSubUV_13"
      ParameterName="PrimaryTextureSamplerSubUV"
      ExpressionGUID=9608F8FD482A919E5A7D76A64B6EBB26
      MipValueMode=TMVM_MipLevel
      ConstMipValue=0
      MaterialExpressionEditorX=-368
      MaterialExpressionEditorY=48
      MaterialExpressionGuid=33436536400A85B0A0C3258620BE8B40
      Material=PreviewMaterial'/Engine/Transient.O_Flow'
   End Object
   MaterialExpression=MaterialExpressionTextureSampleParameterSubUV'MaterialExpressionTextureSampleParameterSubUV_13'
   Pins(0)=EdGraphPin'EdGraphPin_390931'
   Pins(1)=EdGraphPin'EdGraphPin_390932'
   Pins(2)=EdGraphPin'EdGraphPin_390933'
   Pins(3)=EdGraphPin'EdGraphPin_390934'
   Pins(4)=EdGraphPin'EdGraphPin_390935'
   Pins(5)=EdGraphPin'EdGraphPin_390936'
   Pins(6)=EdGraphPin'EdGraphPin_390937'
   Pins(7)=EdGraphPin'EdGraphPin_390938'
   NodePosX=-368
   NodePosY=48
   bCanRenameNode=True
   NodeGuid=3DB0D7C8489D075BE3A4D0936CFABD9C
End Object
Begin Object Class=MaterialGraphNode Name="MaterialGraphNode_299"
   Begin Object Class=EdGraphPin Name="EdGraphPin_390939"
   End Object
   Begin Object Class=MaterialExpressionScalarParameter Name="MaterialExpressionScalarParameter_24"
   End Object
   Begin Object Name="EdGraphPin_390939"
      PinName="Output"
      PinFriendlyName=" "
      Direction=EGPD_Output
      LinkedTo(0)=EdGraphPin'MaterialGraphNode_297.EdGraphPin_390928'
   End Object
   Begin Object Name="MaterialExpressionScalarParameter_24"
      DefaultValue=0.000900
      ParameterName="o_FlowScale"
      ExpressionGUID=ED53FA6C4C1625559967ACA6457F0FEF
      MaterialExpressionEditorX=-176
      MaterialExpressionEditorY=304
      MaterialExpressionGuid=5E141B7E4DAB0A891296E8BB39108A55
      Material=PreviewMaterial'/Engine/Transient.O_Flow'
   End Object
   MaterialExpression=MaterialExpressionScalarParameter'MaterialExpressionScalarParameter_24'
   Pins(0)=EdGraphPin'EdGraphPin_390939'
   NodePosX=-176
   NodePosY=304
   bCanRenameNode=True
   NodeGuid=0F4C332A4C78F3DE22C32187DF6799D2
End Object
Begin Object Class=MaterialGraphNode Name="MaterialGraphNode_300"
   Begin Object Class=EdGraphPin Name="EdGraphPin_390940"
   End Object
   Begin Object Class=MaterialExpressionScalarParameter Name="MaterialExpressionScalarParameter_25"
   End Object
   Begin Object Name="EdGraphPin_390940"
      PinName="Output"
      PinFriendlyName=" "
      Direction=EGPD_Output
      LinkedTo(0)=EdGraphPin'MaterialGraphNode_297.EdGraphPin_390929'
      LinkedTo(1)=EdGraphPin'MaterialGraphNode_305.EdGraphPin_390952'
   End Object
   Begin Object Name="MaterialExpressionScalarParameter_25"
      ParameterName="Index"
      ExpressionGUID=5B22345145D8B39EB534FA84B54CE3AF
      MaterialExpressionEditorX=-176
      MaterialExpressionEditorY=416
      MaterialExpressionGuid=6BAEC8EB4CFF55C8B8BC89BF4E170945
      Material=PreviewMaterial'/Engine/Transient.O_Flow'
   End Object
   MaterialExpression=MaterialExpressionScalarParameter'MaterialExpressionScalarParameter_25'
   Pins(0)=EdGraphPin'EdGraphPin_390940'
   NodePosX=-176
   NodePosY=416
   bCanRenameNode=True
   NodeGuid=8977E8EC4253303437AC17A250FBB29C
End Object
Begin Object Class=MaterialGraphNode Name="MaterialGraphNode_305"
   Begin Object Class=EdGraphPin Name="EdGraphPin_390953"
   End Object
   Begin Object Class=EdGraphPin Name="EdGraphPin_390952"
   End Object
   Begin Object Class=EdGraphPin Name="EdGraphPin_390951"
   End Object
   Begin Object Class=MaterialExpressionCustom Name="MaterialExpressionCustom_33"
   End Object
   Begin Object Name="EdGraphPin_390953"
      PinName="Output"
      PinFriendlyName=" "
      Direction=EGPD_Output
   End Object
   Begin Object Name="EdGraphPin_390952"
      PinName="index"
      PinType=(PinCategory="required")
      LinkedTo(0)=EdGraphPin'MaterialGraphNode_300.EdGraphPin_390940'
   End Object
   Begin Object Name="EdGraphPin_390951"
      PinName="RGB"
      PinType=(PinCategory="required")
      LinkedTo(0)=EdGraphPin'MaterialGraphNode_297.EdGraphPin_390930'
   End Object
   Begin Object Name="MaterialExpressionCustom_33"
      Code="return RGB[index];"
      Description="ChannelPicker"
      Inputs(0)=(InputName="RGB",Input=(Expression=MaterialExpressionCustom'MaterialGraphNode_297.MaterialExpressionCustom_32'))
      Inputs(1)=(InputName="index",Input=(Expression=MaterialExpressionScalarParameter'MaterialGraphNode_300.MaterialExpressionScalarParameter_25'))
      MaterialExpressionEditorX=352
      MaterialExpressionEditorY=208
      MaterialExpressionGuid=1DAC7D3F48B2A472D048F9A6BECBD34E
      Material=PreviewMaterial'/Engine/Transient.O_Flow'
      bCollapsed=True
   End Object
   MaterialExpression=MaterialExpressionCustom'MaterialExpressionCustom_33'
   Pins(0)=EdGraphPin'EdGraphPin_390951'
   Pins(1)=EdGraphPin'EdGraphPin_390952'
   Pins(2)=EdGraphPin'EdGraphPin_390953'
   NodePosX=352
   NodePosY=208
   NodeGuid=774941684E8885B938BD989B8E6E00B6
End Object
Begin Object Class=MaterialGraphNode Name="MaterialGraphNode_306"
   Begin Object Class=EdGraphPin Name="EdGraphPin_390954"
   End Object
   Begin Object Class=MaterialExpressionTextureObjectParameter Name="MaterialExpressionTextureObjectParameter_4"
   End Object
   Begin Object Name="EdGraphPin_390954"
      PinName="Output"
      PinFriendlyName=" "
      Direction=EGPD_Output
      LinkedTo(0)=EdGraphPin'MaterialGraphNode_297.EdGraphPin_390926'
   End Object
   Begin Object Name="MaterialExpressionTextureObjectParameter_4"
      ParameterName="Texture"
      ExpressionGUID=8567F96A45EA2A6585A46988239F3D73
      Texture=Texture2D'/Game/Developers/kllozar/Textures/Explo1_Main.Explo1_Main'
      MaterialExpressionEditorX=-208
      MaterialExpressionEditorY=-144
      MaterialExpressionGuid=084813EC4BBF0D45FFA69783D4D0CCA0
      Material=PreviewMaterial'/Engine/Transient.O_Flow'
   End Object
   MaterialExpression=MaterialExpressionTextureObjectParameter'MaterialExpressionTextureObjectParameter_4'
   Pins(0)=EdGraphPin'EdGraphPin_390954'
   NodePosX=-208
   NodePosY=-144
   bCanRenameNode=True
   NodeGuid=852618F949AE9801FC2575BBB388E8C4
End Object
Begin Object Class=MaterialGraphNode Name="MaterialGraphNode_307"
   Begin Object Class=EdGraphPin Name="EdGraphPin_390956"
   End Object
   Begin Object Class=EdGraphPin Name="EdGraphPin_390955"
   End Object
   Begin Object Class=MaterialExpressionComponentMask Name="MaterialExpressionComponentMask_18"
   End Object
   Begin Object Name="EdGraphPin_390956"
      PinName="Output"
      PinFriendlyName=" "
      Direction=EGPD_Output
      LinkedTo(0)=EdGraphPin'MaterialGraphNode_Root_9.EdGraphPin_390895'
   End Object
   Begin Object Name="EdGraphPin_390955"
      PinName="Input"
      PinFriendlyName=" "
      PinType=(PinCategory="required")
      LinkedTo(0)=EdGraphPin'MaterialGraphNode_297.EdGraphPin_390930'
   End Object
   Begin Object Name="MaterialExpressionComponentMask_18"
      Input=(Expression=MaterialExpressionCustom'MaterialGraphNode_297.MaterialExpressionCustom_32')
      G=True
      MaterialExpressionEditorX=368
      MaterialExpressionEditorY=96
      MaterialExpressionGuid=BD5A183D46943C47E3830EBB947B9CE2
      Material=PreviewMaterial'/Engine/Transient.O_Flow'
   End Object
   MaterialExpression=MaterialExpressionComponentMask'MaterialExpressionComponentMask_18'
   Pins(0)=EdGraphPin'EdGraphPin_390955'
   Pins(1)=EdGraphPin'EdGraphPin_390956'
   NodePosX=368
   NodePosY=96
   NodeGuid=B0952C824AA402F4B97CFA8DF9087739
End Object

This version requires that you use a particle subuv with the Interpolation Method set to LinearBlend.

It also relies on a hack: if there is a SubUV particle sampler somewhere in the network, the engine exposes the frame information in Parameters.Particle.SubUVCoords for use elsewhere. Pretty hacky, but it works.

You could obviously convert this code to take in a time value and do the lookup yourself if you are so inclined. I wanted it to work with any particle system, although we never got around to doing the piping the right way, which would be to make this just another interpolation mode with the o_FlowScale being set on the particle system itself.
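
If you go that route, the lookup itself is roughly this (a sketch only; Time, Columns, Rows and UV are hypothetical inputs describing the flipbook layout, not pins on the node above):

// Sketch: derive the same inputs the SubUV hack provides, from an explicit time.
// Time is 0-1 over the whole flipbook; frames are assumed to be laid out
// left-to-right, top-to-bottom with DirectX-style UVs (origin top-left).
float frameCount = Columns * Rows;
float framePos   = frac(Time) * frameCount;        // continuous frame position
float frameA     = floor(framePos);                // current frame index
float frameB     = fmod(frameA + 1.0, frameCount); // next frame index (wraps)
float frameDiff  = frac(framePos);                 // 0-1 blend between them

float2 frameSize = float2(1.0 / Columns, 1.0 / Rows);
float2 baseUV    = frac(UV) * frameSize;           // UV within a single tile

float2 pFrameUV  = baseUV + frameSize * float2(fmod(frameA, Columns), floor(frameA / Columns));
float2 cFrameUV  = baseUV + frameSize * float2(fmod(frameB, Columns), floor(frameB / Columns));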

Word of advice: that flowscale value was incredibly hard for us to figure out, as it is functionally based on the maximum amount of change between one frame and the next… unfortunately it can cause all sorts of weird pulsing in areas where the motion vectors aren’t accurate.

12 Likes

OMG, a credit. I’m honored!

4 Likes

Considering I took it whole cloth from the pseudocode in your talk, I felt it was only fair. Your name is in our game.

Well… considering comments get stripped from the compiled code, it isn’t really… but still.

Aw, too kind. I’m really just a fraud.

The flowscale thing is pretty dicey. Since the intensity of the morphing is based on differences in luminance, the effect can sometimes fall apart when there are dramatic changes in luminance over the course of a sequence. I usually found the part of the sequence I was most concerned about, tweaked the value for that, and let the rest slip.

Anyone who finds a way to keep the luminance consistent to achieve better morphing gets the 2.0 version of this.

Here’s a video showing what it should do. I LOVE this shit… but I’m lazy and hate the idea of baking motion vectors for every flipbook. In practice, though, that’s probably what you should do. But… sooo lazy I am.

One thing we experimented with, and never did anything final with, was using it to blend 360-degree turntables. Render an object from N directions around an axis, and the morphing did a decent enough job; we got it using as few as 36 images to produce a relatively seamless 360-degree representation of an object.

4 Likes

I’ve been trying to implement Klemen Lozar’s method for creating motion vectors in Unity. The result is not pleasing and I don’t understand why, or what I should change to get this thing working properly. Since I’m not much of a shader guy and can only use Shader Forge for this kind of stuff, I hope someone will bear with me.

First I created a simple frame blending shader, lerping between the current frame and the next frame. This works flawlessly. It works so well that I wonder why I haven’t been doing this earlier. Simple and fast.
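
(For reference, that plain blend amounts to something like this in HLSL; FlipbookTex, frameA_UV, frameB_UV and frameBlend are illustrative names for whatever your flipbook setup outputs:)

// Plain flipbook cross-fade, no motion vectors: just lerp between the two frames.
float4 frameA  = FlipbookTex.Sample(FlipbookSampler, frameA_UV);
float4 frameB  = FlipbookTex.Sample(FlipbookSampler, frameB_UV);
float4 blended = lerp(frameA, frameB, frameBlend); // frameBlend is the 0-1 phase between frames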

Then I simmed some fast FumeFX smoke, generated a velocity pass, created motion vectors in AE and slapped it all into Unity, trying to figure out the rest from Klemen’s post.

Then the shader part: I think I’ve created an exact copy of it, but it works nowhere near as well as the one in the post.

Here is a link to video comparing frame blending and motion vectors: Frame Blending & Motion Vectors

I think the frame blending example works better than the one with motion vectors - and it’s waaay cheaper :slight_smile:

Here is also the SF node tree: Node Tree

I can see some alpha artifacts, the motion is nowhere near as fluid, and I’ve no idea what the distortion value should be - those would be my basic problems with this :slight_smile:

I’ve tried to invert the red channel in the motion vector texture, but this doesn’t seem to help with anything except changing the direction of the flow. Maybe the wrong compression? I’ve selected RGB 16-bit/RGBA 32-bit, but it doesn’t change a thing. I’ve obviously made a mistake somewhere, or I don’t understand it well enough. Any help would be appreciated :slight_smile:

1 Like

Would it not be possible to remap the velocity pass to @Cd, use a SOP Solver to get access to the previous frame, and do the difference between frames?

@Orson: Actually this is possible, but tricky, as the voxel densities from one frame to another are not necessarily the same, and your pixels can become occluded by nearer voxels, so the @Cd would sometimes come from another voxel.

From what I’ve experimented with so far: I’ve set up a temporary workflow using a velocity pass in Houdini based on a custom SHOP that assumes density is really high (so per-texel velocity will be the result of only the nearest visible voxel). After rendering, I transform it in a COP from world space to projection space and export the values as a simple flowmap.
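
The world-to-screen part of that COP step boils down to projecting the point with and without one frame’s worth of velocity and taking the UV difference. A rough HLSL sketch (WorldToClip and frameDt stand in for whatever camera matrix and time step you export; not anything Houdini-specific):

// Sketch: convert a world-space velocity into a per-frame screen/UV offset,
// which is what ends up baked into the flowmap.
float2 WorldVelocityToUVOffset(float3 worldPos, float3 worldVel, float4x4 WorldToClip, float frameDt)
{
    // Row-vector convention; transpose/swap the mul order if your matrices are column-major.
    float4 clipNow  = mul(float4(worldPos, 1.0), WorldToClip);
    float4 clipNext = mul(float4(worldPos + worldVel * frameDt, 1.0), WorldToClip);

    float2 ndcNow  = clipNow.xy  / clipNow.w;
    float2 ndcNext = clipNext.xy / clipNext.w;

    // NDC is -1..1; UV is 0..1 with Y usually flipped, hence the 0.5 and the sign.
    return (ndcNext - ndcNow) * float2(0.5, -0.5);
}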

Applying this in the shader is quite easy: just a flowmap blend from one frame to another. The only tricky part is setting up the scale to apply from one frame to the next (small values that you don’t want pre-encoded in your motion vector maps, to avoid losing range precision).

This method has pros and cons. Pros: it’s really efficient for explosions and dense smoke, as it’s not optical flow but actual simulation velocities, transformed. Cons: these values are assumed to be unblended, front-most-voxel velocities, and they do not work with density-less rendering such as flames.

The problem with motion vectors rendered from volumes is that you can have multiple velocities for one pixel, which a single vector cannot represent (especially if the velocities are divergent; blending them while raymarching would average them toward a zero-length vector).

1 Like

Yes, but it still won’t give the result you want. A voxel at the edge of the sim might be on its way left, but a voxel one row in might be heading straight out at five times the speed. Next frame, the velocity at the edge will have turned 90 degrees. This is why optical flow works: it just looks at where the contrasty points were last frame and where they went. Well, sort of.

The velocity field can be used for internal detail, but the main part of the motion vector blend is the edges and how they blend into the next frame.

1 Like

From what I can tell, you have a shader issue. It looks like you are applying the UV distortion to both currentFrame and nextFrame, so they chase each other out and then simply alpha-blend in between. Then, once the nextFrame turns into the currentFrame, it snaps back to its undistorted position.

What should happen: the currentFrame’s UVs are pushed towards a static nextFrame. Once it’s there, you make that one the currentFrame and push towards the next frame.

1 Like

@Partikel thank you for the reply!

The shader is pretty straightforward. The main idea is to build custom SubUV cross-blending, similar to what the Particle SubUV expression does automatically, but we’ll need to do it manually to control the next step. In addition to interpolating from one subimage to the next, we’ll need to distort the current subimage’s pixels towards the position of the next, and similarly distort the next one back towards the current one, so they kind of meet in the middle. We’ll achieve this distortion using the motion vector texture we’ve made.

Here’s the completed shader network we’ll be creating followed by a breakdown:

Klemen material - Motion Vectors
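
(For anyone reading along without opening the graph, that cross-blend boils down to roughly the following HLSL. This is a sketch rather than Klemen’s exact material; it assumes the motion vector flipbook shares the colour flipbook’s atlas layout, stores the flow in RG around a 0.5 neutral value, and that DistortionStrength is the hand-tuned scale discussed in this thread.)

// Sketch of the SubUV cross-blend with motion-vector distortion.
// currentUV/nextUV are the atlas UVs of the two flipbook frames and
// frameBlend is the 0-1 phase between them (names are illustrative).
float2 flowA = (MotionTex.Sample(TexSampler, currentUV).rg - 0.5) * DistortionStrength;
float2 flowB = (MotionTex.Sample(TexSampler, nextUV).rg    - 0.5) * DistortionStrength;

// Push the current frame forward along its flow, and the next frame backward
// along its own flow, so the two line up "in the middle" at the blend point.
float4 frameA = Tex.Sample(TexSampler, currentUV + flowA * frameBlend);
float4 frameB = Tex.Sample(TexSampler, nextUV    - flowB * (1.0 - frameBlend));

float4 result = lerp(frameA, frameB, frameBlend);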

I think he did that on purpose. I will try to experiment with your suggestion. Worth a shot.

Anyway - I was researching this all night and achieved much better results by inverting both channels and experimenting with Max Velocity in the Fusion Works render pass, Maing_BG Sensitivity in Twixtor, plus the displacement value in the shader. I figured out that a lot depends on the distortion value. I determined the value by mixing info from the comments on Klemen’s site with empirical experience.

The proper formula for this was to take the size of one tile (512 in my example) and divide it by the Max Displace value used in the Twixtor vectors (32), which gave me a value of 16. From that, I typed a value of 0.0016 into the shader and everything works surprisingly well.

Anyway - here is the result: Motion Vectors New!

As you can see, the result is far better than before, but I still get this feeling of stepping, especially at lower speeds. This is the only thing that’s different between my implementation and Klemen’s, as far as I can tell. He can push his slow-mo really far and it is still smoooooth as butter :slight_smile:

3 Likes