Dunk's Sketchbook

[Thread thumbnail: AnimThumbnail]

Hi Folks,
I’ve been trying to do “EveryDay” VFX / tech art sketches in my spare time since Nov last year. Managed about a 2 in 3 rate as I just can’t do it every day, but I’m still quite happy with some of the doodlings and experiments that have come out of it.

This month, my experiments have mostly been with LIDAR point cloud data from NYC plus audio-reactive inputs - e.g. today’s is:

You can scroll through them from the start in this YouTube playlist if you feel inclined (some days are missing where the result was a static image, e.g. Aerialod map/heightmap renders):
https://www.youtube.com/playlist?list=PLMqpdxi5SkXLHl5REBvOPOnOfW13muL80

I was primarily tweeting the dailies: https://twitter.com/duncanfewkes
But since that site’s going to crap, I’m trying out Mastodon: @dunk@mastodon.gamedev.place (Gamedev Mastodon)

Not sure if I’ll be spamming this thread with every daily, or just with ones that I’m particularly chuffed with.
Any feedback, ideas etc. more than welcome! I already have a massive ideas.txt file, but I find it handy to have cool ideas on a backburner for when I get tired of whatever I’m messing about with, so the more the merrier :slight_smile:
Cheers!

9 Likes

#EveryDay 289: Running through the current audio-reactive point cloud experiments. The first two are in the most polished state, but there’s still scope for refining the colours. I’ll need to trim out redundant test variants of the pull/push forms - I’ll keep the “volumetric” ones, and maybe work on nicer colour gradients, as the ones in there are pretty much stock Unity vfxgraph ones with tweaked brightness.
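(For anyone curious: the gradients normally live in the graph itself, but you can also push an exposed gradient from script. Just a rough sketch - “ColorOverLife” is a made-up exposed property name, not what my graphs actually use.)

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Minimal sketch: overriding an exposed gradient on a VFX Graph from C#.
// "ColorOverLife" is a hypothetical exposed property name for illustration.
public class PointCloudColour : MonoBehaviour
{
    [SerializeField] VisualEffect vfx;
    [SerializeField] Gradient colourOverLife;

    void Start()
    {
        // Only set it if the graph actually exposes a gradient with that name.
        if (vfx.HasGradient("ColorOverLife"))
            vfx.SetGradient("ColorOverLife", colourOverLife);
    }
}
```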

2 Likes

Tweaked colour gradients a bit to square it away for these point cloud vfx:

Started on more variants of fluid sim motion triggered particles:

2 Likes

Testing on the big screen at work:

2 Likes

“Whaaaaaaat” is my reaction watching this, very very cool :star_struck:

1 Like

#EveryDay 296: Shiny fish variant. No schooling/boids behaviour - they still spawn from the fluid sim velocity buffer, with motion from a turbulence noise field (modulated by the fluid sim speed).
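Roughly, the C# side of that setup could look something like the sketch below - the actual spawning/turbulence logic lives in the graph; the script just hands over the fluid sim’s velocity buffer and a speed scalar each frame. “VelocityBuffer” and “FluidSpeed” are placeholder exposed property names, and the speed estimate is stubbed.

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Sketch only: pass the fluid sim's velocity RenderTexture into a VFX Graph
// and expose an overall speed value the graph can use to modulate spawn rate
// and turbulence strength. Property names are hypothetical.
public class FluidDrivenFish : MonoBehaviour
{
    [SerializeField] VisualEffect fishVfx;
    [SerializeField] RenderTexture fluidVelocity; // written by the fluid sim

    void Update()
    {
        fishVfx.SetTexture("VelocityBuffer", fluidVelocity);

        // Placeholder: in practice this would be an average speed reduced on
        // the GPU (or read back) from the velocity buffer.
        float fluidSpeed = EstimateFluidSpeed();
        fishVfx.SetFloat("FluidSpeed", fluidSpeed);
    }

    float EstimateFluidSpeed()
    {
        return 1f; // stub for illustration
    }
}
```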

1 Like

“Swishy Fishy”:

1 Like

Testing the fish particles on the LED wall:

2 Likes

wow, that looks nice! you should make a GIF and add it as the first image in your first post so that it gets taken as the thumbnail (or manually upload it as a thumbnail). I almost didn’t click your thread because there was no engaging thumbnail :smiley: But there are very interesting experiments in here!

1 Like

Thanks! Good tip on the gif thumbnail, will do at some point.
If you like these ones check my YouTube for the other 300 or so videos :wink:

1 Like

looks super interesting! great stuff

I’ve seen those LED walls used for realtime movie VFX production a few times in person. Is the moiré an issue for the person controlling the effects?
Because up close I always found it somewhat disturbing / confusing

1 Like

For this one it’s not great, because the LED panels are 5mm pixel pitch and designed for a viewing distance of 10+ meters (the section shown is 1% of an LED install going into an indoor theme park as a wraparound screen all the way around the central hub section).

So at the distances the Azure Kinect camera works (max depth about 6 meters, but motion is picked up better around 3 meters), the viewing distance isn’t good for the screen. You can even see significant moiré in the recording - not helped by my iPhone 8 being pretty old and low-res now. The iPad Pro sensor, especially using the wide FoV lens/zoom, gives much less moiré when recording from the same distances and closer.

However, with much finer pixel pitch LED panels it’s much better - e.g. IIRC these ones are around 2mm pixel pitch and designed/specced for viewing distances around 2 meters (used as a body tracked game install on a cruise ship, with LED floor in front so the visuals need to work up close):

There’s loads of interesting info about LED volumes for TV/film these days - AFAIK they use a similar pixel pitch (around 2mm), but the cameras filming have much higher-resolution sensors and sit further away than 2m, so moiré from the pixel pitch isn’t an issue; the panels do, however, need to be calibrated to the camera sensor’s response rather than the human eye. This is one of the best articles I read when ILM/Favreau/Mandalorian first started showing off virtual production: The Mandalorian: This Is the Way - The American Society of Cinematographers
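As a back-of-envelope check of those pitch-vs-distance numbers: the rough multiplier varies by manufacturer and spec, so the ~2x factor below is just an assumption that happens to match the 5mm → 10+m theme park panels above.

```csharp
using System;

// Rule-of-thumb sketch only: comfortable viewing distance scales with pixel
// pitch. The multiplier is an assumption, not a panel spec.
static class PixelPitchRuleOfThumb
{
    static float MinViewingDistanceMetres(float pixelPitchMm, float multiplier = 2f)
        => pixelPitchMm * multiplier;

    static void Main()
    {
        Console.WriteLine(MinViewingDistanceMetres(5f));     // ~10 m (theme park wraparound panels)
        Console.WriteLine(MinViewingDistanceMetres(2f, 1f)); // ~2 m (the cruise ship install)
    }
}
```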

4 Likes

Yes, camera calibration and a minimum distance of ~5m sound familiar. The setup in that video though, wow that looks utterly magical :smiley:
finally feeling like a magician :stuck_out_tongue:

1 Like

Testing fish vs smaller swirly cubes:

1 Like

Not really vfx, but still fun - trying out the Unreal Engine Lyra Starter Game with nDisplay head tracking and active stereo on the LED wall:

2 Likes

It’s been a minute, but I’m back on my bullshit again:

2 Likes

AAAAAAAAAAAAARRRRGH!

Messing about with HDRP in the latest Unity alpha, trying to get my lighting, fogging, PPP values etc. all into sensible ranges so I’m not constantly fighting with settings because 6 months ago I randomly made one of the lights twice as bright as the sun as a quick hack to catch the bloom.
Fed some audio into a vfxgraph:
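One way to do the “fed some audio in” part from C# is roughly the sketch below: sample spectrum data from the listener, collapse a band into a single energy value, and push it to an exposed float on the graph. “AudioLevel” is a placeholder property name, and the bass band/bin count is arbitrary.

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Sketch: drive a VFX Graph parameter from audio spectrum data.
// "AudioLevel" is a hypothetical exposed float on the graph.
public class AudioToVfx : MonoBehaviour
{
    [SerializeField] VisualEffect vfx;
    readonly float[] spectrum = new float[256];

    void Update()
    {
        AudioListener.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

        // Sum the low-frequency bins as a crude "bass energy" signal.
        float bass = 0f;
        for (int i = 0; i < 16; i++)
            bass += spectrum[i];

        vfx.SetFloat("AudioLevel", bass);
    }
}
```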

2 Likes

Moody lighting version:

2 Likes

LED CAVE using an OptiTrack camera rig for head tracking in Unreal nDisplay, running the Rural Australia environment. I have to capture it through the lens of the active shutter glasses as it’s stereoscopic 3D, so the video is only showing the left-eye view.
We didn’t realise there was a snake model in the brush there until we had it running in the CAVE - when it was just a single powerwall in 3D, we never got to look at the ground :slight_smile:

Always fun to see what perspective-based visuals look like from the wrong perspective. The LED CAVE visuals look correct for the position of the tracked 3D glasses I’m wearing, but wrong as soon as that diverges from the recording camera’s PoV.

Testing some of the 2D logo vfx upconverted to more of a 3D treatment:

Went down a rabbit-hole tweaking this underwatery kelp type vfx:

1 Like