Hi everyone! I’ve been part of this community for a long time as a silent observer, but today I’m excited to share a project I’ve been developing to solve some fundamental issues with texture data. It started as a personal struggle with energy loss in VFX masks, but evolved into a universal engine-agnostic protocol for data decoupling and reconstruction.
The Genesis: Fighting the Rec.709 Legacy and the “Blue Vein”
It started with a frustrating technical artifact in a fire VFX. Every time I converted a high-quality flame asset into a standard grayscale mask, the “energy” vanished. Standard color-to-grayscale conversion relies on the legacy Rec.709 luma formula (0.2126R + 0.7152G + 0.0722B), whose coefficients were derived for broadcast luminance perception on CRT-era displays, not for modern VFX energy.
This formula weights bright blue and red as if they were dark, creating a “dirty” look and a “blue vein” where the peak energetic part of the plasma should be. I realized that we are using CRT-era math to drive modern PBR and AI pipelines.
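A quick sketch of the problem (plain Python, `luma` is just an illustrative helper, not part of the project):

```python
# Rec.709 luma coefficients used by most standard color-to-grayscale conversions.
REC709 = (0.2126, 0.7152, 0.0722)

def luma(r, g, b, coeffs=REC709):
    """Weighted sum that a standard grayscale conversion applies per pixel."""
    cr, cg, cb = coeffs
    return cr * r + cg * g + cb * b

# A fully saturated blue pixel, the visual "peak" of hot plasma,
# collapses to ~7% gray: the "blue vein" artifact.
blue_energy = luma(0.0, 0.0, 1.0)   # ~0.0722
red_energy  = luma(1.0, 0.0, 0.0)   # ~0.2126
```

Pure blue carries less than a tenth of the mask weight of pure green, which is exactly where the energy “vanishes”.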
The Core Concept: Data Disentanglement (Topology vs. State Atlas)
Instead of merging all data into one lossy channel, Omni-Extractor decouples an image into two independent, high-precision layers:
- Structural Topology: A 16-bit map (I;16) that stores spatial logic as a “presence index”. With 65,536 steps of precision, it acts as a high-fidelity coordinate map for signal reconstruction.
- State Atlas: A 1D-palette (Ramp) extracted via K-Means clustering. It holds the optimized “states” of the asset—whether it’s color, PBR properties, or vectors.
The result balances extreme compression against near-lossless visual fidelity.
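The decoupling step can be sketched in a few lines of plain Python. This is a minimal illustration, not the project’s actual implementation; `kmeans_1d` and `decouple` are invented names, and a real pipeline would use an optimized K-Means over full image arrays:

```python
import random

def kmeans_1d(pixels, k, iters=20, seed=0):
    """Tiny K-Means over RGB tuples; returns (centroids, labels)."""
    rng = random.Random(seed)
    centroids = rng.sample(pixels, k)
    labels = [0] * len(pixels)
    for _ in range(iters):
        # Assign each pixel to its nearest centroid (squared distance).
        for i, p in enumerate(pixels):
            labels[i] = min(range(k),
                            key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
        # Recompute each centroid as the mean of its members.
        for c in range(k):
            members = [p for i, p in enumerate(pixels) if labels[i] == c]
            if members:
                centroids[c] = tuple(sum(ch) / len(members) for ch in zip(*members))
    return centroids, labels

def decouple(pixels, k=4):
    """Split pixels into a State Atlas (palette) and a 16-bit topology map (k >= 2)."""
    atlas, labels = kmeans_1d(pixels, k)
    # Sort the atlas by brightness so the topology reads as a ramp, then remap labels.
    order = sorted(range(k), key=lambda c: sum(atlas[c]))
    remap = {c: rank for rank, c in enumerate(order)}
    topology = [round(remap[l] / (k - 1) * 65535) for l in labels]  # 16-bit index
    return [atlas[c] for c in order], topology
```

The atlas holds the K optimized “states”; the topology stores, per pixel, a 16-bit coordinate into that ramp.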
Universal Applicability: Engine and Platform Agnostic
The technology itself is entirely engine-agnostic. The reconstruction logic is incredibly simple and can be implemented in any shader language:
// High-precision reconstruction: the 16-bit topology value becomes
// a UV coordinate into the 1D State Atlas.
float latentCoord = TopologyMap.Sample(MapSampler, UV).r;
float3 finalSignal = StateAtlas.Sample(AtlasSampler, float2(latentCoord, 0.5)).rgb;
return pow(finalSignal, 2.2); // Decode the gamma-encoded 8-bit atlas to linear
- Hardware-Native Reconstruction: The process leverages the GPU’s Texture Mapping Unit (TMU). By using the 16-bit topology value as a UV coordinate, the hardware performs native bilinear interpolation between atlas centroids, producing smooth gradients at no extra shader cost.
- Deployment Anywhere: Ready for PC, Consoles, Mobile, and VR/AR hardware.
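For anyone wanting to prototype outside a shader, here is a CPU sketch of what the TMU gives you for free, assuming a clamp-to-edge, texel-center addressing convention (`sample_atlas_linear` is an illustrative name):

```python
def sample_atlas_linear(atlas, coord):
    """Linear interpolation between 1D atlas entries, mimicking GPU bilinear filtering.

    atlas: list of RGB tuples; coord: topology value normalized to [0, 1].
    """
    n = len(atlas)
    x = max(0.0, min(n - 1.0, coord * n - 0.5))  # texel centers, clamp-to-edge
    i0 = int(x)
    t = x - i0
    i1 = min(i0 + 1, n - 1)
    # Blend the two nearest atlas centroids, exactly as the hardware filter would.
    return tuple(a + (b - a) * t for a, b in zip(atlas[i0], atlas[i1]))
```

A coordinate falling between two centroids yields a smooth blend of both states rather than a hard palette step.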
Beyond Visuals: Normal Maps and PBR Encoding
Normal maps and other physical parameters can also be encoded into this protocol, letting you “re-skin” an object’s surface properties as easily as swapping its State Atlas.
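The re-skinning idea reduces to a lookup with interchangeable palettes. A toy sketch (the atlases and `reconstruct` are invented examples; a real asset would use interpolated sampling as shown earlier):

```python
# Two hypothetical State Atlases sharing one Structural Topology.
fire_atlas = [(0.1, 0.0, 0.0), (1.0, 0.5, 0.0), (1.0, 1.0, 0.8)]
ice_atlas  = [(0.0, 0.1, 0.2), (0.2, 0.6, 1.0), (0.9, 1.0, 1.0)]

def reconstruct(topology, atlas):
    """Nearest-entry lookup: each 16-bit index selects an atlas state."""
    k = len(atlas)
    return [atlas[min(k - 1, round(t / 65535 * (k - 1)))] for t in topology]

topo  = [0, 32768, 65535]              # one dark, one mid, one peak texel
flame = reconstruct(topo, fire_atlas)
frost = reconstruct(topo, ice_atlas)   # same structure, new material
```

Because the topology never changes, the same spatial logic can drive color, normals, or roughness ramps by swapping which atlas it indexes.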
The Efficiency Imperative: VRAM Liberation
- VRAM Optimization: Replacing 32-bit RGBA textures with decoupled 16-bit single-channel structural maps halves the per-texel cost outright; once one topology map is shared across several swappable atlases, total texture memory footprints can drop by 70-80%.
- Hybrid Export: The pipeline enforces 16-bit precision for masks while keeping atlases in 8-bit RGB, optimizing both bandwidth and storage.
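Some back-of-envelope arithmetic, under assumed sizes (a 2048×2048 texture and a 256-entry 8-bit RGB atlas; the re-skinning reading of the 70-80% figure is my interpretation, not the author’s stated benchmark):

```python
side = 2048

rgba8 = side * side * 4          # uncompressed 32-bit RGBA source
r16   = side * side * 2          # 16-bit single-channel topology map
atlas = 256 * 1 * 3              # 256-entry 8-bit RGB State Atlas

# One map replaced: the atlas is negligible, so savings sit near 50%.
one_variant = 1 - (r16 + atlas) / rgba8

# Re-skinning: one shared topology + N tiny atlases replaces N RGBA variants.
n = 3
reskin = 1 - (r16 + n * atlas) / (n * rgba8)   # climbs past 80% for n = 3
```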
Technical Constraints & Transparency
To maintain absolute 16-bit integrity of the topology, this protocol does not store Alpha data within the primary map. Transparency should be handled as a separate stream or derived mathematically in the shader.
Open Source
The project is now open-source. You can grab the Example_Workflow.json and start experimenting with this new paradigm of information preservation.
GitHub Link: https://github.com/ninpo3d/ComfyUI_omni_extractor