Photographer & Director specializing in Beauty, Still Life, and Multimedia.
2024
Synthetic Fluidity 2
Experimentation
Tags
AI, Beauty, Texture, Still Photography
Filming liquids is a massive technical challenge because it’s impossible to get the same splash twice. Every take is a one-off event. This project started with a simple question: In a world full of digital loops, does that "one-of-a-kind" randomness still feel special, or does it fade once you've seen it a few times?
The Experiment
I wanted to see if I could use AI to "re-skin" the movement of water without losing its natural chaos. Liquids are the perfect test for this because they are unpredictable—exactly what AI usually struggles with.
The Workflow
To bridge the gap between real-world physics and AI, I built a specific pipeline in ComfyUI:
The Base: I used a video of real liquid to provide the "bones" of the movement.
The Depth Map: I applied a ControlNet Depth pass at high strength. This forced the AI to follow the exact 3D shape and volume of the original splash so it didn't just dissolve into a blurry mess.
The Style: I fed AI-generated stills through an IP-Adapter to act as a "skin." This let me swap the texture and color of the liquid while preserving the physical movement underneath.
The Result: An AnimateDiff sequence that feels physically real but looks synthetically impossible.
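Conceptually, the steps above form a node graph of the kind ComfyUI workflows are built from. The sketch below is purely illustrative: the node names, parameters, and file names are hypothetical stand-ins, not actual ComfyUI classes, and the real graph would contain many more nodes.

```python
def build_workflow(source_video, style_images, depth_strength=1.0):
    """Assemble the three-stage graph described above, as a ComfyUI-style
    dict of numbered nodes. ["1", 0] means "output 0 of node 1".
    All node names here are illustrative placeholders."""
    return {
        # The Base: real footage supplies the "bones" of the movement.
        "1": {"node": "LoadVideo", "inputs": {"path": source_video}},
        # The Depth Map: extract per-frame depth from the real splash.
        "2": {"node": "DepthPreprocessor", "inputs": {"frames": ["1", 0]}},
        # High-strength depth conditioning locks the 3D shape and volume.
        "3": {"node": "ControlNetDepth",
              "inputs": {"depth": ["2", 0], "strength": depth_strength}},
        # The Style: AI-generated stills act as the "skin" via IP-Adapter.
        "4": {"node": "IPAdapter", "inputs": {"reference_images": style_images}},
        # The Result: AnimateDiff samples frames under both constraints.
        "5": {"node": "AnimateDiffSampler",
              "inputs": {"control": ["3", 0], "style": ["4", 0]}},
    }

# Hypothetical usage: one real splash take, two generated texture stills.
workflow = build_workflow("splash_take_07.mp4",
                          ["chrome_skin.png", "ink_skin.png"],
                          depth_strength=0.95)
```

The key design point is the split between nodes 3 and 4: depth conditioning carries the physics, the adapter carries the look, so either can be changed without disturbing the other.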
The Bottom Line
This study is about control. By combining a real depth map with AI textures, I found a way to keep the craftsmanship of a high-speed camera shoot while using a neural network to push the visuals into a space that nature can't reach.