Generative Audiovisual Experiment | AI by Mariano Zhao

Generative Audiovisual Experiment | AI + TouchDesigner
Hey guys! I’m excited to share my latest project where I explored the synergy between Generative AI and real-time visual programming.
I used AceStep1.5 and ComfyUI to generate the initial AI music and video assets, then brought them into TouchDesigner for the final production. Technically, I leveraged the NVIDIA Background TOP for real-time matting and the Face Track TOP for face tracking.
The visual distortion is entirely driven by the music’s DNA—splitting the audio into bands (Kick, Snare, and spectral ranges) to modulate the intensity of the displacement. This project is a testament to how AI-generated content can be "re-conducted" through real-time creative coding.
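The band-splitting idea above can be sketched outside TouchDesigner with a short FFT pass over one audio frame: measure the energy in a kick range, a snare range, and an upper spectral range, then use the normalized levels to scale displacement intensity. This is a minimal illustration, not the project's actual network; the band edges, the `band_energies` helper, and the 0.8 displacement scale are all hypothetical choices for the example.

```python
import numpy as np

def band_energies(samples, sample_rate, bands):
    """Return per-band energy for one audio frame, normalized so the
    loudest band is 1.0. `bands` maps a name to (low_hz, high_hz);
    the edges used below are illustrative guesses."""
    windowed = samples * np.hanning(len(samples))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    energies = {}
    for name, (lo, hi) in bands.items():
        mask = (freqs >= lo) & (freqs < hi)
        # RMS magnitude inside the band
        energies[name] = float(np.sqrt(np.mean(spectrum[mask] ** 2)))
    peak = max(energies.values()) or 1.0
    return {name: e / peak for name, e in energies.items()}

# Hypothetical band layout: kick lows, snare mids, airy highs.
BANDS = {"kick": (40, 120), "snare": (150, 400), "air": (4000, 12000)}

if __name__ == "__main__":
    sr = 44100
    t = np.arange(1024) / sr
    frame = np.sin(2 * np.pi * 60 * t)      # a 60 Hz tone: mostly "kick"
    levels = band_energies(frame, sr, BANDS)
    displacement = levels["kick"] * 0.8     # scale into a displacement amount
    print(levels, displacement)
```

Inside TouchDesigner the same split is usually done with Audio Band EQ or Audio Filter CHOPs feeding a displacement TOP's parameters, but the principle is identical: per-band energy in, modulation intensity out.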
🔊 Sound on for the full experience!

Posted Mar 6, 2026
