In autumn 2021, I developed a series of audiovisual experiments exploring interactive and generative installation content.
The project resulted in around 20 pieces, each examining a different interaction method, including voice input, motion tracking, and audio-driven synchronization.
The setup was built around three workstations:

  • An MSI Creator laptop for real-time visuals in Notch VFX
  • A laptop for audio generation in Ableton Live
  • An Intel NUC running TouchDesigner, handling input (motion, voice) and system integration

All three systems were connected via the OSC protocol.
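
To make the routing concrete, here is a minimal sketch of such an OSC hub written with the python-osc library; the IP addresses, ports, and address patterns are illustrative assumptions, not the project's actual configuration.

    # Minimal OSC hub sketch (python-osc). All addresses, ports, and
    # parameter names are assumptions for illustration.
    from pythonosc import udp_client
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    # Outgoing links to the two render machines (assumed LAN addresses).
    notch = udp_client.SimpleUDPClient("192.168.0.10", 9000)    # Notch VFX
    ableton = udp_client.SimpleUDPClient("192.168.0.11", 9001)  # Ableton Live (e.g. via Max for Live)

    def on_control(address, value):
        # Fan one normalized control value (0..1) out to visuals and audio.
        notch.send_message("/notch/exposed/intensity", value)
        ableton.send_message("/live/param/cutoff", value)

    dispatcher = Dispatcher()
    dispatcher.map("/input/fader1", on_control)  # assumed incoming address

    # Block and listen for incoming control data on this machine.
    BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher).serve_forever()
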
Audiovisual Content

Beat-driven visuals with fluid simulations controlled in real time via faders.

TouchDesigner / Notch VFX / Ableton Live / Arturia Beatstep
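
The Beatstep's controls arrive as MIDI rather than OSC. TouchDesigner can read them directly, but as a standalone illustration of the fader-to-parameter mapping, here is a sketch using mido and python-osc; the CC numbers and target addresses are assumptions.

    # Sketch: bridge Beatstep MIDI CC messages to OSC (mido + python-osc).
    # CC numbers and OSC addresses are assumptions for illustration.
    import mido
    from pythonosc import udp_client

    notch = udp_client.SimpleUDPClient("192.168.0.10", 9000)  # assumed Notch machine

    CC_TO_ADDRESS = {
        10: "/notch/fluid/velocity",     # assumed control -> exposed Notch parameter
        74: "/notch/fluid/dissipation",  # assumed control -> exposed Notch parameter
    }

    with mido.open_input() as port:  # default MIDI input (the Beatstep)
        for msg in port:
            if msg.type == 'control_change' and msg.control in CC_TO_ADDRESS:
                # Rescale 0..127 CC values to the 0..1 range used over OSC.
                notch.send_message(CC_TO_ADDRESS[msg.control], msg.value / 127.0)
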
Audiovisual Content

Both the audio and the generative visuals respond to hand motion.

TouchDesigner / Notch VFX / Ableton Live / LeapMotion
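
In the hand-tracked pieces like this one, the palm position from the LeapMotion is normalized and fanned out to both engines. Below is a rough sketch of that mapping; the coordinate ranges, parameter choices, and OSC addresses are assumptions.

    # Sketch: map a tracked palm position to visual and audio controls.
    # Ranges and OSC addresses are assumptions for illustration.
    from pythonosc import udp_client

    notch = udp_client.SimpleUDPClient("192.168.0.10", 9000)    # assumed Notch machine
    ableton = udp_client.SimpleUDPClient("192.168.0.11", 9001)  # assumed Ableton machine

    def clamp01(v):
        return max(0.0, min(1.0, v))

    def on_palm(x_mm, y_mm):
        # Palm position in millimetres (Leap coordinate frame, assumed ranges).
        spread = clamp01((x_mm + 200.0) / 400.0)  # left-right sweep -> particle spread
        height = clamp01(y_mm / 400.0)            # hand height -> filter cutoff
        notch.send_message("/notch/particles/spread", spread)
        ableton.send_message("/live/filter/cutoff", height)
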
Audiovisual Content

Fragments and particles react dynamically to hand gestures.

TouchDesigner / Notch VFX / Ableton Live / LeapMotion

Generative Content

Fragments and particles react dynamically to hand gestures.

TouchDesigner / Notch VFX / LeapMotion

Audiovisual Content

Spheres disperse in response to motion input.

TouchDesigner / Notch VFX / Ableton Live / LeapMotion

Audiovisual Content

Audio and smoke are driven by gesture interaction, with typography integrated into the scene.

TouchDesigner / Notch VFX / Ableton Live / LeapMotion

Audiovisual Content

Generative visuals and sound are performed live using faders and pads.

TouchDesigner / Notch VFX / Ableton Live / Arturia Beatstep