TouchDesigner Experiments


Side Project


I have been exploring TouchDesigner through a series of independent experiments driven by my interest in live and interactive installation art. Using MediaPipe, I extracted and mapped the human body to create real-time visual responses to movement. I also developed audio-reactive visuals and layered multiple video sources to test different ways of generating pixel sorting, distortion, and feedback effects. These studies focused on understanding how sensory data, such as sound and motion, can be translated into dynamic visual expression within a live environment, expanding my approach to time-based and spatial media.
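
A minimal sketch of the kind of body-to-visual mapping involved, assuming a MediaPipe plugin that publishes normalized landmark positions as CHOP channels; the operator and channel names here ('mediapipe', 'nose:x', 'transform1') are placeholders, not the actual network:

```python
# CHOP Execute DAT attached to a (hypothetical) 'mediapipe' landmark CHOP.
# Maps the performer's nose position onto a Transform TOP so the visual
# layer follows the body in real time.

def onValueChange(channel, sampleIndex, val, prev):
    tracker = op('mediapipe')        # placeholder name for the landmark CHOP
    transform = op('transform1')     # Transform TOP driving the visual layer

    # Landmark channels are assumed normalized to 0..1; recenter to -0.5..0.5
    # and flip y so screen-up matches body-up.
    transform.par.tx = tracker['nose:x'][0] - 0.5
    transform.par.ty = 0.5 - tracker['nose:y'][0]
    return
```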









Created an experimental camera using live footage and pixelation effects, juxtaposing the subject's real face with simplified geometric abstraction.
This work explored how little visual information is needed for a face to remain recognizably human while blending realism and abstraction.
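
One common way to get this effect in TouchDesigner is to downsample the camera feed through a Resolution TOP with nearest-pixel filtering and scale it back up; a sketch of animating the block count from Python, with 'res1' as a placeholder name:

```python
# Set the pixelation amount by changing a Resolution TOP's output size.
# 'res1' is assumed to sit after the Video Device In TOP, with its input
# smoothness set to nearest pixel so the blocks stay hard-edged.

def set_pixelation(blocks):
    """blocks: horizontal pixel count of the downsampled image, e.g. 8-64."""
    res = op('res1')
    res.par.outputresolution = 'custom'
    res.par.resolutionw = int(blocks)
    res.par.resolutionh = max(1, int(blocks * 9 / 16))  # assumes a 16:9 feed
```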









Inspired by the muted winter scenery outside a window, I designed visuals that respond to the sound of breath. As the breath grows stronger, the scene turns white as if frost fills the window; as it fades, the distorted landscape reappears. The entire process, from video distortion to tonal adjustment, was built in TouchDesigner, later combined with a live camera feed so the performer appears drawn onto a frosted surface. This experiment examined how sound data can modulate visual atmosphere and translate subtle emotional changes into environmental imagery.
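
A sketch of the breath-to-frost mapping, assuming the microphone signal runs through an Audio Device In CHOP, an Analyze CHOP set to RMS, and a Lag CHOP for smoothing; 'lag1' and 'level2' are placeholder names:

```python
# CHOP Execute DAT attached to the smoothed breath-level CHOP 'lag1'.
# As the breath grows stronger, a white layer composited over the
# distorted landscape becomes opaque, as if frost were filling the window.

def onValueChange(channel, sampleIndex, val, prev):
    # 'val' is the smoothed RMS of the microphone input, roughly 0..1
    frost = op('level2')                      # Level TOP on the white layer
    frost.par.opacity = min(1.0, val * 4.0)   # gain of 4 tuned by ear
    return
```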









Built an experimental camera scene where users can “draw” on the screen with their fingers, revealing a pixelated and distorted version of the live feed following their gestures. This experiment investigated the potential of gesture-based interfaces for real-time visual transformation.
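
A sketch of the reveal mechanism, assuming the fingertip position arrives as normalized CHOP channels and that a small Circle TOP is accumulated through a Feedback TOP into a mask over the pixelated feed; every name here ('handtracking', 'index_tip:x', 'circle1') is a placeholder:

```python
# Execute DAT: each frame, move a brush (Circle TOP) to the tracked
# fingertip. The circle is composited into a Feedback TOP, accumulating
# a mask that reveals the distorted camera feed along the finger's path.

def onFrameStart(frame):
    hand = op('handtracking')        # placeholder fingertip-tracking CHOP
    brush = op('circle1')            # Circle TOP used as the brush tip

    # Circle TOP center runs -0.5..0.5; landmark channels assumed 0..1.
    brush.par.centerx = hand['index_tip:x'][0] - 0.5
    brush.par.centery = 0.5 - hand['index_tip:y'][0]
    return
```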









As part of an editing experiment, created a sequence where pressing the "1" key switches between scenes distorted through a feedback loop.
This work tested how a simple input signal can simultaneously control transitions and generate evolving visual feedback.
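
A sketch of the input wiring, assuming a Keyboard In CHOP listening for the "1" key and a Switch TOP sitting inside the feedback chain; operator names are placeholders:

```python
# CHOP Execute DAT attached to a Keyboard In CHOP. Each press of the "1"
# key advances a Switch TOP to the next scene; because the switched output
# runs through a Feedback TOP, every transition smears into the next image.

def onValueChange(channel, sampleIndex, val, prev):
    if channel.name == 'k1' and val == 1:          # key down, ignore release
        switch = op('switch1')                     # Switch TOP between scenes
        n = len(switch.inputs)                     # number of connected scenes
        switch.par.index = (switch.par.index.eval() + 1) % n
    return
```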









Produced visuals based on datamoshed imagery that distort and recover in time with an LFO curve, resembling the interference of a circuit-bent television screen. This piece explored the rhythmic aesthetics of digital noise and signal instability as a visual language.
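
A sketch of the LFO control, assuming the displacement network is wrapped in a Base COMP with a custom 'Amount' parameter that scales the glitch intensity; 'glitch' and 'Amount' are placeholders:

```python
# Execute DAT: every frame, scale the distortion by the LFO CHOP's value,
# so the image tears apart at the wave's peaks and recovers at its zeroes.

def onFrameStart(frame):
    strength = abs(op('lfo1')['chan1'][0])   # LFO output, -1..1 by default
    # 'glitch' is a placeholder Base COMP wrapping the displacement chain;
    # 'Amount' is a custom parameter added to it to scale the effect.
    op('glitch').par.Amount = strength
    return
```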









Developed a custom drawing board where users’ finger-pinch positions are tracked, producing watercolor-like strokes as they move. This experiment focused on translating physical gestures into painterly, real-time visuals.
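
A sketch of the pinch detection, assuming thumb and index fingertip positions arrive as normalized CHOP channels; the landmark names and the custom parameters on the brush network are placeholders:

```python
import math

# Execute DAT: detect a pinch from the thumb-to-index distance and lay
# down paint only while the fingers are held together.

PINCH_THRESHOLD = 0.05   # normalized distance, tuned by hand

def onFrameStart(frame):
    hand = op('handtracking')                # placeholder landmark CHOP
    dx = hand['thumb_tip:x'][0] - hand['index_tip:x'][0]
    dy = hand['thumb_tip:y'][0] - hand['index_tip:y'][0]
    pinched = math.hypot(dx, dy) < PINCH_THRESHOLD

    brush = op('brush')                      # placeholder brush network
    brush.par.Paint = int(pinched)           # custom toggle: stroke on/off
    if pinched:
        brush.par.Posx = hand['index_tip:x'][0]
        brush.par.Posy = 1 - hand['index_tip:y'][0]
    return
```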





© Nova (Jeonghyun) Park 2025. All rights reserved.