Goopy Mirror
Interactive Projection Exhibit
2025
CHALLENGE
Use live Kinect Azure camera data as the input for a creative, attention-grabbing interactive experience projected onto a hallway wall.
Concept
The brief made it clear from the start that this project would need to incorporate not only TouchDesigner's interactive capabilities, but also live skeleton camera data from a Kinect Azure installed at the exhibition site. Beyond those requirements, the sky was the limit: I had total control over the concept and creative process. I took advantage of that freedom and chose ideas that would push my TouchDesigner training to a higher level and make me even more familiar with the software.
The concept I eventually moved forward with hinges on TD's native metaballs as the main visual element. This would be a "goop mirror" that spawns a gelatinous creature (I call him Goopy) on the projected image whenever a person walks by the camera, and Goopy would mimic the person's overall body movements in a bouncy way. As soon as the person leaves, Goopy melts back into a nondescript blob of goop and disappears. If someone comes back in front of the camera, so does Goopy, changing his appearance every time he spawns for a new user.
Pitch Deck
Technical Execution
The positions of Goopy's metaballs are tied to several XYZ coordinates from the Kinect Azure camera's skeletal data. The camera data only includes points for a user's chest and hands, which limited Goopy's design to just a torso and upper limbs that react to those coordinates. The skeletal data's real-time motion values are then slightly dampened and delayed to create a strong follow-through effect on Goopy's movements, making him feel extra bouncy.
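Conceptually, the dampen-and-delay step boils down to smoothing each coordinate and reading it back a few frames late. In the actual network this happens with CHOPs, but the standalone Python sketch below illustrates the idea; the smoothing factor and delay length here are made-up values, not the ones in my file.

    from collections import deque

    class FollowThrough:
        """Dampen and delay one skeletal coordinate to give Goopy his bounce."""

        def __init__(self, smoothing=0.15, delay_frames=6):
            self.smoothing = smoothing            # lower = heavier dampening
            self.buffer = deque(maxlen=delay_frames)
            self.value = 0.0

        def update(self, target):
            # Ease the stored value toward the live Kinect coordinate...
            self.value += (target - self.value) * self.smoothing
            self.buffer.append(self.value)
            # ...and return an older sample, so the metaball trails the user
            return self.buffer[0]

    # Example: a hand jumps from 0.0 to 1.0 and the output lags softly behind
    hand_x = FollowThrough()
    for frame, raw in enumerate([0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]):
        print(frame, round(hand_x.update(raw), 3))

In TouchDesigner itself, this kind of behavior typically comes from a Lag or Filter CHOP followed by a short Delay CHOP before the values reach the metaball transforms.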

The image below is a screenshot of the Base COMP in TouchDesigner that contains all of Goopy's metaballs; his material is applied further downstream in the operator chain, outside this COMP. I originally planned to build several Base COMPs with different metaball arrangements so Goopy could have multiple designs, but the time it would have taken to manually arrange and test those combinations made this unfeasible within the project's timeframe. However, I set up the file with that idea in mind, which makes it easy to implement in the future if I'd like to!
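To give a sense of how those extra designs would slot in: with each variant living in its own Base COMP, a small script could pick one at random whenever Goopy spawns. The snippet below is purely a hypothetical sketch of that future setup; the operator names (the variant COMPs feeding a variant_switch with an index parameter) are placeholders, not operators in my current file.

    import random

    # Hypothetical: one Base COMP per Goopy design, all wired into a switch
    # operator whose index decides which design reaches the renderer.
    VARIANT_COUNT = 3

    def spawn_goopy():
        # Pick a random design each time a new user steps in front of the camera
        choice = random.randint(0, VARIANT_COUNT - 1)
        op('variant_switch').par.index = choice
        return choice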
Goopy's Base COMP is just one part of the main operator chain that makes up the file. The view below is where all the other elements of the project come into play, such as the Kinect Azure data, audio, and the background. Goopy's material is a Phong MAT that receives its color from multiple Constant TOPs routed through a switch, which advances every time a user leaves the frame, so Goopy changes color on each trigger.
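The exit trigger itself can be handled with a CHOP Execute DAT watching a "user present" channel from the Kinect Azure CHOP: when that channel drops from on to off, the script advances the switch. The sketch below is a simplified stand-in for my actual script, and the names (color_switch, the hard-coded count of Constant TOPs) are placeholders.

    # CHOP Execute DAT callback (placeholder names): fires when the monitored
    # "user present" channel goes from on to off, i.e. the user leaves the frame.

    NUM_COLORS = 4  # number of Constant TOPs wired into the switch

    def onOnToOff(channel, sampleIndex, val, prev):
        switch = op('color_switch')
        # Advance to the next color, wrapping back to the first
        switch.par.index = (int(switch.par.index) + 1) % NUM_COLORS
        return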
File Walkthrough
Peer Assistance
Throughout this project, a few classmates helped me greatly by testing the piece, spotting glitches and awkward hiccups in different parts of my operator setup, and offering ideas on how to fix or improve certain details.
Nathan Kipka: suggested several technical workarounds and tricks for my operator setup to remove glitches

Casper Chappell: main featured tester in the video, tested the project throughout several initial passes

Japey Howarth: introduced me to the potential use of a "feedback" process in TouchDesigner, which I considered using, and also helped test the project in the hallway
Takeaways
I truly enjoy working with TouchDesigner, and this project further bolstered my love for the software. I did run into a few awkward glitches caused by the Kinect Azure camera's way of interpreting the live data, which I couldn't manipulate directly. Still, I worked through these challenges and took the limitations into account to create a solid, non-distracting sense of interactivity.

Moving forward, I'm very interested in experimenting with different kinds of inputs as triggers for other interactive experiences, such as audio-based or image-based information. I'm blown away by TouchDesigner's adaptability and capacity to work with anything that can be interpreted as data on a computer, and my imagination would definitely go wild thinking of different things to use as inputs!