AV system that uses Kinect data to process video and transforms it into OSC messages to control sound in an external application.
Spring 2019.
Requirements:
- Max/MSP/Jitter with the [dp.kinect2] external;
- Ableton Live, or any other sound application that can receive OSC (a minimal receiver sketch follows the hardware list);
- Kinect SDK;

Hardware:
- Windows computer;
- Kinect v2;
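
Any application that can listen for OSC over UDP can stand in for Ableton Live on the receiving end. As a point of reference, here is a minimal receiver sketch in Python using the python-osc package; the host/port (127.0.0.1:8000) are placeholders, not values taken from the Max patch, so adjust them to match whatever the patch actually sends. This is only meant to show what an OSC-capable receiver needs to do and to help inspect the messages; it is not part of the system itself.

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer


def print_message(address, *args):
    # args holds whatever values the Max patch packed into the OSC message
    print(address, args)


dispatcher = Dispatcher()
# Print every incoming message, whatever its address, so you can see
# exactly what the Max patch is sending.
dispatcher.set_default_handler(print_message)

# 127.0.0.1:8000 is a placeholder; use the host/port configured in the patch.
server = BlockingOSCUDPServer(("127.0.0.1", 8000), dispatcher)
server.serve_forever()
```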
This project is part of Âmago||Superficie, an AV performance developed during my time in NYU's Interactive Telecommunications Program (ITP), through the courses Live Image Processing and Performance, Music Interaction Design, New Interfaces for Musical Expression, and Sensing Machines.
The project evolved over the course of a year: some parts were rewritten from scratch and entirely new parts were added. Here is a timeline of the process, with links to the corresponding documentation: