I’m thinking about designing a virtual instrument using the Kinect. I’m a novice at this, so I was wondering if anyone had any patches that use the Kinect that I could have a look at?
It would be a huge help.
Install everything you need to get OSC data from your Kinect.
Here are a couple of patches I created that use the tracking data from the Kinect. The Color Zones patch assigns every user a color and creates three zones; if two users are in the same zone, their colors mix.
The Audio Collage patch assigns each zone a sound. As a user moves around within a zone, they change the pitch of that sound; if they enter a different zone, they trigger that zone’s sound.
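The zone and pitch mappings described above boil down to two simple functions. Here is a rough sketch of that logic in Python; the frame dimensions, number of zones, and pitch range are made-up values for illustration, not the actual numbers from my patches:

```python
# Hypothetical sketch of the zone/pitch mapping described above.
# Assumes tracking coordinates in a 640x480 frame; adjust to your data.

def zone_for_x(x, num_zones=3, width=640.0):
    """Map a tracked x position (pixels) to a zone index 0..num_zones-1."""
    x = max(0.0, min(x, width - 1))          # clamp to the frame
    return int(x / (width / num_zones))       # equal-width vertical zones

def pitch_for_y(y, height=480.0, low=200.0, high=800.0):
    """Map vertical position to a pitch in Hz; moving up raises the pitch."""
    y = max(0.0, min(y, height))              # clamp to the frame
    t = 1.0 - (y / height)                    # invert: top of frame = 1.0
    return low + t * (high - low)
```

In a patch you would drive `pitch_for_y` continuously while the user stays in one zone, and retrigger the zone’s sound whenever `zone_for_x` changes value.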
You get a lot more data from the device if you do the cactus pose, but I have never had any interest in making a user do that. It’s all in the data, though, and you can process it in much the same way I processed the full-body tracking data.
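If you do go after the per-joint skeleton data, the OSC messages typically arrive on addresses that encode the user and joint. A small helper like this can pull those apart; note the `/skeleton/<user>/<joint>` address pattern is an assumption for illustration — check what your particular Kinect-to-OSC bridge actually sends:

```python
# Hypothetical sketch: extracting (user, joint) from a skeleton-style
# OSC address. The "/skeleton/<user>/<joint>" pattern is assumed, not
# guaranteed to match your bridge's output.

def parse_joint(address):
    """Split an address like "/skeleton/1/righthand" into (user_id, joint).

    Returns None for addresses that don't match the assumed pattern.
    """
    parts = address.strip("/").split("/")
    if len(parts) != 3 or parts[0] != "skeleton" or not parts[1].isdigit():
        return None
    return int(parts[1]), parts[2]
```

Once you have the joint name, each joint’s coordinates can be run through the same zone/pitch mapping as the full-body centroid.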