and install everything you need to get OSC data from your Kinect.
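To give a feel for what that OSC data looks like on the wire, here is a minimal sketch of parsing a single OSC message by hand in Python. The `/skeleton/head` address and the three-float payload are just an illustration, not the actual addresses my setup sends; in practice you would use an OSC library or a patch object, but the underlying format is this simple.

```python
import struct

def _read_padded_string(data, offset):
    # OSC strings are null-terminated and padded to a 4-byte boundary
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    length = end - offset + 1
    offset += (length + 3) // 4 * 4  # advance past the padding
    return s, offset

def parse_osc_message(data):
    """Parse a simple OSC message containing only float arguments."""
    address, offset = _read_padded_string(data, 0)
    typetags, offset = _read_padded_string(data, offset)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "f":  # 32-bit big-endian float
            (value,) = struct.unpack_from(">f", data, offset)
            args.append(value)
            offset += 4
    return address, args
```

A skeleton-tracking message is typically an address like this hypothetical `/skeleton/head` followed by x, y, z floats, and that is all you need to start mapping movement to sound or color.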
Here are a couple of patches I created that use the tracking data from the Kinect. The Color Zones patch assigns every user a color and divides the space into three zones; if two users are in the same zone, their colors mix.
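The color-mixing part of that patch boils down to averaging the colors of everyone standing in a zone. My actual patch does this with patch objects, but here is a rough Python sketch of the same idea; the user IDs and RGB assignments are made up for the example.

```python
def mix_colors(colors):
    """Average the RGB components of every color in the list."""
    n = len(colors)
    return (sum(c[0] for c in colors) // n,
            sum(c[1] for c in colors) // n,
            sum(c[2] for c in colors) // n)

# Hypothetical per-user color assignments (user id -> RGB)
USER_COLORS = {1: (255, 0, 0), 2: (0, 0, 255)}

def zone_color(user_ids):
    """Color shown for a zone: black if empty, mixed if shared."""
    if not user_ids:
        return (0, 0, 0)
    return mix_colors([USER_COLORS[u] for u in user_ids])
```

With a red user and a blue user sharing a zone, the zone shows purple; a user alone in a zone just shows their own color.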
The Audio Collage patch assigns each zone a sound. As a user moves around within a zone, they change the pitch of that sound, and entering a different zone triggers that zone's sound.
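The zone-and-pitch logic can be sketched like this: quantize the user's x position into a zone, trigger the zone's sound on entry, and map the position within the zone to a pitch offset. The zone count, sound filenames, and the 12-semitone range here are assumptions for illustration, not what my patch actually uses.

```python
def zone_for_x(x, num_zones=3):
    """Map a normalized x position (0.0-1.0) to a zone index."""
    return min(int(x * num_zones), num_zones - 1)

def pitch_for_position(x, zone, num_zones=3, semitone_range=12):
    """Map the position within a zone to a pitch offset in semitones."""
    zone_width = 1.0 / num_zones
    within = (x - zone * zone_width) / zone_width  # 0.0-1.0 inside the zone
    return within * semitone_range

# Hypothetical sound assigned to each zone
ZONE_SOUNDS = {0: "drone.wav", 1: "bells.wav", 2: "voices.wav"}

def update(prev_zone, x):
    """Process one tracking frame: retrigger on zone change, bend pitch within."""
    zone = zone_for_x(x)
    if zone != prev_zone:
        print(f"trigger {ZONE_SOUNDS[zone]}")
    return zone, pitch_for_position(x, zone)
```

Calling `update` on every incoming tracking frame gives you the whole behavior: a sound fires when a user crosses a zone boundary, and their movement inside the zone bends its pitch.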
You get a lot more data from the device if you do the cactus pose (the calibration pose), but I have never had any interest in making a user do that. It's in the data, though, and you can process it in much the same way I processed the full-body tracking data.