I’d like to build a Max/MSP project where I track a person with a Kinect via Synapse and send the tracking data into Max.
Tracking in Synapse already works, but I don’t know how to work with that data in Max.
My goal is that when you hit a different corner of the screen, Max plays a different sound for each corner.
Unfortunately I have no idea how to proceed with this project. Could anyone please help me?
If you are on Windows, you can use the native dp.kinect.
If you’re on Mac, you can also use the native jit.openni.
Both send/receive native Max messages, which you can use like any other native Max object.
It’s good to have options. ;-)
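For the corner-to-sound part, the core logic is just comparing the tracked position against the middle of the screen. Here’s a minimal sketch of that logic in Python, assuming Synapse gives you (x, y) joint positions in a 640×480 screen space (the resolution and sound file names are placeholders, not anything Synapse prescribes). In Max you’d build the same comparisons with objects like [split] or [if], and route the result to different playback objects:

```python
def corner_for(x, y, width=640, height=480):
    """Map an (x, y) screen position to one of four corners
    by comparing against the screen's midpoint."""
    col = "left" if x < width / 2 else "right"
    row = "top" if y < height / 2 else "bottom"
    return f"{row}-{col}"

# Placeholder sound assignments -- one sample per corner.
sounds = {
    "top-left": "kick.wav",
    "top-right": "snare.wav",
    "bottom-left": "hat.wav",
    "bottom-right": "clap.wav",
}

# A hand position near the top-left of the screen...
print(corner_for(50, 40))              # -> top-left
# ...and one near the bottom-right picks a different sound.
print(sounds[corner_for(600, 450)])    # -> clap.wav
```

In a Max patch the equivalent would be: take the joint coordinates from dp.kinect / jit.openni (or Synapse’s OSC output via [udpreceive]), split x and y at the midpoint, combine the two results into one of four cases, and use that to trigger a different [sfplay~] or [playlist~].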