Camera Tracking and Translation

    May 29 2017 | 6:43 am
    Hi All, I'm fairly new to Max/Jitter and have an ambitious project in mind that I'm hoping to get some advice on, and assistance making.
    Essentially, I want to take a live feed from a camera, track a subject, convert their motion into X, Y, and Z information, and then convert each of those values into a MIDI CC output, which will be used to control MIDI-mappable parameters in other applications.
    Ideally, I would be able to track multiple subjects independently through multiple camera inputs, and then translate and send all of the XYZ data independently as well.
    Any help with this would be greatly appreciated.
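
    The core X/Y/Z-to-CC step is just range scaling. In Max you would typically patch this with [scale] into [ctlout]; as a language-neutral illustration, here is a minimal Python sketch (the function name and the assumed -1..1 coordinate range are my own; a Kinect reports metres, so you would substitute the bounds of your actual tracking volume):

```python
def xyz_to_cc(x, y, z, lo=-1.0, hi=1.0):
    """Scale tracked coordinates into 0-127 MIDI CC values.

    lo/hi describe the expected coordinate range (assumed -1..1 here;
    adjust to match your camera or sensor's tracking volume).
    """
    def scale(v):
        v = min(max(v, lo), hi)               # clamp to the range
        return round((v - lo) / (hi - lo) * 127)  # map to 0..127
    return scale(x), scale(y), scale(z)
```

    Each axis then becomes one CC number (e.g. X on CC 20, Y on CC 21, Z on CC 22), and multiple subjects just get their own CC numbers or MIDI channels.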

    • May 29 2017 | 12:18 pm
      If the people are within 4 meters, the Kinect v1 or v2 sensor is your solution. It was designed to do exactly what you want and it is easy. You can download the dp.kinect2 trial at
      In the included help file, I have an example of creating midi data from the x,y,z coordinates. It is on the "midi" tab of the help file.
    • May 29 2017 | 3:34 pm
      That is great news! I have a Kinect V2, but I am operating on a Mac. Can you advise what hardware I need to connect the USB 3 Type-B from the Kinect v2 to my Mac? All adapters I can find seem to be for PC only.
    • May 29 2017 | 3:55 pm
      I have a Kinect (v1) and (v2) and operate on a Mac.
      I can't remember what version I had to order, but my Kinect v2 plugs straight into my USB port. (Anybody know the trick?)
      To use the v2 on a Mac you will need Ni-Mate. Here's the bad news: because the Mac is only USB 2 (lower bandwidth), plus other licensing issues I don't fully understand, the v2 with Ni-Mate is really only usable for getting OSC and MIDI out. So you can get positioning and such, but not any of the images (RGB/IR), at least in my computer/horsepower circumstances. Since the images are what I need most for my work, I use the v1 Kinect much more: with it I can use Ni-Mate OR the homebrew patches I have built using Max and the freenect external.
      Having said all that - Right now is kind of a limbo with Kinect V2 and Mac...
      Try Ni-Mate and see if it works for what you need.
      If you can get over to a PC, then DiabloDale's Kinect patches are extremely well thought out and mature, and a lot of the work is already done for you... You just gotta be willing to dabble in "the other side." I will very likely get a PC in the future simply for using my v2.
      Hope this helps inform you any!
    • May 29 2017 | 6:37 pm
      Thank you so much for all this insight. I will get to work on all of this shortly.
      I also have a Kinect v1, so perhaps I will start there. Also, my Mac has USB 3.0, which may be helpful.
      Out of curiosity, do you have a sense if this adapter would work with the setup you are suggesting with the Kinect V2?
    • May 29 2017 | 10:50 pm
      Neptune, the adapter you linked allows a Kinect v2 sensor to plug into the USB3 port of a computer or an Xbox One S. This adapter changes the sensor's proprietary connector into a standard USB3 connector. I use this same adapter to connect my Kinect v2 sensor to my Windows laptop. Another option is to have a separate Windows PC to use the Kinect sensor and pre-process the data...then send that processed data to another computer to do audio and/or video. I see this approach often with room-sized installations. As yet another option, I have heard rumors that some people with new Macs use Boot Camp to run Windows and the Kinect that way. Two crucial parts are the graphics card and the USB3 controller. They both need to be recent enough to support DirectX 11 (compute shader 5.0) and USB3 on PCIe Gen2. Microsoft has a hardware verifier at the same link above which can test your hardware.