Tracking a person in real time - best methods/techniques

    Oct 29 2013 | 6:02 pm
    I'm working on a project that involves tracking a dancer's position and projecting video onto them in real time. I'm using an IR-sensitive camera to avoid interference between the input video and the projected output.
    Does anyone have experience with position tracking like this in Jitter? What techniques/methods have you found to be the most efficient and accurate?
    My first thought was to use the cv.jit package, with the cv.jit.blobs objects tracking the center position of my subject. This works well and is quite accurate, but it means that any extremities (i.e., arms/legs reaching outwards) won't have a direct effect on the regions they touch, only on the overall blob count/size. (cv.jit.blobs is also fairly CPU-intensive.)
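To make the centroid limitation concrete, here is a minimal NumPy sketch of the same idea outside Max (a toy illustration with made-up thresholds, not cv.jit code): background-subtract a greyscale frame, then take the centroid of all foreground pixels.

```python
import numpy as np

def centroid_of_mask(frame, background, threshold=30):
    """Return the (x, y) centroid of foreground pixels, or None if empty.

    frame/background: 2-D uint8 greyscale arrays (e.g. from an IR camera).
    A pixel counts as foreground when it differs from the background by
    more than `threshold` -- a crude stand-in for blob segmentation.
    """
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    ys, xs = np.nonzero(diff > threshold)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Toy demo: a bright "dancer" blob on an empty background.
bg = np.zeros((240, 320), dtype=np.uint8)
frame = bg.copy()
frame[100:140, 150:170] = 255          # torso
frame[110:115, 170:220] = 255          # outstretched arm
print(centroid_of_mask(frame, bg))
```

Note how the outstretched arm only nudges the centroid slightly to the right: that is exactly the limitation described above. Extremities shift the aggregate position but have no local effect on the regions they reach into.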
    Now I've been working with simpler methods of masking and limiting, using a mix of jit.op objects, background subtraction, jit.scissors to define submatrices, and frame differencing to detect motion. This works and is much more CPU-efficient, but I still can't get the level of detail I'm interested in.
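The jit.scissors-plus-frame-differencing approach can also be sketched in a few lines of NumPy (names, grid size, and threshold here are my own, purely for illustration): difference two consecutive frames, slice the result into a grid of cells, and report the fraction of "moving" pixels per cell.

```python
import numpy as np

def cell_activity(prev, curr, rows=3, cols=4, threshold=25):
    """Frame-difference two greyscale frames, then report the fraction
    of moving pixels in each cell of a rows x cols grid -- roughly what
    slicing with jit.scissors and thresholding with jit.op gives you.
    """
    moving = np.abs(curr.astype(np.int16) - prev.astype(np.int16)) > threshold
    h, w = moving.shape
    grid = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            cell = moving[r*h//rows:(r+1)*h//rows, c*w//cols:(c+1)*w//cols]
            grid[r, c] = cell.mean()
    return grid

# Demo: motion confined to the top-left cell.
prev = np.zeros((120, 160), dtype=np.uint8)
curr = prev.copy()
curr[10:30, 10:30] = 255
print(cell_activity(prev, curr))
```

This is cheap, but the spatial resolution is only as fine as the grid, which is why the level of detail tops out quickly.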
    I've searched around the forums and seen some references to using OpenGL shaders to handle this, although I'm totally unschooled in writing that kind of code.
    Any thoughts? Suggestions? Search-terms to look up? Thanks.

    • Oct 29 2013 | 6:08 pm
      I'm interested in using this in conjunction with a "particle" system driven by jit.gen objects. I want the person's position in the space to affect the particles by either pushing them away or pulling them closer. I've already accomplished this when the person is reduced to a single coordinate pair (x, y). How can I get a more detailed view of the person, including arms/legs/etc., to interact directly with the drawing?
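One way to go beyond a single (x, y) pair is to treat the whole silhouette mask as the repeller: each particle is pushed away from its *nearest* foreground pixel, so an outstretched arm repels nearby particles even when the body centroid is far away. A hedged NumPy sketch (function name, strength, and falloff are my own assumptions; in Jitter the equivalent would be feeding a downsampled mask matrix into jit.gen):

```python
import numpy as np

def silhouette_forces(particles, mask, strength=50.0, stride=4):
    """For each (x, y) particle, return a repulsive force pointing away
    from the nearest foreground pixel of the person's silhouette mask.

    particles: (N, 2) float array of x, y positions.
    mask: 2-D boolean silhouette (True = person).
    stride: subsample the mask for speed (a stand-in for downsampling
    the matrix before it reaches the particle system).
    """
    ys, xs = np.nonzero(mask[::stride, ::stride])
    body = np.column_stack((xs, ys)).astype(float) * stride   # (M, 2)
    if body.size == 0:
        return np.zeros_like(particles)
    # Distance from every particle to every body point; pick the nearest.
    delta = particles[:, None, :] - body[None, :, :]          # (N, M, 2)
    dist = np.linalg.norm(delta, axis=2)                      # (N, M)
    nearest = dist.argmin(axis=1)
    d = delta[np.arange(len(particles)), nearest]             # (N, 2)
    r = np.maximum(np.linalg.norm(d, axis=1, keepdims=True), 1e-6)
    return strength * d / r**2    # inverse-distance falloff, pushes away

# Demo: one particle near an "arm", one far from the body.
mask = np.zeros((240, 320), dtype=bool)
mask[100:140, 150:170] = True      # torso
mask[110:115, 170:220] = True      # outstretched arm
p = np.array([[230.0, 112.0], [40.0, 40.0]])
print(silhouette_forces(p, mask))
```

Negating the returned force gives attraction instead of repulsion. The brute-force nearest-point search is O(N x M), which is why the mask is subsampled; in practice a distance transform of the silhouette (which is where the shader suggestions on the forum come in) does the same job per-pixel on the GPU.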