I’m pleased to announce the release of the Flock Vision Toolkit for Jitter. This small collection of computer vision objects was developed by Mark Godfrey, an MS student in music technology at Georgia Tech, for use in Flock, my recent work for saxophone quartet, dancers, audience participation, real-time notation, electroacoustic sound, and video.
* PARTICLEFILTER (mxj external) implements a (slightly simplified) version of the popular particle filtering tracking algorithm. Essentially, a target’s particles sample the image at candidate locations; those with high probability mass (i.e., those landing on the target’s pixels) are more likely to be resampled in the next frame, so a target’s particles tend to stick with it. We found this useful for tracking the saxophonists in Flock, since the tracker often had to deal with interference from other objects in the camera’s image.
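To give a sense of the predict/weight/resample cycle described above, here is a minimal sketch in NumPy. This is a generic illustration, not the mxj object’s actual code; all function and parameter names are hypothetical, and the Gaussian likelihood stands in for whatever image evidence the real tracker evaluates under each particle.

```python
import numpy as np

def particle_filter_step(particles, weights, observation,
                         motion_std=2.0, obs_std=3.0, rng=None):
    """One predict-weight-resample cycle of a basic particle filter.

    particles: (n, 2) candidate target positions; observation: measured position.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    # Predict: diffuse particles with random-walk motion noise.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Weight: Gaussian likelihood of each particle given the observation
    # (in the toolkit this would come from the image under the particle).
    d2 = np.sum((particles - observation) ** 2, axis=1)
    weights = np.exp(-d2 / (2.0 * obs_std ** 2))
    weights /= weights.sum()
    # Resample: particles with high probability mass are drawn more often,
    # so the cloud "sticks" to the target between frames.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))
```

The mean of the particle cloud then serves as the position estimate for that frame.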
* SKEW_CORRECTION uses a least-squared error transformation to correct for skew in an image, typically caused by the camera’s perspective. This is based on correcting a warped calibration rectangle in the image to a true rectangle.
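The least-squares fit described above can be sketched with a direct linear transform (DLT): given the corners of the warped calibration rectangle and the true rectangle, solve for the planar homography mapping one onto the other. This is a generic NumPy illustration, not the object’s implementation; the function names are mine.

```python
import numpy as np

def fit_homography(src, dst):
    """Least-squares planar homography mapping src points to dst (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two rows of the DLT system A h = 0.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The least-squares solution is the right singular vector
    # of A with the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    return Vt[-1].reshape(3, 3)

def apply_homography(H, pts):
    """Warp 2-D points through H using homogeneous coordinates."""
    pts = np.asarray(pts, float)
    p = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return p[:, :2] / p[:, 2:3]
```

With exactly four corner correspondences the fit is exact; with more points it minimizes the algebraic error, which is how extra calibration points would be absorbed.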
* STITCHER finds a least-squared error transformation to warp one image into the space of another. Combined with a blending algorithm, this transform can stitch images together; it works well for panoramic images and multi-camera setups.
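The blending half of that process can be as simple as a linear alpha ramp (“feathering”) across the overlap region. Here is a grayscale sketch in NumPy; the names are hypothetical and this stands in for whatever blending STITCHER actually uses.

```python
import numpy as np

def feather_blend(left, right, overlap):
    """Blend two horizontally overlapping grayscale images with a linear
    alpha ramp across the overlap, hiding the seam between them."""
    h, lw = left.shape
    out = np.zeros((h, lw + right.shape[1] - overlap))
    # Non-overlapping regions are copied straight through.
    out[:, :lw - overlap] = left[:, :-overlap]
    out[:, lw:] = right[:, overlap:]
    # In the overlap, the left image's weight ramps from 1 down to 0.
    alpha = np.linspace(1.0, 0.0, overlap)
    out[:, lw - overlap:lw] = left[:, -overlap:] * alpha + right[:, :overlap] * (1.0 - alpha)
    return out
```

In a real stitcher the right image would first be warped into the left image’s space by the fitted transform; the ramp then smooths exposure differences across the seam.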
* LENS_CORRECTION corrects for barrel/pincushion distortion, commonly produced by fisheye lenses.
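The standard one-term radial model (from the Brown–Conrady family) illustrates the kind of correction involved: each point is rescaled along its ray from the distortion center by a factor depending on its radius. This is a sketch under that assumption, not the object’s code; the names are mine.

```python
import numpy as np

def correct_radial(pts, k1, center):
    """Approximate one-term radial correction. Each point is scaled away
    from (k1 > 0) or toward (k1 < 0) the distortion center by 1 + k1*r^2,
    which roughly compensates barrel or pincushion distortion respectively."""
    c = np.asarray(center, float)
    p = np.asarray(pts, float) - c
    r2 = np.sum(p ** 2, axis=1, keepdims=True)
    return c + p * (1.0 + k1 * r2)
```

When remapping a whole image one typically evaluates the inverse of this mapping per output pixel and resamples, but the underlying model is the same.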
We have tested these objects on Max versions 4.6 and 5.0 on both Mac and Windows.
You can download the package (which includes detailed help files) at: