I’ve been working on a gesture-based user interface that uses raw video from an iSight for input. I’ve already explored several avenues for capturing hand gestures with image-analysis algorithms, but they simply suck up too much CPU. Now I’m trying to do my analysis in the domain of (relatively) simple, fast math. I’ve got Jitter spitting out the coordinates of the center of the user’s hand (the image centroid, actually, but close enough). I’d like to track the motion of this centroid over an interval, then determine whether the captured values fit a prototype gesture. Obviously, it won’t match exactly, so this is a best-fit-curve problem. Anybody have any suggestions for making this work in Max?
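To clarify the kind of matching I have in mind, here’s a rough sketch written as plain JavaScript (which could live in a [js] object in Max). All the function names are my own placeholders, and the resampling and normalization are deliberately crude: the idea is to resample the captured centroid trajectory to a fixed number of points, normalize out position and scale, and score it against each stored prototype by mean squared distance, with the lowest score winning.

```javascript
// Resample a trajectory (array of [x, y] points) to n evenly spaced
// points by linear interpolation over the index. A crude stand-in for
// proper arc-length resampling, but cheap.
function resample(points, n) {
  const out = [];
  for (let i = 0; i < n; i++) {
    const t = (i * (points.length - 1)) / (n - 1);
    const j = Math.floor(t);
    const f = t - j;
    const a = points[j];
    const b = points[Math.min(j + 1, points.length - 1)];
    out.push([a[0] + f * (b[0] - a[0]), a[1] + f * (b[1] - a[1])]);
  }
  return out;
}

// Translate so the first point sits at the origin, then scale to unit
// size, so where the gesture happened and how big it was don't matter,
// only its shape.
function normalize(points) {
  const [x0, y0] = points[0];
  const shifted = points.map(([x, y]) => [x - x0, y - y0]);
  const max = Math.max(...shifted.map(([x, y]) => Math.hypot(x, y)), 1e-9);
  return shifted.map(([x, y]) => [x / max, y / max]);
}

// Mean squared distance between two trajectories after resampling and
// normalization. Lower means a better fit; compare against a threshold
// or just take the minimum over all prototypes.
function gestureDistance(captured, prototype, n) {
  n = n || 32;
  const a = normalize(resample(captured, n));
  const b = normalize(resample(prototype, n));
  let sum = 0;
  for (let i = 0; i < n; i++) {
    sum += (a[i][0] - b[i][0]) ** 2 + (a[i][1] - b[i][1]) ** 2;
  }
  return sum / n;
}
```

This ignores timing entirely (two gestures with the same path but different speeds score identically), which may or may not be what’s wanted; something like dynamic time warping would handle speed variation at more CPU cost.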