Over the past six months we have been working on our master's thesis project, experimenting with mixed-reality gaming in large-scale, head-tracked environments, built on Max/Jitter and Unity3D. You can check out a demo of two prototype games here: http://vimeo.com/14723732
For the head tracking we have developed a number of new C objects for Max. They enable us to track infrared LEDs worn on the players' stereoscopic glasses. The tracking uses Unibrain Fire-i cameras and covers an area of about 50 square meters. The seven new objects are all related to camera tracking in absolute coordinates, and four are based on functionality from OpenCV. We hope some of you will find them useful. Here is an overview:
sl.camintrinsics – internal camera calibration
sl.jit.camextrinsics – external camera calibration
sl.jit.undist – undistort matrix using internal properties
sl.undistpoint – undistort pixel coordinates using internal properties
sl.transform – transform coordinates to another coordinate system
sl.vintersect – position markers in 3d space using vector intersection
sl.players – group and track position of player markers
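To give a feel for the last step in that chain: each camera yields a ray from its optical center through the detected marker, and sl.vintersect estimates the marker's 3D position from where those rays (nearly) meet. Here is the core idea in a few lines of Python with NumPy, reduced to the two-ray case (a simplified reconstruction for illustration, not the actual C code of the external; the function name is ours):

```python
import numpy as np

def intersect_rays(o1, d1, o2, d2):
    """Estimate a 3D point from two rays p(t) = o + t*d.

    In practice the rays never meet exactly, so we return the midpoint
    of the shortest segment between them (closest point of approach).
    """
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b          # approaches 0 when the rays are near-parallel
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    return (o1 + t1 * d1 + o2 + t2 * d2) / 2.0

# Two cameras whose rays both pass through the point (1, 1, 5):
o1, d1 = np.zeros(3), np.array([1.0, 1.0, 5.0])
o2, d2 = np.array([2.0, 0.0, 0.0]), np.array([-1.0, 1.0, 5.0])
print(intersect_rays(o1, d1, o2, d2))  # recovers (1, 1, 5)
```

With noisy detections the midpoint averages out the two cameras' errors; a near-zero denominator is the degenerate case where the cameras see the marker along (almost) the same line.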
Using these objects and cv.jit we have created a complete Max/Jitter tracking application named Xyzled. It is tailored to our specific purpose, but it could be used for similar projects, or simply to see how the objects can be combined or used separately. The four OpenCV-based objects require the OpenCV framework to be installed in order to work.
For those four OpenCV-based objects (the first four in the list above), you can find install guides and in-depth details on the OpenCV wiki at http://opencv.willowgarage.com/wiki/Welcome.
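As a rough illustration of what sl.undistpoint does with the internal camera properties: lens distortion is modeled radially, and undistorting a pixel means inverting that model. The sketch below uses a simple two-coefficient Brown radial model and fixed-point iteration, in plain NumPy (OpenCV's own routines handle the full model; the coefficient values here are made up for the example):

```python
import numpy as np

def distort(p, k1, k2):
    # Forward Brown radial model on normalized image coordinates:
    # p_d = p_u * (1 + k1*r^2 + k2*r^4), with r^2 = x^2 + y^2
    r2 = p @ p
    return p * (1 + k1 * r2 + k2 * r2 * r2)

def undistort_point(pd, k1, k2, iters=20):
    # Invert the model by fixed-point iteration: repeatedly divide the
    # distorted point by the distortion factor at the current estimate.
    pd = np.asarray(pd, dtype=float)
    p = pd.copy()
    for _ in range(iters):
        r2 = p @ p
        p = pd / (1 + k1 * r2 + k2 * r2 * r2)
    return p

pu = np.array([0.3, -0.2])            # an undistorted normalized point
pd = distort(pu, -0.2, 0.05)          # apply made-up distortion coefficients
print(undistort_point(pd, -0.2, 0.05))  # converges back to (0.3, -0.2)
```

For the mild distortion typical of calibrated cameras the iteration converges in a handful of steps; the tracking pipeline runs this on each detected LED before the coordinates are handed on to the transform and intersection stages.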
We also plan to release the source code as soon as we get around to cleaning it up.
Please let us know if you have any comments, questions, or suggestions. This is pretty much raw and uncut from our own process, and we will be documenting and streamlining the patch and objects over the coming months.
-Lasse Knud Damgaard & Sune Hedegaard Hede