Thesis project: 3D absolute position multiple LED tracking via Max + externals.
Over the past six months we have been working on our master's thesis project, experimenting with mixed-reality gaming in large-scale, head-tracked environments. It is based on Max/Jitter and Unity3d. You can check out demos of two prototype games here: http://vimeo.com/14723732 and http://vimeo.com/14722990.
For the head-tracking we have developed a number of new C objects for Max. They enable us to track infrared LEDs mounted on the players' stereoscopic glasses. The tracking uses Unibrain Fire-i cameras and covers an area of about 50 square meters. All seven new objects relate to camera tracking in absolute coordinates, and four are based on functionality from OpenCV. We hope some of you will find them useful. Here is an overview:
sl.camintrinsics – intrinsic (internal) camera calibration
sl.jit.camextrinsics – extrinsic (external) camera calibration
sl.jit.undist – undistort a matrix using the intrinsic parameters
sl.undistpoint – undistort pixel coordinates using the intrinsic parameters
sl.transform – transform coordinates to another coordinate system
sl.vintersect – position markers in 3d space using vector intersection
sl.players – group and track position of player markers
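For readers curious what the vector-intersection step amounts to, here is a minimal NumPy sketch of the underlying geometry (this is not the sl.vintersect source, which has not been released; the function name and signature are our own for illustration). Each camera gives a ray from its optical center through the detected LED; with real-world calibration noise the two rays are skew rather than truly intersecting, so the marker position is commonly estimated as the midpoint of the shortest segment between them.

```python
import numpy as np

def intersect_rays(p1, d1, p2, d2):
    """Estimate a 3D point from two rays p1 + s*d1 and p2 + t*d2.

    Returns the midpoint of the shortest segment between the rays,
    which coincides with the exact intersection when the rays meet.
    """
    p1, d1, p2, d2 = (np.asarray(v, dtype=float) for v in (p1, d1, p2, d2))
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b          # zero only if the rays are parallel
    s = (b * e - c * d) / denom    # closest-approach parameter on ray 1
    t = (a * e - b * d) / denom    # closest-approach parameter on ray 2
    return ((p1 + s * d1) + (p2 + t * d2)) / 2.0

# Two cameras at (0,0,0) and (4,0,0) both sighting a marker at (2,3,0):
marker = intersect_rays([0, 0, 0], [2, 3, 0], [4, 0, 0], [-2, 3, 0])
```

With the extrinsic calibration giving each camera's position and orientation, and the undistorted pixel coordinates giving the ray direction, this is all that is needed to place a single marker in 3D.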
Using these objects together with cv.jit, we have created a complete Max/Jitter tracking application named Xyzled. It is tailored to our specific purpose but could be used for similar projects, or simply to see how the objects can be combined or used separately. The four OpenCV-based objects require the OpenCV framework to be installed in order to work.
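As a rough illustration of the coordinate-transform step in a pipeline like this (hypothetical code, not the sl.transform source): a point measured in one coordinate system, say a camera's, is mapped into a shared world system with a rotation matrix R and a translation vector t, and back again with the inverse rigid transform.

```python
import numpy as np

def to_world(R, t, p_cam):
    """Map a point from camera coordinates into world coordinates."""
    return np.asarray(R) @ np.asarray(p_cam, dtype=float) + np.asarray(t, dtype=float)

def to_camera(R, t, p_world):
    """Inverse mapping: world coordinates back into camera coordinates."""
    R = np.asarray(R)
    return R.T @ (np.asarray(p_world, dtype=float) - np.asarray(t, dtype=float))

# Example: a camera rotated 90 degrees about the z-axis, offset by (1, 2, 0).
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([1.0, 2.0, 0.0])
p_world = to_world(R, t, [1.0, 0.0, 0.0])
```

Because rotation matrices are orthonormal, the inverse rotation is simply the transpose, which keeps the round trip cheap when many markers are transformed per frame.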
Documentation is sketchy at the moment, and there are of course a number of things we would like to improve. In the meantime, you can check out the prototype goodies here: http://dl.dropbox.com/u/2465463/slpack_and_xyzled.zip
With respect to the four OpenCV-based objects (the first four in the list above), you can check the OpenCV wiki for installation guides and in-depth details: http://opencv.willowgarage.com/wiki/Welcome.
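For the curious: undistorting a pixel coordinate (the job an object like sl.undistpoint performs) typically means inverting the radial distortion model that comes out of the intrinsic calibration. A minimal sketch of the standard fixed-point approach follows; the coefficient values k1 and k2 are made up for illustration, and this is a conceptual sketch of the technique, similar in spirit to OpenCV's iterative point undistortion, not the external's actual code.

```python
def distort(xy, k1, k2):
    """Apply a two-coefficient radial distortion model to normalized
    image coordinates (x, y), i.e. pixel coords with the camera matrix
    already divided out."""
    x, y = xy
    r2 = x * x + y * y
    f = 1.0 + k1 * r2 + k2 * r2 * r2
    return (x * f, y * f)

def undistort(xy_d, k1, k2, iterations=10):
    """Invert the radial model by fixed-point iteration: repeatedly
    divide the distorted coordinates by the distortion factor computed
    from the current estimate."""
    xd, yd = xy_d
    x, y = xd, yd                      # initial guess: the distorted point
    for _ in range(iterations):
        r2 = x * x + y * y
        f = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / f, yd / f
    return (x, y)

# Round trip with assumed coefficients: distort a point, then recover it.
p = (0.3, 0.2)
recovered = undistort(distort(p, -0.2, 0.05), -0.2, 0.05)
```

For the mild distortion typical of consumer lenses the iteration converges in a handful of steps, which matters when the correction runs per detected LED per frame.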
We also plan to release the source code as soon as we get around to cleaning it up.
Please let us know if you have any comments, questions or proposals. This is pretty much raw and uncut from our own process, and we will be documenting and streamlining the patch and objects in the coming months.
-Lasse Knud Damgaard & Sune Hedegaard Hede
Hi there, I am trying to do something similar for my undergrad project; I want an OpenGL object to respond to my head turning. I've been working on a Mac, but I have a PC. Looking forward to hearing how you are getting on. Your display looks awesome!
Any news about your project? Did you manage to release or improve the code?