Hello. I have a technical question about motion tracking that I'd like to share in this forum.
I'm working on an interactive installation where I would like visitors to interact with a video projection. The idea is to make an interactive floor (something like 'the lighting district' from Time's Up).
The tracking system is a Fire-i camera with an IR filter, plus a video projector, both fitted with fisheye lenses, mounted on the ceiling and pointing at the floor.
To simplify the understanding of the interaction, imagine that the installation works like this: the video projection draws a grid on the floor; when someone steps into one of the rectangles, the motion tracking system detects it and gives audiovisual feedback...
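To make the per-cell detection concrete, here is a minimal sketch of what I have in mind, assuming a grayscale IR frame as a numpy array. The function name, grid size, and the difference threshold are all hypothetical placeholders, not a real implementation:

```python
import numpy as np

def occupied_cells(prev_frame, frame, rows, cols, thresh=15.0):
    """Return the set of (row, col) grid cells where the mean absolute
    frame difference exceeds `thresh` (threshold value is a guess)."""
    diff = np.abs(frame.astype(float) - prev_frame.astype(float))
    h, w = diff.shape
    hits = set()
    for r in range(rows):
        for c in range(cols):
            # Slice out the pixels belonging to this grid cell.
            cell = diff[r * h // rows:(r + 1) * h // rows,
                        c * w // cols:(c + 1) * w // cols]
            if cell.mean() > thresh:
                hits.add((r, c))
    return hits

# Toy example: a 'visitor' appears in the top-left cell of a 2x2 grid.
prev = np.zeros((64, 64), dtype=np.uint8)
cur = prev.copy()
cur[4:28, 4:28] = 255
print(occupied_cells(prev, cur, 2, 2))  # -> {(0, 0)}
```

In a real installation one would probably use a running background model rather than the previous frame, but the per-cell structure would stay the same.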
My question is: what is the smartest way to design and program an 'auto-calibration system'?
In other words: how can the Fire-i recognize the squares, or interactive zones, that are projected? (Remember, the Fire-i is only sensitive to IR.)
Or: how can the motion tracking zones be made to correspond automatically to the video projection zones?
The installation is going to be permanent, so I need the system to recalibrate itself (the different motion tracking zones must stay aligned with the projected grid).
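One standard way to make camera zones correspond to projection zones is to fit a homography from projector coordinates to camera coordinates, given four known reference points (for example the corners of the projected grid, once they have been located in the IR image). This is only a sketch: the point values are invented, and it assumes the fisheye distortion has already been corrected or is negligible, which with real fisheye lenses it may well not be:

```python
import numpy as np

def fit_homography(src, dst):
    """Direct Linear Transform: fit a 3x3 matrix H so that
    dst ~ H @ src for four or more (x, y) point pairs."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The solution is the right singular vector of A with the
    # smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_h(H, pt):
    """Map a point through H, dividing out the projective scale."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return (p[0] / p[2], p[1] / p[2])

# Hypothetical data: projector-space grid corners, and where the
# camera saw the corresponding markers.
proj = [(0, 0), (1024, 0), (1024, 768), (0, 768)]
cam = [(42, 31), (598, 40), (610, 455), (35, 470)]
H = fit_homography(proj, cam)
print(apply_h(H, (512, 384)))  # projector centre, in camera pixels
```

With H in hand, every projected grid rectangle can be warped into camera coordinates once, and the recalibration reduces to re-detecting the four reference points from time to time.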
I have thought of different solutions. For example, using reflective 3M tape to mark the floor according to a fixed video projection area, so that the IR camera can recognize the markers and adjust the motion tracking grid.
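Under IR illumination the tape markers should show up as bright blobs, so locating them could be as simple as thresholding and taking intensity-weighted centroids. A minimal sketch, assuming one corner marker per image quadrant and an invented brightness threshold:

```python
import numpy as np

def corner_markers(ir_frame, thresh=200):
    """Find one bright marker per image quadrant by taking the
    centroid of pixels at or above `thresh` in each quadrant.
    Returns (x, y) centroids in TL, TR, BL, BR order."""
    h, w = ir_frame.shape
    quadrants = [(slice(0, h // 2), slice(0, w // 2)),
                 (slice(0, h // 2), slice(w // 2, w)),
                 (slice(h // 2, h), slice(0, w // 2)),
                 (slice(h // 2, h), slice(w // 2, w))]
    centroids = []
    for rs, cs in quadrants:
        ys, xs = np.nonzero(ir_frame[rs, cs] >= thresh)
        # Offset back into full-image coordinates.
        centroids.append((xs.mean() + cs.start, ys.mean() + rs.start))
    return centroids

# Toy frame with four 3x3 reflective-tape blobs.
frame = np.zeros((100, 100), dtype=np.uint8)
for y, x in [(10, 10), (10, 90), (90, 10), (90, 90)]:
    frame[y - 1:y + 2, x - 1:x + 2] = 255
print(corner_markers(frame))
```

The four centroids would then serve as the reference points for aligning the tracking grid with the projection (and the quadrant assumption would of course need checking against the actual marker layout).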
Or maybe I could use a second Fire-i without an IR filter to do the calibration work.
I hope my question is clear, and that it contributes to the discussions about motion tracking in this forum (I haven't found any info about solutions to this specific situation).