Real Time Video Projection Problem

Feb 6, 2010 at 5:44pm

I am thinking of using real-time video tracking and projection to present a piece in which the dancer's movement is analyzed by the computer in real time and animated content is projected on the screen behind the dancer. However, I noticed that doing real-time video tracking and projection in the same direction creates a "video feedback loop": the projected animation can also affect the video tracking result.

This YouTube video shows something I can't imagine:

http://www.youtube.com/watch?v=pS1WALmBqUw

They said they use an infrared camera, but I think an infrared camera only helps make things in the dark more visible. Does it also differentiate real people from projected content?

What are some tips for real-time video tracking and projection? Is there any way to let the computer analyze only the body movement of real people (without sensors) and ignore the projected content?

Thanks.

#48282
Feb 6, 2010 at 5:58pm

I think it works with body heat. The camera records only the body heat the dancer emits, and since the projection doesn't produce any heat, it doesn't show up.
I've seen Chunky Move in Berlin once and they are quite amazing. Unfortunately I didn't get a look at the screen of the visuals guy, but…

#173628
Feb 6, 2010 at 6:07pm

Actually it's not heat; it's infrared reflection to a camera that has an infrared filter on it. They only track in that spectrum, and they use a frame-differencing technique, so the lighting and projection don't interfere with the tracking. Two big infrared emitters overhead. This is actually the second time in three days that I have explained this on this forum. Frieder Weiss is brilliant; they have been using this method for about 5 years. I helped them set up the system for their show in New York. And they don't use Jitter for the tracking; they only use Max to drive the laser. The tracking is done in Frieder's software, Kalypso. It's written in Delphi and he sells it on his website.

http://www.frieder-weiss.de/
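In case a concrete example helps, here's a rough frame-differencing sketch in Python/OpenCV. This is not Kalypso or anything from their actual setup; the camera index, blur size, and threshold value are just guesses you would tune for your own IR camera and filter:

```python
# Rough frame-differencing sketch (Python + OpenCV), NOT Kalypso --
# just to illustrate tracking only what moves in the IR image.
import cv2

cap = cv2.VideoCapture(0)          # assumed: IR-filtered camera on index 0
ret, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
prev_gray = cv2.GaussianBlur(prev_gray, (9, 9), 0)

while True:
    ret, frame = cap.read()
    if not ret:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (9, 9), 0)

    # Difference between consecutive frames: only moving, IR-lit things survive.
    diff = cv2.absdiff(gray, prev_gray)

    # Threshold value (25) is a made-up starting point; dial it in per space.
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)

    cv2.imshow("tracking mask", mask)
    prev_gray = gray
    if cv2.waitKey(1) & 0xFF == 27:   # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

The key point is that with an IR-pass filter on the lens and IR floods overhead, the projection barely registers in the camera image in the first place, so the difference mask stays clean.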

#173629
Feb 6, 2010 at 6:19pm

Sorry, I just realized he took the info for Kalypso down from the website. He has his older software on there, called Eyecon. Kalypso is still available if you contact him, though. Also, there is a wealth of info on there about the tracking and what types of cameras and filters he uses.

#173630
Feb 7, 2010 at 1:26am

Hi, Piwolf,
I have read the article you mentioned and also tried the frame-difference patch (but with a regular camera).

I thought the screen itself also reflects infrared light, so the animation projected on the screen should also affect the frame-difference result. I have not yet had a chance to try an infrared camera, and I could not figure out why the animation on the screen would not affect the frame difference. Did they use some special screen that absorbs most of the infrared light? Could you explain more?

Thanks.

#173631
Feb 7, 2010 at 2:40am

When you put filters on the camera lens and use contrast and threshold to dial in the tracking mask, you can get a mask that isn't affected by the projection. You won't be able to do this with a normal camera because it is tracking in the same color space as the projection, our human eye space. I am sure they tried different body suits to find something that was far more reflective to infrared than the plain white floor. Also, the floor does still reflect some infrared, but it has a slightly different signature from the reflection of the unitard, and that's enough to differentiate well.

It's not a system that was assembled in a day; to get elegant results there is a fair amount of front-end tuning time on the cameras, not to mention some data smoothing on the tracking itself before it's used to generate animation. He has used that method for 6 years or more, and even though it went very smoothly it still took us a full 8-hour day to set the show up and focus the cameras and projection for the space. It's a solid undertaking to execute well.

I usually use video content rather than generated animation, which takes some programming time out of it for me, but it depends on what type of coding you normally do. If you have a bunch of shader patches lying around, they could easily be used with the tracking values, or if you have a bunch of video effects built, then run the mask as alpha over your content.
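To make that last part concrete, here's a rough Python/OpenCV sketch of the threshold-based mask, a simple smoothing of a tracked value, and the "mask as alpha over your content" idea. The threshold, smoothing factor, and file name are placeholders, not anything from the actual show:

```python
# Rough sketch, not the production setup: threshold an IR frame into a mask,
# smooth a tracked value, and composite video content through the mask.
import cv2

cam = cv2.VideoCapture(0)                  # assumed: IR-filtered camera
content = cv2.VideoCapture("content.mov")  # placeholder video content

smoothed_x = 0.0                           # smoothed horizontal position of the dancer
ALPHA = 0.2                                # smoothing factor -- tune to taste

while True:
    ok_cam, frame = cam.read()
    ok_vid, clip = content.read()
    if not (ok_cam and ok_vid):
        break

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Contrast/threshold: with an IR filter the dancer reflects far more IR
    # than the floor or the projection, so a simple threshold isolates them.
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)   # 200 is a guess

    # Data smoothing on a tracking value (here the centroid x) before it is
    # used to drive animation; smoothed_x is what you'd feed your visuals.
    m = cv2.moments(mask)
    if m["m00"] > 0:
        cx = m["m10"] / m["m00"]
        smoothed_x = (1 - ALPHA) * smoothed_x + ALPHA * cx

    # "Run the mask as alpha over your content": show the clip only where
    # the dancer is.
    clip = cv2.resize(clip, (mask.shape[1], mask.shape[0]))
    out = cv2.bitwise_and(clip, clip, mask=mask)

    cv2.imshow("output", out)
    if cv2.waitKey(1) & 0xFF == 27:   # Esc to quit
        break

cam.release()
content.release()
cv2.destroyAllWindows()
```

The exponential smoothing here is just one simple option; the point is that raw tracking values are usually too jittery to drive animation directly.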

#173632
Feb 7, 2010 at 4:46pm

Hi, Piwolf,

Thanks for the info and suggestions. I still have several questions:

1. There are several IR filters with different cutoff wavelengths (720 nm, 850 nm, 950 nm, etc.). Which best suits my project?

2. I am thinking of buying a surveillance camera for the project, but the filter sizes available on the market do not fit it. Is there any other good way to attach a filter to a surveillance camera?

Thank you so much for the help.

#173633
