3D Stereoscopic real time anaglyph generator

    Nov 17 2009 | 1:34 am
    Hi all. I'm looking to build an on-set field monitor for shooting 3D stereoscopic movies. We're fine viewing the 3D as an anaglyph (http://www.3dtv.at/knowhow/AnaglyphComparison_en.aspx), and the refresh rate doesn't need to be full 60i or 24p. It's just for monitoring while setting up the cameras and the shots.
    Disclaimer: I'm completely new to Max/MSP/Jitter, haven't read the Jitter tutorials, and am just on the demo license for now. (I'm primarily an After Effects and Nuke artist.) But I pick things up fast. I will get to the tutorials, but I learn coding best from relevant examples.
    Searching these forums seems to indicate that a number of folks here have developed similar projects to combine two cameras into (something like) an anaglyph.
    I would be honored and overjoyed if anyone could share some of their project code for such a task.
    Thanks in advance, John

    • Nov 17 2009 | 10:55 pm
      Here's a bare-bones patch that demonstrates how to take the input from two cameras and combine them into an anaglyph, complete with two presets to swap the cameras.
      Of course, you can add lots more stuff to re-register the two streams, change the size, saturation, etc., but this is the basic concept.
      The other approach would be to venture into OpenGL territory, in which case you would map the two video streams onto jit.gl.videoplane objects, and use jit.gl.sketch with the OpenGL commands `glColorMask` and `glClear` to draw one stream into the red channel, then the other into the blue and green channels - not difficult, but if you're just getting started, I'd recommend keeping it simple.
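      The patch-based combine described above boils down to per-pixel channel selection: the red channel comes from the left camera, green and blue from the right. Here's a minimal sketch of that idea in plain Python (frames represented as nested lists of RGB tuples, a stand-in for what a Jitter patch would do with matrix operators; the frame data below is made up for illustration):

```python
# Combine two RGB frames into a red/cyan anaglyph:
# red from the left eye, green and blue from the right eye.
# Frames are lists of rows of (r, g, b) tuples, values 0-255.

def combine_anaglyph(left_frame, right_frame):
    return [
        [(l[0], r[1], r[2]) for l, r in zip(lrow, rrow)]
        for lrow, rrow in zip(left_frame, right_frame)
    ]

# Tiny 1x2 example frames (illustrative values)
left = [[(200, 10, 20), (100, 50, 60)]]
right = [[(5, 150, 250), (15, 80, 90)]]
print(combine_anaglyph(left, right))
```

      In Jitter the same effect falls out of splitting each stream's planes and repacking them; the Python version just makes the channel routing explicit.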
    • Mar 29 2010 | 6:40 am
      I need to build a stereoscopic 3D player that takes two 720i streams with audio and converts them dynamically into a single anaglyph image with color balancing.
      The algorithm for creating the anaglyph target image: the target red channel is computed from the RGB values of the left-eye source image via the following equation:
      Anaglyph.Red = (0.3 * LeftEye.Red) + (0.6 * LeftEye.Green) + (0.1 * LeftEye.Blue)
      Anaglyph.Green and Anaglyph.Blue are just the right eye's green and blue channels, unmodified.
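      The channel mixing above can be sketched in a few lines of plain Python (a hypothetical per-pixel helper; a real player would run this per frame on the GPU or with matrix operators):

```python
def anaglyph_pixel(left_rgb, right_rgb):
    """Color-balanced anaglyph per the equations above:
    red is a luminance-weighted mix of the left eye's RGB;
    green and blue come straight from the right eye."""
    lr, lg, lb = left_rgb
    red = 0.3 * lr + 0.6 * lg + 0.1 * lb
    return (round(red), right_rgb[1], right_rgb[2])

# Example pixel: left eye (100, 200, 50), right eye (10, 120, 240)
print(anaglyph_pixel((100, 200, 50), (10, 120, 240)))
```

      Weighting the left eye's red by its luminance (rather than passing it through raw) is what gives the "color balancing" described here: it keeps perceived brightness closer between the two eyes.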
      We'd need this to be displayable full-screen with the standard pause, stop, play, rewind, etc. controls. We'd also need a simple "scripting" protocol (probably XML, though we're open to alternatives) that would let us tell the player: at frame xxx, turn off the color balancing (as described above), display just the left or right eye, turn the color balancing back on, and so forth.
      It seems like this should be really simple for Jitter to do based on the demos, but I don't have the time to get up to speed to do this right.
      I'd be willing to pay someone to create this player for us which needs to be released on both Mac and Windows.
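      The scripting protocol described above could be as simple as a cue list. Here's one possible shape for it (the schema, tag names, and frame numbers are all hypothetical, just to make the idea concrete), parsed with Python's standard xml.etree:

```python
import xml.etree.ElementTree as ET

# Hypothetical cue-list schema: each <cue> fires at a frame number
# and switches the display mode and/or the color balancing.
SCRIPT = """
<cues>
  <cue frame="120" mode="anaglyph" balance="off"/>
  <cue frame="240" mode="left"/>
  <cue frame="360" mode="anaglyph" balance="on"/>
</cues>
"""

def load_cues(xml_text):
    root = ET.fromstring(xml_text)
    return [
        (int(cue.get("frame")), cue.get("mode"), cue.get("balance"))
        for cue in root.findall("cue")
    ]

for cue in load_cues(SCRIPT):
    print(cue)
```

      The player would then watch the current frame counter and apply the next cue as playback passes it; a missing `balance` attribute (as on the second cue) would simply mean "leave it as is".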
    • Mar 29 2010 | 7:25 pm
      You are right, it can be done. Please contact me at concept [a] maybites [d] ch
    • May 31 2010 | 8:23 pm
      Did you get any response on this? I can do this in real time on a laptop, but I am looking for an 'all-in-one' ultraportable field monitor, not a computer. I need some kind of portable muxing box to blend the two camera feeds... thanks