3D stereoscopic real-time anaglyph generator
Looking to build an on-set field monitor for shooting 3D stereoscopic movies. We’re fine viewing the 3D as an anaglyph (http://www.3dtv.at/knowhow/AnaglyphComparison_en.aspx), and the refresh rate doesn’t need to be full 60i or 24p. It’s just for monitoring while setting up the cameras and the shots.
Disclaimer: I’m completely new to Max/MSP/Jitter, haven’t read the Jitter tutorials, and am just on the demo license for now. (I’m primarily an After Effects and Nuke artist.)
But I pick things up fast. I will get to the tutorials, but I learn coding best from relevant examples.
Searching these forums suggests that a number of folks here have developed similar projects that combine two cameras into (something like) an anaglyph.
I would be honored and overjoyed if anyone could share some of their project code for such a task.
Thanks in advance,
Here’s a bare-bones patch that demonstrates how to take the input from two cameras and combine them into an anaglyph, complete with two presets for swapping the cameras.
Of course, you can add lots more on top of this: re-register the two streams, change the size, saturation, etc., but this is the basic concept.
The other approach would be to venture into OpenGL territory, in which case you would map the two video streams onto a jit.gl.videoplane and use jit.gl.sketch with the OpenGL commands "glcolormask" and "glclear" to draw one stream into the red channel, then the other into the blue and green channels. It’s not difficult, but if you’re just getting started, I’d recommend keeping it simple.
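To make the channel-mask idea concrete, here’s a minimal Python sketch (not Max/Jitter code): glColorMask amounts to a per-channel write mask, so drawing the left stream with only the red channel writable and the right stream with only green/blue writable is equivalent to this per-pixel merge. Frames are toy lists of (r, g, b) tuples.

```python
def mask_composite(left_frame, right_frame):
    """Take R from the left eye, G and B from the right eye,
    mimicking two masked draw passes."""
    return [(l[0], r[1], r[2]) for l, r in zip(left_frame, right_frame)]

# Toy two-pixel "frames"
left = [(200, 50, 50), (10, 20, 30)]
right = [(100, 150, 250), (40, 80, 120)]
print(mask_composite(left, right))  # [(200, 150, 250), (10, 80, 120)]
```

In the OpenGL version, the masking happens on the graphics card at draw time, so there is no per-pixel loop on the CPU at all.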
----------begin_max5_patcher---------- 1144.3oc2Y10abhCEF95Y9UXMWmMkiMeLrWTojahVoU6tJp8plpHFv6T5Blo fQI6V0+6KXimgoEH1CSbhpzDC1br487f8gCNec4hUaJdjVsB8qnOfVr3qKWr PzTaCK5puXUdziwYQUByVwnOTr4yqtPdIN8Qtn4rhnjbZUEBTWhUmWTyynbQ +ftV2Ewi+TJa68kzXt79R7tz4Bjnv2osDBtzA8wtNjlHF+l64u3pF5+tfwYQ 4TwU9853zjHzMkQrDpxB4cl+u6nx6wpU6Gv1NWk9ehKD1b61q1T1dw1112Vt rs3BM4RbQdNkw+AvT8PzNzUu45gwhy3XAKXAVxEew4vvbg.5Cl9dOfe9c++7 8u6ud+6dq4duvuAPbHfHbe7vtON7Dcexyu6+a+wI48x0AAhRuvIc90udc9qt ic6cratisexeVJiFWTyDW20TpDJlLHKwth0EqGgJAuhoBftEP2.nqAzUXzsX zMXz0Ximi.x.DAxEJDQDBXLd3eh7v47wickzJJeX2z8od6fqnjDL4RA2Qi9q t2nUoLwgiaX+nsodylL5g3iC56JsJFh6Sh3Qp2et2yk8YCsrWGZZ6fsfSiiH UMoUE4Q7xzGi4kYs0xRqZ0li3Gzqr06UmPNtcn2OrpExwcAqrGqZDqrgzaXA 0IX0Ixq5prwUYiqxFWkMdJa7T13orwSYiuxFekM9Js4qrIPYSfxl.03DHEI5 iRrJmANL5wyC8NlfdGSPOXB5AE5ASPOXB5cLA8NGPu7LCiDzm3CEM.+jA8B6 B54nRNZv3AjQiGz8TVdbeeKXz2TV7P+E4Q07h8gC5uvmwi3oEr9RSpoQUV6n GWjUmy5OXyNr5H4iW13yzRzZj6vL1ahTujAaAY5Gfp1HPFOmjxu369yjjzCO ev5yo7KeHkkz7r2zIjcvBCxb0ljUf+LX0lH1ViYDbdYztn3+4TIDDJmN0jhk 2DHBlAhZj38xXKFCJ2yKnpYiipIV4QH8WxI+pmQIkyYiT5W6kbx2IxTru9LM 7mQj1tqLQao+.SK1QYluWMcyPcDGHXwZYugo4ZascM3WZpfWqOUBdMSEdw1s YTy8e4ZL4mp0suUNC69di5bG8gXOWAR9RNkWVzntS0Gkkd9SEFgL2Wy+b+Xd hnrege41xnMF+VdUfA49144NYNiuLuiGdcPJ0KjzgTvKOoD8Srwce2+l.gq0 19w3qpntLVogtcsGcv4RnU7Tl3am5YC9Xi9TZRBk0e+uxSS1UzDgnSCi7Ymm SI4YlhbcaC5ikaTsenpxLEpmFBkXUz0lf9SJo1OzwhZBSzYJF1LMAXwCQv2S 7rzycesYp1PcEKXM.FdzcaBIgspjv5HIhUkDQGI4ZuUiN5NYxytZRqYS91KB AVmnVfciZg0fSBMA1USXczD1tZhnilH1cNtVgmBrqlzJ9zZ68rSqj+HmRpVg fLK5PUk4hOMTZnUCPDnSLKyTDdsLoFLHRpw+PsYpUx4WqycpmNJxwpRZsNKF rphHmeE0eRUS7HxHSwZp7sk+OM8fuSC -----------end_max5_patcher-----------
I need to build a stereoscopic 3D player that takes two 720i streams with audio and converts them dynamically into a single anaglyph image with color balancing.
The algorithm to create the anaglyph target image is that the target red channel is produced by processing the RGB values of the left-eye source image through the following equation:
Anaglyph.Red = (0.3 * LeftEye.Red) + (0.6 * LeftEye.Green) + (0.1 * LeftEye.Blue)
Anaglyph.Green and Anaglyph.Blue are just the right eye’s green and blue channels, without any modification.
We’d need this to be displayable full-screen with the standard pause, stop, play, rewind, etc. controls. It should also have a simple "scripting" protocol (probably XML, though we’re open to other alternatives) that would let us tell the player, at frame xxx, to turn off the color balancing (as described above), display just the left or right eye, turn the color balancing back on, and so forth.
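As a starting point for that scripting protocol, here is one possible shape for a frame-keyed XML cue list and a loader for it. The element and attribute names (`cues`, `cue`, `frame`, `action`) are my own invention for illustration, not an established format; Python’s standard-library parser handles it without extra dependencies.

```python
import xml.etree.ElementTree as ET

# Hypothetical cue-list schema: each <cue> fires at a given frame number.
SCRIPT = """
<cues>
  <cue frame="120" action="color_balance" state="off"/>
  <cue frame="240" action="show_eye" which="left"/>
  <cue frame="360" action="color_balance" state="on"/>
</cues>
"""

def load_cues(xml_text):
    """Return {frame: (action, remaining attributes)} for each cue."""
    root = ET.fromstring(xml_text)
    cues = {}
    for cue in root.iter("cue"):
        attrs = dict(cue.attrib)
        frame = int(attrs.pop("frame"))
        cues[frame] = (attrs.pop("action"), attrs)
    return cues

cues = load_cues(SCRIPT)
print(cues[240])  # ('show_eye', {'which': 'left'})
```

The player’s frame clock would then just look up the current frame number in this dict each tick and dispatch any matching cue.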
It seems like this should be really simple for Jitter to do based on the demos, but I don’t have the time to get up to speed to do this right.
I’d be willing to pay someone to create this player for us; it needs to be released on both Mac and Windows.
You are right, it can be done. Please contact me at concept [a] maybites [d] ch
Did you get any response on this? I can do this in real time on a laptop, but I am looking for an ‘all-in-one’ ultraportable field monitor, not a computer. I need some kind of portable muxing box to blend the two camera feeds…