Hi – I’m wondering how I might pull the real-time contents of a separate Mac application’s window into Jitter. I have a dual-screen setup for a performance I’m working on.
On one screen is a compiled openFrameworks app that plays videos and uses my camera; this is being projected onto a large screen. The second screen is my laptop’s, where I have my Max patch so I can preview what is happening on the projector and augment the video feed with things like chromakeying.
I would like to display what the openFrameworks app is doing within Jitter, and potentially add some effects like chromakey to that feed. Is this possible in Max? Can anyone point me toward the right objects? I was thinking of starting with something like jit.gl.videoplane or a jit.pwindow, or some sort of OpenGL operation to capture the window contents straight from the graphics hardware.
Can you tell me exactly how you pulled the video into Max? (I already have Syphon and the Jitter objects, but I haven’t been able to make it work.) I’m trying to get the video from any kind of broadcasting app (VLC, QuickTime Broadcaster, even Skype…) into Max.
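In case a sketch helps anyone following this thread: one common route is the Syphon for Jitter externals, with the sending app publishing a Syphon server (an openFrameworks app can do this with the ofxSyphon addon). The object chain below is an untested outline, not a working patch dump; the context name `ctx` and the `@servername` value are placeholders you would replace with your own render context and the name your sender actually publishes:

```
[jit.world ctx]                                      render context + output window;
   |                                                 its left outlet bangs every frame
[jit.gl.syphonclient ctx @servername YourServer]     receives the shared texture on each bang
   |  (jit_gl_texture message out the left outlet)
[jit.gl.videoplane ctx @transform_reset 2]           draws the texture filling the window
```

For the chromakey step, you could try inserting a [jit.gl.slab ctx] loaded with a chromakey shader (Max ships one, e.g. via a `read cc.chromakey.jxs` message) between the syphonclient and the videoplane, so the keying happens on the GPU before the texture is drawn.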