Is there a way to "auto-chromakey" a real time camera feed?

Jan 12, 2010 at 8:07pm

This is my first post. I’ve been working with Max for about four months. I’m very new to it.

I’ve been working with jit.chromakey, but rather than feeding it a pre-recorded .mov, I’m using jit.qt.grab for a live camera feed, with a YouTube video as the fill, so that the green or blue can be keyed out and whatever is chosen on YouTube appears in its place. The key color will be worn by the user, rather than being the background.

Right now, the way it is set up, the user must be at the computer to first click the [suckah] object to select which color to key, as well as to choose their YouTube video. I’m not so concerned about the YouTube portion as I am about the user not having to key out the color manually.

I’m wondering which route I need to go. I’m relatively new to Max so I’m really not sure.

Any help anyone could give would be extremely helpful. Thank you!


I would also be happy to post the patch if that would be more helpful to visualize. Thanks again!

Jan 6, 2011 at 1:20am

Hey Stella -

I’m new to Max and just wondering if you ever solved this problem?
I’m just working my way through all the Jitter tutorials, and haven’t had any success yet chromakeying (using suckah) with a live video input over a FireWire connection.
Once I figure that out, I, like you, will want to preset the key colour rather than need a user at the computer to click…

Would appreciate any help!

Jan 6, 2011 at 10:07am

Suckah just outputs an RGB list describing the colour you have clicked on; connect a [print] object to it and you will see. In the case of Jitter Tutorial 10 on chromakeying, it outputs "64 108 255" to describe the blue. Just put that list in a message box and use a loadbang to send it automatically when the patch is opened.
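To make the preset-key idea concrete, here is a hedged sketch (plain Python, not Max code) of roughly what a chromakey pass does with that RGB list: pixels within some tolerance of the key colour are replaced by the fill. The function names, tolerance value, and the Euclidean distance metric are illustrative assumptions, not jit.chromakey's exact algorithm.

```python
KEY = (64, 108, 255)   # the blue reported by suckah in Jitter Tutorial 10
TOL = 60.0             # hypothetical tolerance, in RGB distance

def is_key(pixel, key=KEY, tol=TOL):
    """True if this pixel is close enough to the key colour to be replaced."""
    dist = sum((p - k) ** 2 for p, k in zip(pixel, key)) ** 0.5
    return dist <= tol

def chromakey(frame, fill):
    """Replace key-coloured pixels in `frame` with the matching `fill` pixels.

    Both arguments are flat lists of (R, G, B) tuples of the same length.
    """
    return [f if is_key(px) else px for px, f in zip(frame, fill)]
```

For example, with a two-pixel frame where the first pixel is the key blue, only that pixel gets swapped for the fill: `chromakey([(64, 108, 255), (200, 30, 30)], [(0, 0, 0), (0, 0, 0)])` returns `[(0, 0, 0), (200, 30, 30)]`. In the patch itself, the equivalent of presetting `KEY` is the loadbanged message box feeding jit.chromakey.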

Jan 7, 2011 at 10:02am

You can also tell [suckah] a screen coordinate and it will spit out the color at that pixel. So if you know where your preview jit.pwindow is, and approximately where the person will be standing in the frame, you can “auto-click” the [suckah] to grab the color you want. This way, you can have it sample the color at any time, use a delay before grabbing, etc.
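The "auto-click" idea above amounts to sampling a pixel at a known coordinate yourself instead of waiting for a mouse click. A minimal sketch, assuming the frame is stored as a row-major flat list of RGB tuples (all names and the frame layout are illustrative, not Jitter's internal format):

```python
WIDTH = 320  # assumed frame width in pixels

def sample_key(frame, x, y, width=WIDTH):
    """Return the RGB colour at (x, y), to be used as the chromakey key.

    `frame` is a row-major flat list of (R, G, B) tuples.
    """
    return frame[y * width + x]
```

In Max terms, this is the coordinate message you would send to [suckah], and the delay before grabbing could be a [delay] or [metro] upstream of that message.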

Jan 11, 2011 at 10:00pm

Or if you want faster frame rates, use OpenGL with a chromakey shader. This is what I do in my app ChromaKey Live. (Source patch included in the download here: )

This doesn’t solve your automatic shirt-color recognition problem, but the other responses provide nice advice.

One non-obvious technique that I used is to read back a temporary low-res matrix from GL space to a Jitter matrix when the user clicks their colour on the preview window. This is my DIY replacement for suckah (which won’t work in GL).

