Splitting ARGB planes

    Apr 12 2010 | 5:11 am
    I'm new to Max and am working on my first project. I'd like to take a live webcam feed and split the ARGB channels so that I can view each plane individually in its own jit.pwindow within the patch. From there, the dominant colour will generate tones. The problem is that I can't seem to split the ARGB channels after I get the webcam feed in. I've applied jit.matrix to the webcam feed and then unpacked it, then tried calling the individual planes with another jit.matrix object, which doesn't seem to work. I've also tried using jit.findbounds, which doesn't work for me either.
    I don't know how to post the Max patch within my post here, so I've attached it as a separate doc. If anyone could help, that would be great.

    • Apr 12 2010 | 12:08 pm
      The way it is now, I'd remove the matrix (looks like you're resizing to a one-dimensional matrix with 4 planes and a width of 16). Then, after jit.unpack, you should have 4 separate planes. I've never used findbounds.
      You can post using Copy Compressed from the Edit menu, then just paste into the browser.
    • Apr 12 2010 | 2:29 pm
      hey there...I've actually made something similar, if not identical, to what you're trying to make..check it out: http://vimeo.com/953967
      It does some math on the different R, G and B values and determines which color is there..but I decided to go super simple and stick with RGB..doing intermediate colors (purple, yellow, etc.) would require much more accuracy with both camera white balance settings and other calibration..my installation frequently falls out of calibration with even a slight color change. I split my image into 8 'lanes' so that each area triggers a different note..blue is low notes, green is middle and red is high
      you'll want to split everything into the 4 planes like you already have..then send all of those into separate jit.3m objects and probably use the 'mean' output to get the average color coming in..then stare at the numbers you're getting out of that
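      Not Max code, but here's roughly what jit.3m's 'mean' output gives you per plane, sketched in Python (the frame layout and helper name here are just illustrative assumptions):

```python
# Hypothetical sketch of what jit.3m's "mean" outlet reports per plane:
# the average of every cell in each plane of a char matrix (values 0-255).
def plane_means(frame):
    """frame: list of rows, each row a list of (a, r, g, b) tuples."""
    cells = [px for row in frame for px in row]
    n = len(cells)
    return tuple(sum(px[i] for px in cells) / n for i in range(4))

# A tiny 2x2 test frame: mostly-red pixels.
frame = [[(255, 200, 10, 10), (255, 180, 30, 20)],
         [(255, 220, 20, 10), (255, 200, 20, 40)]]
a_mean, r_mean, g_mean, b_mean = plane_means(frame)
# r_mean comes out at 200.0 here, well above g_mean and b_mean,
# which is the kind of number you'd "stare at" to pick the dominant color
```

      In the patch this is just jit.unpack into four jit.3m objects; the sketch only shows the arithmetic behind the 'mean' number you'd read off each one.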
      if you wanted to re-pack the color planes..you can just use jit.pack..but if you haven't quite gotten unpacking down..that's just jit.unpack
    • Apr 13 2010 | 2:41 am
      laserpilot: when you speak of these lanes, are you talking about jit.matrix? I'm not quite understanding the lanes...
      I managed to get the RGB values split and did some math on the values, but I'm not sure my scaling is correct. There might be a flaw in my math, or possibly my logic! Anyone wanna take a peek?
    • Apr 13 2010 | 3:29 am
      By lanes I mean I segmented my video, using jit.scissors and jit.glue, so that I could figure out the dominant color on smaller segments as opposed to just the overall image.
      I don't understand why you have those zmaps there...they just convert one way..and then convert right back? Seems really unnecessary. I've never actually used zmap..I'm so used to scale..but I guess zmap is nice for lists.
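      For what it's worth, here's a rough Python analogy of what jit.scissors does when you cut a frame into 8 lanes (the function name and frame layout are hypothetical, not anything from the actual patch):

```python
# Rough analogy to jit.scissors with 8 columns: slice each row of the
# frame into 8 equal-width "lanes" so each lane can be analyzed on its own.
def split_lanes(frame, n_lanes=8):
    width = len(frame[0])
    lane_w = width // n_lanes
    return [[row[i * lane_w:(i + 1) * lane_w] for row in frame]
            for i in range(n_lanes)]

# A dummy 4x16 frame where each "pixel" is just its (row, col) coordinate.
frame = [[(r, c) for c in range(16)] for r in range(4)]
lanes = split_lanes(frame, 8)
# 8 lanes, each 4 rows tall and 2 pixels wide
```

      Each lane would then feed its own jit.3m, so each screen area can trigger its own note.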
      I'm also not sure why you're adding together the min, max and mean values for all of the colors. On the one hand I can't think through how it would hurt..but on the other hand I can't see how it would help.
      All of those histograms and extra jit.pwindows seem unnecessary for debugging as well..or at least, I would take them out when it comes to presentation time..they'll blast your fps like mad.
      Here is how I figure out my values for my inverse color organ..let's see if I can keep it making sense:
      My program works by making a different octave sound based on what dominant color it sees in each of the 8 segments of the screen. (I do the following 8 times for each note area) It takes the feed from grab, sends it into jit.3m, then just unpack from the mean value (the average value...). I ignore the alpha channel.
      Then I send the R, G and B values into a series of if statements for each color. This ensures that two notes do not sound at the same time; only the dominant one sounds. For example, when deciding about blue, I have to test whether it is greater than green and greater than red separately. Each test that passes sends a 1, and each that fails sends a 0. I then add those 0's and 1's together and send the sum into a [select 2 1 0]: if select sees a 2, a note sounds, and if it sees a 0 or 1, nothing happens. This is all repeated for green and red, where you likewise test (in two separate if statements...or that's how I do it) whether the mean value is greater than both of the other ones.
      I ran into problems when I tried to compute things like 'is green greater than red and blue' in one step, because it's difficult to scale properly..it's easier to just ask 'is green greater than red' and 'is green greater than blue'..if both are true..sound the green note!
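      The comparison logic above, sketched in Python (the function is a hypothetical stand-in for the chain of [if] objects feeding [select 2], not actual Max code):

```python
# Sketch of the "two comparisons per color" logic: each color runs two
# 0/1 tests against the other two planes, and only a score of 2
# (the [select 2] case) fires a note.
def dominant(r, g, b):
    """Return the color whose mean strictly beats both others, else None."""
    for name, val, others in (("red", r, (g, b)),
                              ("green", g, (r, b)),
                              ("blue", b, (r, g))):
        score = sum(1 for o in others if val > o)  # the two 0/1 if-tests
        if score == 2:                             # [select 2] -> note sounds
            return name
    return None                                    # 0 or 1: nothing happens

# dominant(200, 20, 20) gives "red"; a three-way tie gives None, so no note
```
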
    • Apr 13 2010 | 3:31 am
      oh also..you'll most likely need a way to offset your webcam's tendency to auto color balance..they like to try and adjust their white balance, which will throw everything off
      my program has a 'tolerance' addition that offsets a camera's color tendencies. In addition, I have a way to normalize everything and say 'this color..what you're seeing right now..is white..so R, G and B are all equal'..that is essential, or else you'll start off with unbalanced values that are too hard to predict
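      One way such a 'this is white' calibration could work, sketched in Python (the gains and function names are assumptions for illustration, not the actual patch):

```python
# Capture the mean R, G, B while the camera looks at a white reference,
# then scale later readings so that reference would read equal on all planes.
def make_normalizer(white_r, white_g, white_b):
    target = (white_r + white_g + white_b) / 3.0
    def normalize(r, g, b):
        return (r * target / white_r,
                g * target / white_g,
                b * target / white_b)
    return normalize

norm = make_normalizer(240.0, 200.0, 220.0)  # the camera's idea of "white"
r, g, b = norm(240.0, 200.0, 220.0)
# after normalization the white reference reads equal on all three planes
```

      With the readings balanced like this, the greater-than comparisons for picking the dominant color start from a predictable baseline.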
    • Apr 13 2010 | 4:16 am
      hey thanks, this is really helpful information. I've cut out the unnecessary stuff that you mentioned. I like your point about the white balance, so I'm going to try to work something into the patch.
      Really appreciate all your help. This is my first Max patch, so I'm kind of learning the program as I go (hence all the newb mistakes) :)