Dec 15 2015 | 2:46 am
    Hi guys - I'm pretty new to Max so this may be quite a simple idea.
    I've built an additive synthesizer and I'm looking to use the jit objects (for video-based colour detection) as a means of triggering user-specified notes on a kslider. I found a thread on here from a while back where someone managed to implement red, green and blue colour detection; my problem, however, is that I'm not sure how to build colour detection for other colours such as yellow / purple / orange etc. I have tried numerous different ideas, like obtaining the RGBA values from an exact colour - however, when I implement these into my patch it doesn't work. I have also tried switching from RGBA to ARGB (for the jit.findbounds object), but this also had no effect.
    Are there set values for particular colours that can be implemented, or is there a more complex issue concerning how the colour values need to be translated for the jit.findbounds object?
    Any help on this would be really appreciated - I'm all out of ideas at the moment, and so were my tutors!

    • Dec 15 2015 | 3:27 am
      Detect the color of what exactly? A single pixel? A shape? A blob? The average of a given region? This can range from trivial to not-so-trivial, depending on what you are doing. For reference, the eight corner colors of the RGB cube - red, green, blue, white, cyan, magenta, yellow and black - correspond to the eight possible 3-bit binary numbers, 111 down to 000 (e.g. yellow = 110). For specific color names and values, check out a standard color-name list.
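      That bit-pattern correspondence can be sketched in plain JavaScript (this is just an illustration of the naming convention, not a Max API; the function name is made up):

```javascript
// The eight "corner" colors of the RGB cube, indexed by the 3-bit
// number formed from the red, green and blue flags (rgb order).
const names = ["black", "blue", "green", "cyan",
               "red", "magenta", "yellow", "white"];

// r, g, b are each 0 or 1 (channel fully off or fully on).
function cornerColor(r, g, b) {
  return names[(r << 2) | (g << 1) | b];
}

cornerColor(1, 1, 0); // → "yellow" (red + green)
cornerColor(1, 1, 1); // → "white"
```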
    • Dec 15 2015 | 3:54 am
      In this instance a colour card (approx. 10cm height, 5cm width). I'm basically trying to build a patcher that plays a note/notes upon detecting a particular colour, then ceases to play when the colour is no longer present.
    • Dec 15 2015 | 3:57 am
      (Also thanks for getting back to me so quick!)
    • Dec 15 2015 | 5:06 am
      Okay, to clarify... it's a physical card and you want to read the color in Max using a camera... and then compare the value to a specific set of colors for the closest match.
      The first thing that comes to mind is lighting. You can get VERY different values depending on your light source, whether the card is tilted, there is glare, etc etc. For best results, the cards should be read at a fixed distance with identical lighting. The more control you have, the more accurate your results. A simple card holder for fixed distance and lighting that is near the card should work fine. Beware of background lights, spots, etc. hitting the card when it's being read.
      In terms of Max.. you can use a simple color difference calculation (CIE delta E 1976) to calculate the perceptual "distance" between two colors. Sounds fancy but it's easy. In this case, you might have a [coll] object with a list of reference colors and use the camera input as your sample. Dump the coll values, comparing each one with your sample. Grab the color with the lowest delta E value. There are a few more details I left out, but that's the basic idea. I can post some tools to make it easier if this jibes with your overall plan.
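      For reference, CIE delta E 1976 really is just the Euclidean distance between two colors in L*a*b* space. A minimal sketch in plain JavaScript (something like this could run inside a Max [js] object):

```javascript
// CIE delta E 1976: Euclidean distance between two L*a*b* colors.
// lab1 and lab2 are arrays of the form [L, a, b].
function deltaE76(lab1, lab2) {
  const dL = lab1[0] - lab2[0];
  const da = lab1[1] - lab2[1];
  const db = lab1[2] - lab2[2];
  return Math.sqrt(dL * dL + da * da + db * db);
}

// Identical colors have a distance of 0; larger values mean the
// colors are more perceptually different.
deltaE76([53.2, 80.1, 67.2], [53.2, 80.1, 67.2]); // → 0
```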
    • Dec 15 2015 | 5:21 am
      Yeah it's a physical card.
      If you wouldn't mind that would be really helpful. Thanks!
    • Dec 15 2015 | 7:06 am
      You still have a lot to work out in terms of implementation but here's a start...
      1. [coll] to store reference colors.
      2. [jit.grab] for the video image and [jit.3m] to get the average RGB value.
      3. Dump the [coll] to compare each value with the video sample.
      4. Convert all colors to L*a*b*.
      5. Calculate the color difference between each reference and the sample.
      6. The lowest delta E value is the closest reference match to the sample.
      Be sure to position the camera and/or crop the image so the card color fills the entire frame - otherwise you will be averaging background colors. You can use [jit.submatrix] to do that.
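      The conversion-and-compare part of those steps can be sketched in plain JavaScript (the reference palette and function names here are illustrative, standing in for the [coll]; the sRGB-to-L*a*b* math uses the standard D65 formulas, and the modern syntax would need adapting for the ES5 JavaScript in Max's [js] object):

```javascript
// Convert an sRGB color (0-255 per channel) to L*a*b* (D65 white point).
function srgbToLab(r, g, b) {
  // sRGB to linear RGB
  const lin = [r, g, b].map(v => {
    v /= 255;
    return v <= 0.04045 ? v / 12.92 : Math.pow((v + 0.055) / 1.055, 2.4);
  });
  // linear RGB to XYZ
  const x = lin[0] * 0.4124 + lin[1] * 0.3576 + lin[2] * 0.1805;
  const y = lin[0] * 0.2126 + lin[1] * 0.7152 + lin[2] * 0.0722;
  const z = lin[0] * 0.0193 + lin[1] * 0.1192 + lin[2] * 0.9505;
  // XYZ to L*a*b*
  const f = t => t > 0.008856 ? Math.cbrt(t) : 7.787 * t + 16 / 116;
  const fx = f(x / 0.95047), fy = f(y / 1.0), fz = f(z / 1.08883);
  return [116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)];
}

// CIE delta E 1976: Euclidean distance in L*a*b* space.
function deltaE76(a, b) {
  return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
}

// Illustrative reference palette (this would live in the [coll] in Max).
const refs = {
  red: [255, 0, 0], yellow: [255, 255, 0],
  purple: [128, 0, 128], orange: [255, 165, 0]
};

// Return the name of the reference color nearest the sampled RGB value.
function closestColor(r, g, b) {
  const sample = srgbToLab(r, g, b);
  let best = null, bestDist = Infinity;
  for (const [name, rgb] of Object.entries(refs)) {
    const d = deltaE76(sample, srgbToLab(...rgb));
    if (d < bestDist) { bestDist = d; best = name; }
  }
  return best;
}

closestColor(250, 250, 10); // → "yellow"
```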
    • Mar 31 2020 | 3:40 pm
      I know it was ages ago that you posted this, but I think you can help, Metamax!! I am working on a project that I think your example above could help with, with some tweaking... Would you know how to modify that patch to actually show the color that is closest to the video sample? So instead of picking a color manually, the info from the video sample would pick the closest one from the coll color list for you?
    • Apr 10 2020 | 8:43 pm
      The zl.sub object does the heavy lifting for you.
    • May 15 2020 | 10:49 pm
      I am another person this has helped in 2020, thanks Metamax! I used this as a base for identifying colour from a TCS34725 RGB colour sensor. I added a bunch of other features to adjust for sensitivity, specific to this sensor. I think when I have time I'll post it either on my website or on GitHub. And I'll definitely include a link to this thread to reference where the initial idea came from.