ganz graf vh mod 2-1 - m4l device modification tips needed

    Mar 16 2013 | 9:34 pm
    I'm currently prepping my Live 9 set to use the ganz graf vh mod device for realtime visuals. My question is: the only way to rotate the 3D plane/XY of the visuals is by moving the mouse over it to shift it around. I'm trying to figure out how to make that controllable by Live dials/macros like the other aspects of the device. Any thoughts would be wildly appreciated.
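    A hedged sketch of one way to approach this (the drawing object here is a guess, since I haven't opened the device, though "sample" is the context name the device reportedly uses): mouse rotation in GL patches typically comes from a jit.gl.handle, so you can bypass it and drive the drawing object's rotatexyz attribute from live.dial objects, which Live then exposes to macros. Note live.dial defaults to a 0-127 range, so set its parameter range to 0-360 for degrees:

    ```
    [live.dial "Rot X" 0..360]  [live.dial "Rot Y" 0..360]  [live.dial "Rot Z" 0..360]
            |                            |                            |
            +-----------------[pak rotatexyz 0. 0. 0.]----------------+
                                         |
                    [jit.gl.videoplane sample]   <- or whichever gl object draws the plane
    ```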

    • Mar 17 2013 | 12:11 am
      Ha! You again! Hope the edit I've made is working as expected...
    • Mar 18 2013 | 5:00 pm
      hey there! Nice. I've been fiddling with learning the GL context, rendering and such. I really want to have multiple video windows of this, and to send different Session View tracks to trigger visuals from percussive elements as well as synth/atmospheric sounds.
      But I can't seem to nail it down. Synnack emailed me this: "All of the gl objects are named "sample" in this device. So if you have two of the devices, you have two things rendering to one "sample" gl context."
      Any thoughts/tips on getting this device to have at least two instances/rendering windows in a Live set? Not sure if it'll even project properly.
    • Mar 18 2013 | 6:31 pm
      As of Max 6, you can avoid naming the rendering contexts and corresponding OpenGL objects, and it will simply find the first one in the same patcher. So you should be able to get this working by simply removing the name "sample" from all objects.
      Keep in mind, when using the @sync attribute (which is on by default for jit.window), that you may see slowdowns with multiple windows, and may wish to use one final output window which composites from other sources.
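      As a sketch of that compositing approach (assuming Max 6's jit.gl.node; the context names here are placeholders, not taken from the device): give each device its own capturing node so it renders to a texture instead of a window, then draw both textures in one final context:

      ```
      device A:  [jit.gl.node ctxA @capture 1]   <- outlet reports its capture texture name
      device B:  [jit.gl.node ctxB @capture 1]

      master patch (the only actual window):
        [jit.window final]   [jit.gl.render final]
        [jit.gl.videoplane final @blend_enable 1]  <- send it "texture <name from ctxA>"
        [jit.gl.videoplane final @blend_enable 1]  <- send it "texture <name from ctxB>", offset its @position
      ```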
    • Mar 18 2013 | 7:50 pm
      Thanks for that info... I actually went through (it was simple enough) and changed the GL context window names. So I have two of the same device, on Send A and Send B in Live 9, each with its own rendering window, and I can send any audio I want to trigger them. It looked solid, no glitching out or latency problems.
      Now I'll be testing it with a projector this week, to see if it'll look good for my upcoming show. Of course there will be an obvious two-window split, because I can't fullscreen both of them. I think it'll be sweet.
    • Mar 19 2013 | 9:37 am
      As Joshua wrote, why don't you mix the two rendering contexts in one window?
    • Mar 19 2013 | 5:31 pm
      "As Joshua wrote, why don't you mix the two rendering contexts in one window ?"
      that would be super amazing. being new to editing/building, i have no idea where to start for this to happen. and the learning process continues!! do you have any quick tips on the matter ? thanks for chiming in!
    • Mar 19 2013 | 7:35 pm
      There are massive old and new threads discussing rendering multiple GL sources to a single output... wowza
    • Mar 20 2013 | 5:02 am
      Today I stacked three of these devices one after the other on one audio track and wow... interesting. Technically, I could have one on the Master, which is first and so becomes the final visuals output, and then add, say, one on Send A and one on Send B, with different sounds going to each. Those would trigger different rendering reactions, which would all show up on the Master window.
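      A sketch of that routing (the shared context name is made up): since named GL contexts are global in Max, every device's gl objects can target one shared context, while only the Master device holds the render/window pair that actually draws:

      ```
      Master device (final output):
        [jit.gl.render ganz_master]   [jit.window ganz_master]

      Send A device:  gl objects renamed from "sample" to "ganz_master"
      Send B device:  gl objects renamed from "sample" to "ganz_master"
      ```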