Alright, I'm just going to try my hardest to be clear and understandable (and I appreciate your patience in advance...).
I've been putting together a live media project that I want to finish before the end of the semester (6 weeks and counting), so I've been learning the basics of OpenGL in Jitter, or trying to, anyway. Right now I'm experimenting with the different rendering destinations. That chapter of the tutorial makes sense to me, but I'm having trouble with rendering to a matrix.

What I'd like to do is use OpenGL together with a jit.qt.grab. At first I was going to texture the OpenGL geometry with the live grab, but I'm not sure how to do that: the OpenGL side of my patch is basically everything from the example patch called "plur", and the OpenGL texture tutorial doesn't show a way to use a live video feed as a texture. So now I'm just trying any possible way to combine the OpenGL output and the jit.qt.grab output, whether that's with mathematical blending operations, crossfading with jit.xfade, or whatever.

To do that, I think it's necessary to render the OpenGL scene into a matrix, right? Well, I can't seem to get the OpenGL output to go from the jit.window into a jit.matrix, and I don't know why. And also, the live grab (which comes from another patch I made specifically for my new FireWire camera, and which gives me a nice uncompressed image along with the camera driver settings) went to complete doo-doo when I sent it to a jit.matrix...
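In case it helps, here's roughly the patch flow I've been attempting, sketched as text. The object arguments and the names "plurctx", "grabtex", and "mymatrix" are just my own guesses/placeholders, so treat this as a sketch of the idea, not a working patch:

    ;; attempt 1: live grab as a GL texture
    [metro 33] -> [jit.qt.grab 320 240] -> [jit.gl.texture plurctx @name grabtex]
    ;; then send the message (texture grabtex) to a jit.gl object
    ;; in the same context, e.g. [jit.gl.gridshape plurctx]

    ;; attempt 2: render the GL scene into a matrix, then blend
    [jit.gl.render mymatrix]              ;; context named after the matrix
    [jit.matrix mymatrix 4 char 320 240]  ;; the render destination
    ;; matrix output + jit.qt.grab output -> [jit.xfade] -> [jit.window]

If either of these is close to the right approach, or completely wrong, I'd love to know.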
So! I'm not sure whether that's possible or not. I'd love it if somebody shared a method for using a live video feed and OpenGL together, or at least pointed me in the right direction! The only reason I plan on using the plur example is, well, because it's awesome, number one, but also because building even simple polygons and shapes like that from scratch, ones that can be controlled and moved and so on, seems like it would take some time... time that I don't really have right now. So thanks so much for any feedback!