
How do jit.gl objects manage multiple GPUs?

May 14, 2013 | 11:11 am

I have a Mac Pro that I use for color grading work with DaVinci Resolve. That software required me to configure the machine with two GPUs: one that drives the computer displays, and a more powerful one that the software uses for the color work. In my case I used a simple GT 120 to drive the displays, and a Quadro FX as the workhorse processing unit (no screens plugged into it).

Just recently I began using this machine for some performance work as well, using Jitter, and I'm curious how the jit.gl objects deal with multiple GPUs. Do they see only one? And if so, which one? Can I force Jitter to use the more powerful one, or does it already do this on its own? Also, where should I be plugging in my projector(s): the Quadro's DVI ports, or the GT 120's?

thanks!



Nat
May 14, 2013 | 11:28 am

I’d be curious to know that also as I plan on building a rig soon.


May 15, 2013 | 4:39 pm

Max/Jitter is one of many possible users/customers of OpenGL on your computer. Max/Jitter does not at this time support OpenCL, CUDA, or other GPU general-computing standards.

On your computer, there is likely a method for you to define which video card is used for a given display and/or program. It is this configuration that determines which GPU OpenGL uses, and therefore which one Max/Jitter uses.
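On OS X, one way to see which GPU is driving which display is `system_profiler SPDisplaysDataType` in Terminal. Below is a rough Python sketch that parses that report into a GPU-to-displays map; the sample output embedded here is illustrative only (not from the machine in this thread), and the exact field layout can vary between OS versions, so treat this as a starting point:

```python
# Sketch: map each GPU to the displays attached to it by parsing
# `system_profiler SPDisplaysDataType` output (OS X). The SAMPLE text
# below is illustrative only -- run the command yourself for real data.

SAMPLE = """\
Graphics/Displays:

    NVIDIA GeForce GT 120:

      Chipset Model: NVIDIA GeForce GT 120
      Bus: PCIe
      Displays:
        Cinema HD:
          Resolution: 2560 x 1600
          Main Display: Yes

    NVIDIA Quadro FX 4800:

      Chipset Model: NVIDIA Quadro FX 4800
      Bus: PCIe
      Displays:
"""

def displays_per_gpu(report):
    """Return {chipset_model: [display_names]} parsed from the report text."""
    gpus = {}
    current = None        # chipset model of the GPU section we are inside
    in_displays = False   # True once we've passed a "Displays:" header
    for line in report.splitlines():
        stripped = line.strip()
        if stripped.startswith("Chipset Model:"):
            current = stripped.split(":", 1)[1].strip()
            gpus[current] = []
            in_displays = False
        elif stripped == "Displays:":
            in_displays = True
        elif in_displays and stripped.endswith(":") and ":" not in stripped[:-1]:
            # A display-name line like "Cinema HD:" -- distinguished from a
            # new GPU header by its deeper indentation.
            if current and len(line) - len(line.lstrip()) >= 8:
                gpus[current].append(stripped[:-1])
    return gpus

if __name__ == "__main__":
    for gpu, displays in displays_per_gpu(SAMPLE).items():
        print(gpu, "->", displays or "no displays attached")
```

With the sample above, the GT 120 shows one attached display and the Quadro shows none, which matches the kind of setup described in the first post.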

The GT 120 is a roughly three-year-old product supporting OpenGL 3.0 with an 8.8 GTexel/s texture fill rate.
The Quadro FX might be better or worse; I can't say without knowing the exact model, because there are many FX variants.
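For what it's worth, that 8.8 figure follows directly from the card's specs: texture fill rate is just the number of texture mapping units (TMUs) times the core clock. The GT 120 numbers below (16 TMUs, 550 MHz) are the commonly quoted spec-sheet values, not figures from this thread:

```python
# Texture fill rate = TMUs x core clock.
# GT 120 figures below are commonly quoted spec-sheet values
# (assumption, not taken from this thread).
tmus = 16
core_clock_mhz = 550

fill_rate_gtexels = tmus * core_clock_mhz / 1000.0
print(f"{fill_rate_gtexels:.1f} GTexel/s")  # prints "8.8 GTexel/s"
```

The same arithmetic applied to a given Quadro FX model's spec sheet is the quickest way to compare the two cards.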

