Xfade in OpenGL / GPU

Dec 4, 2006 at 6:58pm

Hi All,

I’m trying to build a patch to use my new Mac Pro as my main VJ rig. I have a patch that works fine for now, but I’m getting more and more into Jitter and I’m trying to find out how to fade between movies using the GPU instead of jit.xfade, which runs mostly on CPU power. My Mac Pro has three 256MB video cards, and I want to hook it up to multiple beamers for widescreen projection.

I’m still learning every day and have read a lot on the forum about the GPU, but I can’t find the right info. If someone can help me out or point me in the right direction…. THANX!!!
The patch is also here so you can see what goes on inside my head. And if someone can use it, go ahead!

Cheers,
Harmen

Dec 4, 2006 at 7:10pm

Search the examples. There are plenty of GPU crossfade solutions in
the examples folder of the documentation, using alpha with
blend_enable or with shaders.
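
For the shader route, the core of a slab crossfade is only a few lines of GLSL. The sketch below is illustrative rather than one of the shipped examples: the uniform and sampler names (xfade, tex0, tex1) are assumptions, and in a real patch the fragment program would sit inside a .jxs wrapper loaded into jit.gl.slab, with the two movies feeding the slab's two inlets.

// hypothetical crossfade fragment program for jit.gl.slab
// (names xfade, tex0, tex1 are placeholders, not a shipped shader)
#extension GL_ARB_texture_rectangle : enable

uniform float xfade;          // 0.0 = first movie only, 1.0 = second movie only
uniform sampler2DRect tex0;   // texture from the slab's left inlet
uniform sampler2DRect tex1;   // texture from the slab's right inlet

void main()
{
    vec4 a = texture2DRect(tex0, gl_TexCoord[0].st);
    vec4 b = texture2DRect(tex1, gl_TexCoord[1].st);
    gl_FragColor = mix(a, b, xfade);   // linear blend, computed on the GPU
}

Driving the fade then comes down to sending the slab a message such as "param xfade 0.5", so the mix never touches the CPU.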

On Dec 4, 2006, at 1:59 PM, Harmen van ‘t Loo wrote:

> I’m trying to build a patch to use my new Mac Pro as my main VJ
> rig. I have a patch that works fine for now, but I’m getting more
> and more into Jitter and I’m trying to find out how to fade between
> movies using the GPU instead of jit.xfade, which runs mostly on CPU
> power. My Mac Pro has three 256MB video cards, and I want to hook
> it up to multiple beamers for widescreen projection.

v a d e //

http://www.vade.info
abstrakt.vade.info
