processing Quake with Jitter in real-time

genfu's icon

Okay,
so I've got some basic MIDI control over Quake. What I want to do next is process the game's graphics in real time using Jitter.

Is this realistically possible? I have a 2 GHz Core 2 Duo laptop running Windows XP with 2 GB of RAM. Would I be able to run the game and process the video on the same machine, or would I realistically need to run the game on one machine and send the video output to another computer to process the graphics in real time?

How would you go about this, in terms of feeding the visual output of the game to Jitter, and generally?

Any pointers appreciated...

yair reshef's icon

are you using http://ccrma.stanford.edu/~rob/q3osc/ ?
If it were just control I would say do it on another system and control via
OSC over the network, but for graphics I would use jit.desktop on a dual-monitor setup.
You didn't specify your graphics card, but just test and see whether performance is acceptable.
Don't run it full screen; start at 640x480 and go from there.
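A rough sense of why starting at 640x480 matters: capture cost scales with pixel count. A back-of-envelope sketch (assuming 4 bytes per ARGB pixel and a 20 fps capture rate, both illustrative numbers rather than measured figures):

```python
# Rough estimate of raw capture bandwidth at different resolutions.
# Assumes 4 bytes per pixel (ARGB) and a 20 fps capture rate -- both
# illustrative assumptions, not numbers from this thread.

def capture_mb_per_sec(width, height, fps, bytes_per_pixel=4):
    """Raw, uncompressed bytes moved per second, in megabytes."""
    return width * height * bytes_per_pixel * fps / 1e6

low = capture_mb_per_sec(640, 480, 20)     # suggested starting point
full = capture_mb_per_sec(1280, 1024, 20)  # a typical fullscreen mode of the era

print(f"640x480   @ 20fps: {low:.1f} MB/s")
print(f"1280x1024 @ 20fps: {full:.1f} MB/s")
print(f"fullscreen moves {full / low:.1f}x more data")
```

The point is only that fullscreen capture pushes several times the data of a windowed test, so stepping the resolution up gradually is a cheap way to find the machine's limit.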

ico's icon

Theoretically this is possible, but depending on what exactly you are trying
to do, it may take a considerable amount of effort to retrofit Quake for
this purpose. Since Quake is traditionally an OpenGL program, you could
employ the render-to-texture approach and then expose the memory allocated for
texture capture to Jitter. This is by no means trivial, but it is doable...
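The idea of exposing captured frame memory to another application can be sketched, in modern terms, as a shared-memory hand-off. This is a conceptual Python sketch of the data flow only (Quake itself is C, and Jitter would need a matching reader such as a custom external); the frame size and pixel values are illustrative:

```python
# Conceptual sketch of sharing a rendered frame between two processes
# via a named shared-memory block -- the same idea as exposing Quake's
# render-to-texture memory to Jitter. Sizes and values are illustrative.
from multiprocessing import shared_memory

W, H, CHANNELS = 640, 480, 4  # an ARGB frame
size = W * H * CHANNELS

# "Game" side: allocate the block and write a frame into it.
shm = shared_memory.SharedMemory(create=True, size=size)
shm.buf[:size] = b"\x80" * size  # pretend this is the rendered scene

# "Jitter" side: attach to the same block by name and read the pixels.
reader = shared_memory.SharedMemory(name=shm.name)
pixel0 = bytes(reader.buf[:CHANNELS])  # first ARGB pixel
print(pixel0)  # -> b'\x80\x80\x80\x80'

reader.close()
shm.close()
shm.unlink()
```

The real work in the Quake case would be on the C side (copying the texture into the shared block each frame) and on the Jitter side (wrapping the block as a matrix), but the hand-off itself is this simple.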

If you are simply looking into importing a Jitter texture into a game engine,
we're about to release the Max-Unity3D interoperability toolkit, which does
exactly that. FWIW, Unity3D is also light years ahead of Quake in terms of
sophistication and features...

Hope this helps!

Ivica Ico Bukvic, D.M.A.
Composition, Music Technology
Director, DISIS Interactive Sound & Intermedia Studio
Assistant Co-Director, CCTAD
CHCI, CS, and Art (by courtesy)
Virginia Tech
Dept. of Music - 0240
Blacksburg, VA 24061
(540) 231-6139
ico@vt.edu
http://www.music.vt.edu/faculty/bukvic/

Jack Stenner's icon

On Feb 8, 2009, at 1:05 PM, Ivica Ico Bukvic wrote:

> we're about to release Max-Unity3D interoperability toolkit which does
> exactly that.

Now THAT'S exciting to hear!

Jack

genfu's icon

Quote: yair r. wrote on Sun, 08 February 2009 10:58
----------------------------------------------------
> are you using http://ccrma.stanford.edu/~rob/q3osc/ ?
----------------------------------------------------

No, I've seen that, but I'm using the original Quake, and playing around with some of the updated engines like DarkPlaces and Fitzquake.

I've got MIDI being transferred from Max/MSP/Jitter to button presses which the game recognises as inputs, and those buttons are bound to change parameters, so for example I can oscillate the FOV and, in theory, manipulate other game parameters from Max.
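The mapping described above boils down to scaling a 0-127 controller value into a cvar range, plus an LFO for the oscillation. A minimal sketch of the arithmetic (the FOV range, CC value, and rate are illustrative numbers, and in practice this scaling would live inside the Max patch):

```python
import math

def cc_to_range(cc_value, lo, hi):
    """Scale a 0-127 MIDI CC value into the range [lo, hi]."""
    return lo + (cc_value / 127.0) * (hi - lo)

def oscillated_fov(t, base_fov, depth, rate_hz):
    """An LFO on the fov value: base +/- depth, at rate_hz cycles per second."""
    return base_fov + depth * math.sin(2 * math.pi * rate_hz * t)

# e.g. a CC at full value mapped onto a hypothetical 10-170 degree FOV range
print(cc_to_range(127, 10, 170))         # -> 170.0
print(oscillated_fov(0.0, 90, 40, 0.5))  # -> 90.0 (sine starts at zero)
```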

The idea was then also to get some feedback from Quake going back to Max/MSP to indicate basic game events (perhaps by getting Max/MSP to read the game's log file?).
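The log-file idea could be prototyped along these lines; note that the line patterns below are guesses at Quake-style console messages, not a documented format, and the real patterns would depend on the engine build:

```python
import re

# Illustrative patterns for Quake-style console log lines. The actual
# log format varies by engine, so treat these regexes as placeholders.
EVENT_PATTERNS = {
    "frag":   re.compile(r"was .+ by"),      # e.g. "Player was blasted by Foe"
    "pickup": re.compile(r"got the|You get"),
    "death":  re.compile(r"dies|died"),
}

def classify(line):
    """Return the event type for a log line, or None if it matches nothing."""
    for event, pattern in EVENT_PATTERNS.items():
        if pattern.search(line):
            return event
    return None

sample = [
    "Player was blasted by Shambler",
    "You get 25 shells",
    "map start",
]
print([classify(l) for l in sample])  # -> ['frag', 'pickup', None]
```

In Max the equivalent would be polling the log file and routing each classified event to a different part of the patch.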

I then wanted to apply graphical changes to the 'final image' of Quake that you see displayed (as if a live feed of the game were sent to Jitter for post-processing), much like this Street Fighter video, which gave me the idea:
http://www.zacharyseldess.com/streetfightervids.html

Then of course, in theory, the changes could be linked to information from the game, and to the changes I impose on the game via the control I have in Max/MSP. The end goal is to make an interactively psychedelic version of Quake, which isn't just a graphics filter, since I can do things like slow down the game in tandem with, say, blurring the graphics... so, broadly speaking, it doesn't just make it difficult to see what's going on.

genfu's icon

thanks for the responses

How would I go about getting the video from Quake into Jitter? Is there some way for Jitter to recognise Quake, via the jit.dx.grab object or something like it? Can this be done with Jitter alone, or would Quake need to be modified somehow in order to send the video?

genfu's icon

Oops, missed the mention of jit.desktop there before I asked that... I think I'll give that a go for starters and see how it goes. Thanks!

ico's icon

As another person pointed out earlier, the only way to do it without
major changes to the way Quake renders the scene is to use the jit.desktop
object and capture on-screen desktop pixels into a texture (something that
will in all likelihood incur considerable CPU overhead).

Ico

yair reshef's icon

Ivica, that Unity bridge sounds like a great project, hope it goes smoothly.

As for jit.desktop being a resource hog, that depends on the setup. My lowly
E4600 Intel XP desktop takes a 15-17% CPU hit capturing 800x600 at 20 fps.
Not that bad, and I've used better rigs with a lesser hit.

Sidenote: lately I needed to share a single video feed between EyesWeb and
Max, and I used a FreeFrame video streaming plugin in memory-sharing mode. If
you follow the instructions carefully it works great.
http://wiki.bigfug.com/FreeFrameVideoStreaming

ico's icon

> Sidenote: lately I needed to share a single video feed between EyesWeb and
> Max, and I used a FreeFrame video streaming plugin in memory-sharing mode. If
> you follow the instructions carefully it works great.
> http://wiki.bigfug.com/FreeFrameVideoStreaming

You're right, my criticism of jit.desktop was indeed too harsh. BTW, thanks
for pointing out the FreeFrameVideoStreaming project. This is indeed very
interesting!

Best wishes,

Ico

genfu's icon

Well, with an initial basic test, it appears to work (with jit.desktop):

Of course, piling on more effects may grind it to a halt.
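Whether stacking effects grinds things to a halt comes down to the per-frame time budget. A quick sketch of that arithmetic (the per-effect millisecond costs here are made-up placeholders, not measurements):

```python
def fps_after_effects(base_frame_ms, effect_costs_ms):
    """Resulting frame rate once each effect's per-frame cost is added."""
    total_ms = base_frame_ms + sum(effect_costs_ms)
    return 1000.0 / total_ms

# Hypothetical numbers: game frame 16 ms, blur 8 ms, feedback 5 ms
print(round(fps_after_effects(16, []), 1))      # -> 62.5
print(round(fps_after_effects(16, [8, 5]), 1))  # -> 34.5
```

Because the costs add up linearly while frame rate is the reciprocal, each extra effect bites harder than the last, which is why a setup that works with one filter can collapse with three.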