
How to intercept video going to jit.window for use with Matrixset?

April 6, 2012 | 7:15 pm

Hi all. This is my first time posting to the forum, but I have gained a lot of information here over the years.

I would post a patch, but it’s a bit too tangled, and even if I cleaned it up it would be a rather complex sift for anyone. So I will describe my particular goal as best I can, and hopefully get to the bottom of the best method.

Basically I made an app for the monome called 64(video)fingers. It allows you to trigger multiple video clips and to record and play back patterns based on your button presses. It works pretty well, and I have recently expanded it to show multiple video layers at once (up to 5), depending on how many buttons you have pressed.

One function it is lacking is the ability to switch banks of video clips smoothly while clips play back. It’s understandable that the preset switch disrupts playback, since it creates a spike in disk activity. So my idea for a solution is to use the Matrixset object to record a timed loop of the video into RAM, so that it can (hopefully) play back smoothly while the new bank of video clips is being loaded into its slots.

The obstacle that I am running into:

I ultimately want to record what is being rendered to my named jit.window object. I am currently sending multiple jit.gl.videoplanes to the jit.window (i.e. the multiple layers of video). I need to intercept that feed in order to get it through a Matrixset object before it ends up in the window object.

One avenue I decided to take was, instead of rendering to a jit.window object, to render to a jit.matrix object.

Problem is:

1 – I was getting an error "jit.gl.texture: error attaching texture to readback mechanism for capture!"
2 – it also said it was switching to "using software renderer", which I believe means that the GPU is no longer being used to process the video. (Not sure if this is entirely true, but I think that may be what it means.)

So I guess my ultimate question is: what is the best way to intercept the video feed going to the jit.window in order to record it? I know I ultimately have to send the video through an asyncread object to rasterize it before it goes into the Matrixset, but how do I intercept that final feed before it gets to the jit.window?

I hope I’ve done my best to illustrate my concept here. I really appreciate the help!

Charlie


April 6, 2012 | 10:25 pm

Or I guess another way of asking it: how would I collect all the videoplanes onto one surface before sending that to the jit.window?



dtr
April 7, 2012 | 8:56 am

Use jit.gl.node and set it to render to texture. Send the texture to 1) a recording matrix and 2) another videoplane spanning the whole render window.

Btw, doesn’t asyncread do what you want: grab the render output while still displaying it in the window?
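
Roughly, the two routes would look something like this (just a sketch, not a tested patch — the context names, dims and frame counts are placeholders you’d adjust):

    route A – render the layers to a texture with jit.gl.node:
        [jit.gl.videoplane subctx] x5               <- the video layers, drawn into the node's sub-context
        [jit.gl.node ctx @name subctx @capture 1]   <- renders the sub-context to a texture
           |
           +-> [jit.gl.videoplane ctx]              <- displays the captured texture across the whole window
           +-> [jit.matrix 4 char 320 240]          <- reads the texture back into a matrix
                  |
               [jit.matrixset @matrixcount 150]     <- RAM loop recorder

    route B – grab the framebuffer with jit.gl.asyncread:
        [jit.gl.render ctx] -> [jit.window ctx]     <- display stays exactly as it is now
        [jit.gl.asyncread ctx]                      <- outputs a matrix of what was just rendered
           |
        [jit.matrixset @matrixcount 150]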


April 8, 2012 | 9:17 am

Oh snap! asyncread does work! Doh. Headsmack.

Matrixset is recording and playing back beautifully; unfortunately the video still stops playing back when I load new preset banks, even though it’s playing from RAM.

I believe it has something to do with my metro objects freezing while the pattr objects go to work replacing all the files.

Hmm, I will have to research preset changes and metros freezing.

Thanks for the help dtr!


April 8, 2012 | 3:54 pm

you really don’t want to load videos while you are playing back.
there is very little you can do to eliminate the stall in playback while reading from the disk.

your best option is to have all your videos "pre-loaded" using something like poly~

how are you currently managing your videos?


April 9, 2012 | 3:04 am

@Robert, yeah, you’re absolutely right. I’ve been thinking about this since I learned that all the metros were freezing while loading up.

I knew about poly-for-movies back in the day when I first made this app, but for reasons dealing with how I designed the pattern recorders to, sort of, dictate the video layering hierarchy, I couldn’t get it to work with poly~.

Now that I’ve added multi-screen functionality, the layer hierarchy is simpler and based only on a group number you assign to each video clip. There’s a total of 5 groups, so that might now coincide pretty smoothly with 5 different poly~ objects that I can patch into. So yeah, I think I’m finally going to change the patch over to poly~.

The way it works now is with 56 instances of a "slot" bpatcher, each of which contains both a jit.qt.movie object for the video clip and a groove object for the audio clip. You drag and drop the files over the slots to load in the video and audio. So whenever you change the preset, it’s completely understandable that the transport freezes, because 56 jit.qt.movies and 56 groove objects are being loaded with new content.

If it were poly~-based, all that would be changing, I believe, is the name of the video clip in the message box that’s being sent for poly~ to call up.
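
Something like this is what I’m picturing (all the names and messages here are hypothetical, since I haven’t built anything yet):

    [poly~ moviecell 56]                 <- "moviecell" would be the slot abstraction, one voice per slot
    messages to its left inlet, e.g.:
        target 3, read clipA.mov         <- preload a clip into voice 3's jit.qt.movie
        target 3, start                  <- later, tell voice 3 to start playing
        target 3, stop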

I do have one question, although perhaps the info is already out there. Is there a limit to the number of files you can have in the folder that poly~ is referencing? If I have hundreds of video and audio files, would that cause me problems in working with poly~?

Thanks a lot for all the help!


April 9, 2012 | 3:14 am

you’re only limited by memory. once max gets to around 2GB, things will start getting wonky.


April 9, 2012 | 5:54 am

Awesome! Thanks for the help. I’m also excited to see what kind of performance boost I may get from switching to poly.


April 10, 2012 | 7:39 am

Robert, I have been back researching here on the forums and have come up with a couple more questions, if you wouldn’t mind giving me your best advice. I certainly appreciate it.

The first concern deals with the idea of having five duplicate poly~ objects, each loaded with the same folder of clips. You see, I’m hoping for 5 discrete video outputs (one from each of the 5 duplicate poly~ objects) to be sent into my routing patch, which arranges each of the video streams into their multi-screen configurations. My fear is that with 5 duplicate poly~ objects each loaded with the same folder of clips, I might run into memory issues, because, I believe, I would have loaded each video clip into memory multiple times.

I see that you have recommended using the Forward object in conjunction with the multi-head Jitter example in other forum posts.

I guess I’m just having trouble visualizing how to use this multi-head patch with a poly~ object. I imagine I should keep the Forward object, as well as the metro/counter/jit.qball objects, outside of the poly~ object. Or would you keep the Forward object inside the poly~, as there would only be one "Frank" output at a time and one "George" output at a time? (Using the naming scheme from the multi-head example here.)

Also, is this the best method for outputting five separate video streams derived from the same folder of clips?

I apologize if my questions are too convoluted. I haven’t started patching yet; I’m trying to devise my plan of action at this point.

Thanks!


April 10, 2012 | 2:40 pm

Hey, Charlie, good ta see ya.

(The forward object would go in the poly~… but you don’t need jit.qball, you can just drive the movie from within poly~ with an internal qmetro… and you don’t necessarily need forward either (I found it adds a CPU hit for some reason when used in this attached patch, so I took it out and simply gate out of poly~ directly)… better to show an example…)

I’ve been meaning to demo you a polymovie version of 64fingers but never got around to making it ’til now. Attached here is a poly~ demo that simulates 64fingers using matrixctrl (I took the ideas from the polymovie and multi-head examples and combined them in a way…).
I also recommend having a way to drop a folder of 64 files all at once… that way people can just arrange the names in the folder, without having to drop them one at a time… that’s in this patch, too. (The matrixctrl will go through the folder from start to finish, ordered on its cells running left to right and then top to bottom… hope that makes sense…)
The randomizer in this patch will demo it for you, picking a random movie out of 64 and sending it out a random channel of the 5 (turning off movies appropriately to conserve CPU).
But this is just a starter; you can easily work in different channels for each movie individually by focusing on the "target" message to poly~. (By being specific you can alter this patch so that different movies play out different channels at the same time… you may even want to alter it so that it doesn’t send ’0′ to stop the previous movie, as opposed to setting them all at once and stopping previous plays as I’ve done.)

Oh, and you should leave the option "pause movies when not viewed" ON at all times in this patch (I should’ve just taken the option out and perma-initialized it within the poly~, but oh well…).

Remember to keep the total size of the folder of movies you drop in less than 2GB! (even less than that if you intend to drop audio separately ;)

hope it helps.


April 10, 2012 | 3:42 pm

Whoa! Thanks Raja! I will check this out today. Yeah, I have been planning on changing it to work with a single folder full of clips, but with even more than 64. The goal is to be able to change presets on the fly. I planned on being able to select the audio and video clips via a dropdown menu from the slot. The only reason I’ve made the audio and video clips separate is to be able to send the audio to different tracks in Ableton. With jit.qt.movie the audio is hard-wired to the DAC, and spigot~ never seemed to work so well.

I am definitely super excited to check out this patch, but I have to head off to work, so I’ll have to wait! Damn!!!


April 10, 2012 | 4:23 pm

if you don’t need individual movies playing to multiple simultaneous outputs unsynchronized, then it’s simply a matter of a single poly~ instance with 5 outputs, and a single qt.movie going to a matrix with 1 input and 5 outputs.

if you do need that ability, then you will have to come up with a more involved solution. the forward trick may help you out here.

it’s definitely worth the effort in patching if you have 100s of videos, to avoid the duplicate loading, IMO.
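
roughly like this inside each voice (just a sketch of the 1-in/5-out routing idea, all names are placeholders, not raja’s actual patch):

    [in 2]                                   <- channel number 1-5 for this voice (0 = closed)
       |
    [gate 5] <- right inlet gets the jit_matrix stream from this voice's [jit.qt.movie]
     |  |  |  |  |
    [out 1] ... [out 5]                      <- poly~'s 5 outlets, each feeding its own videoplane in the parent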


April 10, 2012 | 6:56 pm

Yeah, I think I will probably need the Forward functionality. I think Raja’s patch is just shifting a single video feed to 5 separate outputs, but I will definitely need 5 or 4 (depending on performance) individual streams going to separate videoplanes. Still, this patch has so much to learn from, especially in regard to how to trigger the poly~.

Thanks so much for doing that for me Raja!


April 10, 2012 | 9:01 pm

Ya, try what Rob said… I had trouble using this patch with forwards; the CPU started to spike past 100%. (If you give it a shot and post here, I bet someone else can figure out a way… posting patches gets things done quicker here… people can refer to specifics.) But for the patch I gave, you can also do it just fine by being specific with the target message to the 2nd inlet of poly~ and removing the extra part just above poly~ that keeps track of the most recent play (sending 0 to stop the previous movie every time a new one is triggered). If you removed this, for example, and then sent something like "target 1, 5" to the second inlet, you’d be telling voice 1 of poly~ to go out channel 5, and because you removed the previous-play-stop thingie, you could then send "target 2, 3" to send voice 2 of the poly~ out channel 3, while voice 1 would still be playing out channel 5. What I would do then is work out muting poly~ voices from within the poly~ (either the end of the movie would trigger stopping the qmetro and muting the voice, or, as a more involved way, you could stop it as soon as another voice is triggered to go out the same channel).
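
To spell that out, the kind of message sequence I mean (remember, the channel number landing in [in 2] is just how this particular patch uses it, not something built into poly~):

    "target 1, 5" to poly~'s 2nd inlet   <- voice 1 becomes the target; the 5 lands in its [in 2] and opens channel 5
    "target 2, 3" to poly~'s 2nd inlet   <- voice 2 now routes out channel 3, while voice 1 keeps playing out channel 5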

Anyways, so many ways to do this.

The thing to remember is that poly~ voices can be treated like individual instances, you just need to message them individually.

Hey, my pleasure.
Monome community! :D


April 10, 2012 | 9:47 pm

Oh, sweet! Thanks Raja. The good news is that my "slot" patch already has some of the functionality for sending plays and stops. I will just have to translate those messages into messages poly~ understands.

I’m sure I will get sufficiently lost in this once I actually begin patching.

Thanks again!

