The Video Processing System, Part 4

    In this installment of the Video Processing System, we're going to tackle two big hurdles that Jitter users often find themselves coming up against. The first thing we will add is an improved, high-performance video player module built around the poly~ object. This will allow us to load a folder full of videos and switch between them quickly and efficiently. The other module we will add is a simple recording module to capture our experiments. Since we are using OpenGL texture processing to manipulate the video, it is a little more complicated than just using jit.qt.record, but not by much.
    • Download the patches used in this tutorial.
    • Download all the patches in the series updated for Max 7. Note: Some of the features in this system rely on the QuickTime engine; the AVF and VIDDLL video engines do not support all of them. With either engine you may see errors related to loadram or have limited access to file-recording features.

    A Better Video Player

    Anyone who uses Jitter for video playback for long enough ends up building a poly~- or JavaScript-based movie-player bank. The implementation we will look at here is based on one that I have used numerous times in performance situations. The reason for building such a module has to do with the drop in frame rate that can happen when you send the read message to jit.qt.movie. By preloading a whole mess of jit.qt.movie objects with different videos, you can bypass the lag caused by loading a new file from disk. To do this, we will make a poly~ abstraction that includes a jit.qt.movie object and several messages to control the movies.
    To see what this abstraction looks like, open up the vps.pmovie file. This patch provides an interface to the jit.qt.movie object through various messages. When the "file" message is sent, it loads the specified file using an "asyncread" message. Once the file is loaded, it triggers a "getduration" message, which allows us to calculate times for the "loadram" message. As it is currently set up, we automatically load the first and last 2 seconds of each video into RAM (assuming a 600 timescale). This helps prevent the frame-rate drop that often happens when a video loops back to the beginning.
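    To make the preload arithmetic concrete, here is a rough sketch in Python rather than Max (`loadram_ranges` is a hypothetical helper, not actual patch code) of how the start times and lengths for the two "loadram" chunks could be computed from the duration reported by "getduration", assuming the 600 timescale mentioned above:

```python
def loadram_ranges(duration, seconds=2, timescale=600):
    """Compute (start, length) pairs, in timescale units, for
    preloading the first and last `seconds` of a movie into RAM.
    `duration` is the movie length in the same units, as reported
    by jit.qt.movie in response to a getduration message."""
    chunk = min(seconds * timescale, duration)  # clamp for very short movies
    head = (0, chunk)                           # first two seconds
    tail = (max(0, duration - chunk), chunk)    # last two seconds
    return [head, tail]

# A 10-second movie at timescale 600 has a duration of 6000 units:
print(loadram_ranges(6000))  # -> [(0, 1200), (4800, 1200)]
```

    Each (start, length) pair would become the arguments of one "loadram" message; the clamp just keeps the ranges sane for movies shorter than the preload window.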
    To use our handy poly~ abstraction, we need a host patch that interacts with it in the right way. To see how that works, check out the vps.mbank file. Without getting too far into the details of how this patch functions, let's look at the basic idea. Clicking the "Load" button triggers an Open... dialog that allows you to choose a folder. Once a folder is chosen, its filepath is set as the "prefix" for two umenus that are set to autopopulate. The umenu on the right will be used to load each instance of our poly~ abstraction; the one on the left will be our performance control for choosing which movie to play. The subpatch called "populations" creates the correct number of poly~ "voices" and then iterates through the file list, loading a movie into each voice. Finally, the "menu-controls" subpatch provides an interface for stepping through the umenu using triggers. All of this is hosted inside a bpatcher in our main VPS-4 patch so that we can interact with the controls without having to see the entire patch.
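    The host-patch logic above can be sketched outside Max; in this Python illustration, `populate_voices` and `step_index` are hypothetical names, though the "voices" and "target" messages they emit are the real poly~ messages for setting the voice count and addressing one instance:

```python
def populate_voices(files):
    """Build the message sequence the host patch effectively sends:
    set the poly~ voice count, then address each voice in turn with
    a 'target' message followed by a 'file' message for vps.pmovie."""
    msgs = [("voices", len(files))]
    for voice, name in enumerate(files, start=1):
        msgs.append(("target", voice))  # route to one poly~ instance
        msgs.append(("file", name))     # tell that instance what to load
    return msgs

def step_index(current, count):
    """Advance a umenu selection by one, wrapping at the end, like
    the trigger-driven stepping in the 'menu-controls' subpatch."""
    return (current + 1) % count

print(populate_voices(["a.mov", "b.mov"]))
# -> [('voices', 2), ('target', 1), ('file', 'a.mov'),
#     ('target', 2), ('file', 'b.mov')]
```

    The key idea is simply that each file in the umenu's list gets its own preloaded jit.qt.movie instance, so switching movies is a matter of choosing a voice rather than reading from disk.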
    Obviously, this module might not do exactly what you want, but it is fairly easy to add more movie controls by altering the vps.pmovie patch. This should provide the basics that you can build on.

    Keeping a Record

    Eventually, you'll get to a point where you really like some of the video coming out of your video processing patch and want to record some of it to disk. Since the patch began as a live performance tool, there hasn't been much discussion of recording to disk. However, since this is Jitter, we can always add another module. One caveat I must mention is that it will be difficult to get full frame rates while recording the output of this patch to disk, especially since we are doing most of the processing on the GPU. That said, you can often get perfectly acceptable results if sync is not of vital importance.
    In order to get our video from the GPU back into CPU-bound matrix territory, we'll need a way to rasterize the texture. This is done easily with the jit.gl.asyncread object, which reads the contents of an OpenGL scene asynchronously to provide a rasterized image matrix. We can then connect that to jit.qt.record to record our video. As an added bonus, I've stuck in a simple interface for choosing the video codec and frame rate for recording.
    That concludes this installment of the Video Processing System. Stay tuned for more...

    by Andrew Benson on
    Sep 23, 2009 7:27 PM

    • drvortex
      Jan 01 2010 | 12:57 am
      Hi, I need help. I would like to process images before they get combined. I got a GL blur to work, but when I tried to apply it to your parts before they were combined, it did not work. Thanks, David
    • Andrew Benson
      Jan 06 2010 | 5:22 pm
      Just make sure you add your jit.gl.slab objects after the 'texture-conversion' to make sure you aren't still in uyvy colormode. If you are still having trouble let's continue the discussion on the forum.
    • jfenwick
      Jan 17 2010 | 12:22 am
      This patch looks really great, but I'm having a problem with it. When I click the Load button I am unable to select any movies.
    • Andrew Benson
      Jan 19 2010 | 5:18 pm
      You will need to select a folder with several movies in it. If you have further trouble, let's continue the conversation on the Jitter forum.
    • Jorgen Teller
      Jan 26 2010 | 3:40 pm
      Does this also work with jit.grab?
    • Andrew Benson
      Jan 27 2010 | 12:54 am
      Hi Jorgen, the device includes a jit.qt.grab-based camera input section. On Windows this can be replaced with jit.dx.grab for greater compatibility.
    • gio
      Mar 01 2010 | 8:53 pm
      Hey Andrew, I've been reading almost all your articles on the site lately. Thanks, the amount of awesome things you are sharing is really amazing. I have a doubt here: what if, after all this processing done with the shaders on the GPU, I want to insert something like your Debris example (Recipe 27 ;-) in the chain? Is moving back to the CPU going to kill the fps? The use of matrix operations and shaders in the same processing chain is something I haven't understood yet. Thanks.
    • VJ Fader
      Feb 05 2011 | 2:55 pm
      Hi Andrew:
      I've been having trouble compiling this example into a standalone on PC, wondering if you can help. I just created a thread on the forum, here is the link: https://cycling74.com/forums/can-not-compile-vps-4-example-patch-into-standalone-app-in-windows
      thanks in advance.
    • zhiwan
      Apr 25 2011 | 10:38 am
      Hey Andrew,
      This tutorial is very cool. One thing I noticed is that when I load a movie in your patch, I still notice a drop in framerate when the movie loops back around. Why is that? Is there something wrong with my movie?
    • efw
      Oct 02 2011 | 5:17 am
      For a project we want to connect an infrared camera with motion detectors and light. Could anyone point me in a direction? Our teacher told us to look into Max/MSP, but we don't know where to start. Thank you.
    • crasszorro
      Jan 21 2012 | 10:44 am
      Hi Andrew,
      I'm very new to Max and I'm working specifically with M4L. Will the VPS patches you have provided work in M4L? If so, is it necessary for me to copy the contents of the patches into a new M4L device, paste them, and save them? And if so, is it necessary for me to add inputs and outputs to what you have already built? I really want to use your devices; any direction you could give me would be greatly appreciated.
    • loadmess
      Jan 28 2012 | 10:59 pm
      Hi, is there a way to capture the picture of several QuickTime movies inside poly~?
      Great tutorial, thanks!
      best, diogo
    • loadmess
      Jan 29 2012 | 5:27 am
      Sorry, I should have added more details to the previous question, just to make sure. So imagine several jit.qt.movie objects playing at the same time inside one poly~: how would I mix or sum the output of each active instance?
      thanks, dc
    • johannes
      Sep 15 2013 | 9:56 pm
      Is anybody else having the problem that a little annoying text window appears when sending fullscreen 1 to jit.window? It is described at the end of this topic, too:
      With the vps-4 patches I can reproduce this every time, so maybe someone around here knows how to get rid of this little floating window. I can't figure out why it is there...
      best, johannes
      here is a workaround: [embedded Max patch; copy it, then in Max select New From Clipboard]
    • Gary Lee Nelson
      Oct 09 2015 | 6:32 pm
      This is almost 100% clear and very, very helpful. For some reason all of my images, including the recordings, have small vertical stripes, rather like corduroy.
      Is there an example that shows how to use a poly~ to play several movies at once?