In this installment of the Video Processing System, we're going to tackle two big hurdles that Jitter users often find themselves coming up against. The first addition is an improved, high-performance video player module built around the poly~ object. This will allow us to load a folder full of videos and switch between them quickly and efficiently. The second is a simple recording module to capture our experiments. Since we are using OpenGL texture processing to manipulate the video, recording is a little more complicated than just using jit.qt.record, but not by much.
- Download the patches used in this tutorial.
- Download all the patches in the series updated for Max 7. Note: Some of the features in this system rely on the QuickTime engine; the AVF and VIDDLL video engines do not support all of them. On either engine you may see errors related to loadram, and access to the file-recording features may be limited.
To use our handy poly~ abstraction, we need a host patch that interacts with it in the right way. To see how that works, check out the vps.mbank file. Without getting too far into the details of how this patch functions, let's look at the basic idea. Clicking the "Load" button triggers an Open... dialog that allows you to choose a folder. Once a folder is chosen, its filepath is set as the "prefix" for two umenus that are set to autopopulate. The umenu on the right will be used to load each instance of our poly~ abstraction. The one on the left will be our performance control for choosing which movie to play.

The subpatch called "populations" creates the correct number of poly~ "voices" and then iterates through the file list, loading a movie into each voice. Finally, the "menu-controls" subpatch provides an interface for stepping through the umenu using triggers. All of this is hosted inside a bpatcher in our main VPS-4 patch, so that we can interact with the controls without having to see the entire patch.
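To make the population step concrete, here is a minimal sketch in plain JavaScript (runnable outside Max) of the logic the "populations" subpatch performs with Max objects. The function name is hypothetical; the message names (voices, target, read) are the standard poly~ and jit.qt.movie messages, but the exact arguments your patch sends may differ:

```javascript
// Given a folder's file list, build the message sequence you would send to
// poly~: first "voices <n>" to allocate one instance per movie, then a
// "target <i>" / "read <file>" pair to load each movie into its own voice.
function buildPolyMessages(files) {
  const messages = [["voices", files.length]];
  files.forEach((file, i) => {
    messages.push(["target", i + 1]); // poly~ voices are 1-indexed; target 0 would address all
    messages.push(["read", file]);    // forwarded to the jit.qt.movie inside that voice
  });
  return messages;
}

// Example: two movies found in the chosen folder
console.log(buildPolyMessages(["clip1.mov", "clip2.mov"]));
```

In the actual patch this sequencing is done with uzi, counter, and message objects rather than code, but the order of operations is the same: allocate voices first, then target and load one voice at a time.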
Obviously, this module might not do exactly what you want, but it is fairly easy to add more movie controls by altering the vps.pmovie patch. This should give you a foundation to build on.
Eventually, you'll get to a point where you really like some of the video coming out of your processing patch and want to record it to disk. Since the patch began as a live performance tool, there hasn't been much discussion of recording. However, since this is Jitter, we can always add another module. One caveat I must mention: it will be difficult to get full frame rates while recording the output of this patch to disk, especially since we are doing most of the processing on the GPU. That said, you can often get perfectly acceptable results if sync is not of vital importance.
In order to get our video from the GPU back into CPU-bound matrix territory, we'll need a way to rasterize the texture. This is done easily using the jit.gl.asyncread object, which reads the contents of the OpenGL scene back from the GPU asynchronously, providing a rasterized image matrix without stalling the render loop while it waits for the transfer. We can then connect that to jit.qt.record to write our video to disk. As an added bonus, I've included a simple interface for choosing the video codec and frame rate for recording.
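The recording chain described above is just a few objects patched in series. A rough outline (the context name `ctx` and the bang-per-frame driver are illustrative, not the exact contents of the patch):

```
qmetro 33                      <- bang once per frame to request a readback
      |
jit.gl.asyncread ctx           <- pulls the rendered frame off the GPU as a jit_matrix
      |
jit.qt.record                  <- appends each incoming matrix to the movie file on disk
```

The asyncread object is attached to the same named drawing context as the rest of the OpenGL chain, so whatever the final texture processing stage renders is what lands in the recorded file.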
That concludes this installment of the Video Processing System. Stay tuned for more...