using Max/Jitter for a multi-screen video wall
I am wondering if Max/Jitter would be the proper tool for displaying content on a video wall made of 6 plasma screens of 52-inch diagonal each. I need to send various outputs which will vary during the presentation – a different visual image on each screen, a single visual spanning all screens as a whole, a visual that moves between screens, and possibly other variations.
I am thinking of using my MacBook Pro Core Duo 2.5 GHz, which has dual-display capability. Will that be enough? Do I need other hardware? I'd like the signal I am sending to the screens to use the highest resolution they are capable of. I will be using videos shot at full HD 1920x1080p as the content to work with in Max. There will be minor visual manipulation of this content, and no audio at all.
thanks a lot
I have never used Max/MSP/Jitter, but I have worked extensively with the good old, now discontinued, KeyWorx.
I think you should take a look at the GL objects inside Jitter, in particular jit.gl.videoplane and jit.gl.render. You can try to use a shared OpenGL context and spread the resulting image using a splitter like the Matrox TripleHead2Go. However, I am not sure if you can drive 6 different outputs this way.
For the effects, take a look at the shaders that Max includes in /Max5/Cycling ’74/jitter-shaders/ and play with them using jit.gl.slab together with jit.gl.videoplane.
Finally, I am not sure if an MBP with the specs you describe can manage that many HD movies. Maybe reduce the resolution or use a suitable codec like ProRes (do you need your alpha channels? otherwise you can try ProRes Proxy).
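To get a feel for the load, here is a rough back-of-the-envelope check in Python. The per-stream bit rates are approximate ProRes target rates for 1080p at around 30 fps (my assumption, not measured figures), so treat the results as ballpark numbers only:

```python
# Rough aggregate-bandwidth check for decoding several HD streams at once.
# Bit rates below are approximate ProRes targets for 1920x1080 @ ~30 fps.
PRORES_422_MBPS = 147   # approx. target bit rate, ProRes 422 (assumption)
PRORES_PROXY_MBPS = 45  # approx. target bit rate, ProRes Proxy (assumption)

def aggregate_mbps(per_stream_mbps, n_streams):
    """Total read/decode bandwidth for n simultaneous streams, in Mbit/s."""
    return per_stream_mbps * n_streams

print(aggregate_mbps(PRORES_422_MBPS, 6))    # 882
print(aggregate_mbps(PRORES_PROXY_MBPS, 6))  # 270
```

Even in this crude model, six full-rate 422 streams ask a lot more of the disk and decoder than six Proxy streams, which is why the Proxy suggestion matters.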
Hope it helps
Is this kind of thing done with Max at all? Is it a common use? Would Max be the proper tool for it? If not, which tool might be better?
What kind of Mac specs would support such output, and what additional hardware is capable of handling it in a stable manner?
In most cases I would just need to play the videos in different configurations on the 6 flat screens.
What kind of visual or effect limits would I have if I do not need the alpha channels? Would having no alpha channel make the task more manageable?
What kind of codec is ProRes Proxy?
If you want a single station with 6 video outputs you will need to install specific hardware. For instance:
Another option is to use a video splitter like the Matrox TripleHead2Go.
With a correct design, and using shaders as your main tool, it should be possible to optimize the system's performance. However, HD is quite demanding. I have worked with ProRes with rather nice results:
Take a close look at the different versions of the codec; I have worked mostly with 422 and Proxy, with excellent results.
Alpha channels are used to blend the images, for instance when doing transitions between videos:
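As a minimal sketch of what the alpha channel does during such a transition (plain Python, per-pixel "over" compositing; the pixel values are just illustrative, not taken from any real patch):

```python
def alpha_blend(fg, bg, alpha):
    """Per-channel crossfade: result = alpha*fg + (1 - alpha)*bg."""
    return tuple(alpha * f + (1 - alpha) * b for f, b in zip(fg, bg))

# Halfway through a crossfade, a white pixel over a black one:
print(alpha_blend((255, 255, 255), (0, 0, 0), 0.5))  # (127.5, 127.5, 127.5)
```

Without an alpha channel you can still crossfade whole frames this way, but you lose per-pixel transparency baked into the source footage.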
Again, depending on which kind of system you want, you can decide which application is useful for you. For instance, Jitter is (IMHO) really good for manipulating video and customizing visual effects. Quartz Composer and VVVV support multi-screen setups quite well, the former through an application called Quartz Composer Visualizer and the latter through the creation of multiple renderers. Of course, another option is to go for commercial VJ software like VDMX or Resolume.
You could also use multiple computers (6 in your case) connected via a router, and use udpsend/udpreceive to send control signals and command changes to keep the videos in sync. If you need more info, I can send you the patch I use for syncing Max patches.
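The control-signal pattern can be sketched outside Max as well. Here is a minimal Python version of the same idea: one machine sends a cue over UDP, and each playback machine listens for it. The port number and JSON message shape are my own choices for illustration; the actual udpsend/udpreceive objects speak Max messages, not JSON:

```python
import json
import socket

SYNC_PORT = 9000  # hypothetical port; pick any free port on your network

def send_sync(command, host="127.0.0.1"):
    """Send one control message (e.g. which movie and frame to cue)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(json.dumps(command).encode(), (host, SYNC_PORT))
    sock.close()

def receive_sync():
    """Block until one control message arrives; each player runs this loop.
    Bind to "" instead of 127.0.0.1 to listen on all interfaces."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(5.0)  # avoid hanging forever if no cue arrives
    sock.bind(("127.0.0.1", SYNC_PORT))
    data, _ = sock.recvfrom(4096)
    sock.close()
    return json.loads(data.decode())
```

The design choice here is the same as in the Max patch: UDP is fire-and-forget, so it is cheap enough to resend the cue every frame or so rather than worry about a lost packet.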
Are you using the Java classes?
I made a set of standalone apps (called "MultiScreener") for multi-screen synchronization, using the maxhole Java object. The download is here (with source patches included):
NOTE: This doesn’t really answer the original poster’s question since MultiScreener only syncs movies, not dynamically rendered content. But the code might come in handy!