I want to build an AV sequencer - help!


    Jun 13 2018 | 6:45 pm
    Hi!
    I want to build an AV sequencer that can play about 50 video files at once (using alphablend). The movies are short animations that appear mapped on the screen and form a composition. These 50 files belong to 1 set, and I have more sets. I want to be able to change between sets instantly.
    How would I load and play such a large number of videos?
    I was thinking about using the HAP object and video textures.
    Can anyone point me in the right direction? Thanks!

    • Jun 14 2018 | 3:27 pm
      50 simultaneous clips sounds like a lot. i don't know if hap would help in this case, but maybe. my recommendation to try first is to read all the clip frames into memory, using either jit.matrixset or jit.movie @engine viddll with the loadram message.
      the clips should all be exported at the dimensions you expect them to be displayed at. i would implement this in javascript for the most ease and flexibility while experimenting.
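      as a rough starting sketch for a [js] object -- the folder path and the clip_1.mov ... clip_50.mov file naming are placeholders you'd adapt, not anything fixed:

        autowatch = 1;

        var NUM_CLIPS = 50;
        var players = [];

        // script the 50 jit.movie players into the patcher once
        function build() {
            for (var i = 0; i < NUM_CLIPS; i++) {
                var m = this.patcher.newdefault(20 + (i % 10) * 120,
                                                20 + Math.floor(i / 10) * 60,
                                                "jit.movie", "@engine", "viddll", "@loop", 1);
                players.push(m);
            }
        }

        // e.g. the message "loadset /path/to/setA" reads one whole set into memory
        function loadset(folder) {
            for (var i = 0; i < players.length; i++) {
                players[i].message("read", folder + "/clip_" + (i + 1) + ".mov");  // read the file
                players[i].message("loadram");                                     // pull its frames into ram
            }
        }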
    • Jun 14 2018 | 7:35 pm
      Hi Rob. Thanks for your answer! I have no experience with JavaScript yet. PS: I'm working on a Mac. Do I still use the viddll engine then?
    • Jun 14 2018 | 10:44 pm
      if you're dealing with 50 videos to form a composition, they could probably be super super small? I would load the videos inside jit.movie objects in 50 instances of a [poly~ 50], and then you load your new "set" by sending the poly~ the message [patchername blablabla]
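      a minimal sketch of the set-switching part, assuming the [poly~] gets the scripting name "videopoly" and each set is saved as its own voice patcher (set_a.maxpat, set_b.maxpat, ... -- all placeholder names):

        autowatch = 1;

        // send this [js] the message "loadset set_b" to swap every voice patcher
        function loadset(patchname) {
            var p = this.patcher.getnamed("videopoly");   // the [poly~ ... 50] object
            if (p) {
                p.message("patchername", patchname);      // poly~ reloads all 50 voices from the new patcher
            } else {
                post("no object named videopoly found\n");
            }
        }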
    • Aug 15 2018 | 10:39 pm
      If I use poly~, can all the files be different? (I have 50 unique movie clips)
    • Aug 17 2018 | 7:14 pm
      sure, use thispoly~ to get the instance number and load different videos
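      for example, inside the voice patcher you could bang a [thispoly~] from a [loadbang] and feed its instance number into a small [js] like this (the folder path and clip_N.mov naming are placeholders):

        autowatch = 1;
        outlets = 1;

        var FOLDER = "/path/to/current_set";   // placeholder path

        // thispoly~ sends the voice number (1-50) here when banged
        function msg_int(instance) {
            outlet(0, "read", FOLDER + "/clip_" + instance + ".mov");   // goes to [jit.movie]
        }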
    • Aug 22 2018 | 9:59 pm
      I had this idea: instead of loading all 50 videos per set / 150 videos total at once, I can also use a movie file that has several clips edited back to back. With the frame message I can play one clip in the movie, and change the frame count when I want to go to the next clip. I use one global qmetro object for timing.
      Does this sound like a possible solution? I hope this method of playback is efficient. I think I'll use the viddll engine or HAP. Note: all clips will have transparent (alpha channel) backgrounds. Note 2: the final patch will have about 50 movie players.
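      Roughly what I have in mind for the frame arithmetic, assuming every clip in the compiled file has the same length (the CLIP_FRAMES value and the 0-based clip index are placeholders):

        autowatch = 1;
        outlets = 1;

        var CLIP_FRAMES = 75;   // e.g. 3 seconds at 25 fps -- replace with the real clip length

        // "clip 4" jumps the connected [jit.movie] to the start of the 5th clip
        function clip(index) {
            outlet(0, "frame", index * CLIP_FRAMES);   // jit.movie's frame message
        }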
    • Aug 23 2018 | 7:32 am
      sure, that could work too! you'll still need 50 jit.movie objects to play your 50 videos simultaneously, so this would mainly affect your loading time. if you're dealing with a big, long video file, make sure you're using the hap codec and playing from an ssd drive!
    • Aug 23 2018 | 7:58 am
      yeah, we have an SSD drive and HAP. I tested it with more videos before. That works well until it doesn't (Max crashed). But now I only have to load about 50 instead of 150 :) , so hopefully this will help. The videos play in a sequencer, so not all at once (but it can get fast and chaotic when all buttons are pressed).
    • Aug 23 2018 | 8:01 am
      you could try the @unique 1 attribute on your jit.movie objects and output into an OpenGL context (jit.world)
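      as a rough scripted sketch of what one player chain could look like -- the "seq" context name, @output_texture 1 and a [jit.gl.videoplane] with @blend_enable 1 for the transparent backgrounds are assumptions, not something tested here:

        autowatch = 1;

        function build() {
            var p = this.patcher;
            // render context / window
            var world = p.newdefault(20, 20, "jit.world", "seq");
            // only output when a new frame has arrived, and keep it on the gpu as a texture
            var movie = p.newdefault(20, 80, "jit.movie", "@engine", "viddll",
                                     "@unique", 1, "@output_texture", 1);
            // textured plane in the "seq" context, blending enabled for the alpha channels
            var plane = p.newdefault(20, 140, "jit.gl.videoplane", "seq", "@blend_enable", 1);
            p.connect(movie, 0, plane, 0);
            // the jit.movie objects would still be banged from the global qmetro
        }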