The idea is:
Since you can load a video in Live (exactly as you load an audio file), I am trying to route this video signal into a patch (which is loaded in the video's track).
The goal is to pass this video signal through the video-analysis patches that I own (real-time black-and-white proportions, etc.), in order to control a Live API parameter.
I'm doing that with a jit.qt.movie, but it would be much more efficient to directly capture the video loaded in Live (sync, play/start, jumping to logical points in my project...).
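To make it concrete, here is a rough sketch of the chain I have in mind (not a working patch; jit.rgb2luma and jit.3m are the Jitter objects I believe handle greyscale conversion and per-frame matrix statistics, and the scaling values are just an example):

```
jit.qt.movie 320 240   ; loads and plays the movie file
      |
jit.rgb2luma           ; RGB -> greyscale (luminance) matrix
      |
jit.3m                 ; per-frame min / mean / max of the matrix
      |
scale 0. 255. 0. 1.    ; map mean luminance into the parameter range
      |
live.object            ; "set value $1" on the target Live API parameter
```

The missing piece is the first object: something that would receive the video already loaded in Live's track instead of jit.qt.movie opening the file a second time.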
Maybe it's just a basic object, or maybe not... I have never used Jitter.
Many thanks for your help.