
Soundflower for VIDEO, er, videoflower?


Oct 02 2006 | 7:52 pm

Ableton Live 6 was just released with support for triggering QuickTime video samples. You can even send the video to a second monitor in fullscreen mode.

This leads me to a question: is there any way to route this video to a Jitter matrix instead, so the video being triggered in Ableton could be processed there?

To me that would be oh so cool for my VJ rig, as I incorporate audio sampling in my performance.

It would seem to me that there should be a way, similar to Soundflower, where you "trick" the computer into thinking it's outputting to a video card, and then some sort of QuickTime grab could import it into a Jitter matrix.

But maybe I’m missing something fundamental here? Is it too much to ask of a computer?

Maybe someone’s done this already?

cheers,

Oct 02 2006 | 11:07 pm

This is not going to happen.

What might be effective would be to create a Pluggo plugin which could be triggered in any way you like to send information via OSC or whatever to a Max/MSP/Jitter patch.
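For what it's worth, here is a minimal sketch of the kind of OSC hand-off that suggests, written in Python with the python-osc package purely for illustration. The host, port and OSC addresses are made-up placeholders; the actual Pluggo plugin would of course be a Max patch using something like [udpsend], with the Jitter patch listening on [udpreceive].

```python
# Illustration only (not the Pluggo patch itself): shows the sort of OSC
# messages a plugin could fire at a Max/MSP/Jitter patch listening on
# [udpreceive]. Uses the python-osc package; the host, port and OSC
# addresses below are made-up placeholders.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 7400)    # the patch would listen on this port

# e.g. tell the Jitter patch which clip was triggered, plus a parameter value
client.send_message("/live/clip/launch", 3)    # hypothetical address: clip index
client.send_message("/live/clip/level", 0.8)   # hypothetical address: 0..1 level
```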

-A

Oct 02 2006 | 11:52 pm

Interesting. I’ll have to learn more about Pluggo.

thanks!

Oct 02 2006 | 11:57 pm

Why isn't it gonna happen?

Is it impossible?

Or is it something that Cycling '74 just isn't interested in?

Joe

On 10/2/06, Andrew Pask wrote:
>
> This is not going to happen.
>
> What might be effective would be to create a Pluggo plugin which could be
> triggered in any way you like to send information via OSC or whatever to a
> Max/MSP/Jitter patch.
>
> -A
>

Oct 03 2006 | 12:11 am

The person who wrote Live’s video implementation said to me that in his view it was impossible.

I haven’t played around with the controls and what have you that Ableton provided for this stuff. Is there any of it you couldn’t do in a plugin?

-A

Oct 03 2006 | 3:14 am

Well, I guess it turns out that the video implementation in Live only works in Arrangement mode. I was hoping it might be in Session mode too, so I could trigger the video clips on the fly.

I guess I need to get building that video sampler in Jitter now… ;-)

thanks again for the quick feedback.

cheers,

Oct 03 2006 | 4:14 am

Just a quick note: I have a situation in which I need to split one video
capture between Jitter and other apps.
After seeing DVdriver and SplitCam fail on the Jitter front, I found that
WebcamSplitter does the job beautifully with almost no CPU penalty. I used it
with the Unibrain 1394 camera. Recommended.

There was also the FreeFrame plug Patchbox, but it is discontinued and I
never got it to work.

Oct 08 2006 | 11:38 pm

It’s not impossible.

See Snapz Pro for the Mac. It does real-time 30 fps screen grabs to a QuickTime movie.

That means the screen data is accessible and could be spit out in a variety of ways.

Not that *I’m* gonna code it, but it could be done.
And I’m also not saying it’d be easy :)

Oct 09 2006 | 7:10 am

Naturally you can do a screen grab – see jit.desktop if you want
something like that. But that's not what Soundflower does, nor what
the tool you want would do.

jb


ian
Oct 10 2006 | 12:36 am

Complete guess, but I bet something could be done with something
similar to virtualization. Run an OS under some virtualization, then
instead of showing the screen, pipe that video to a QuickTime
stream or something of the sort.

Although Snapz doesn't do anything as elaborate, if it's fast enough
for you, there could be some way to pipe that data into your app as
well. You might talk to the Snapz people to see if they would
consider making a version of their software that can stream that data
instead of creating a file.

Both of these methods would likely be very hard. But then, I’m not a
programmer.


cw
Oct 10 2006 | 1:16 am

Oooorrr… you could do all of your video within Jitter and just use Live to control it.

Here is one method:

1. Create a buffer~ containing a single cycle from a phasor~ object and export the buffer to AIFF.
2. Use that file as a loop within Live's Session mode and pipe the audio to Max with Soundflower.
3. Assign MIDI to the clip launch of the Session clip and turn on MIDI mirroring (there are instructions about this in the Ableton forums), which will output the MIDI note used to launch the clip upon launch.
4. That MIDI note can be used to differentiate video clips in Jitter, and the phasor output can be used to sync frames.

Also, this allows for mangling your Jitter clips with Live's wonderful warp markers. (A rough sketch of the phase-to-frame mapping follows below.)

nifty huh?
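In case the sync part is unclear, here is a rough sketch of the phase-to-frame math, in plain Python purely for illustration. In the patch itself you would grab the ramp value with something like [snapshot~] and send "frame $1" to jit.qt.movie; the function name and the numbers below are made up.

```python
# Rough sketch of the idea only: the looped single-cycle phasor comes back
# through Soundflower as a 0..1 ramp, so its current value tells you how far
# through the (warped) clip Live is, which maps straight onto a frame index.

def frame_for_phase(phase: float, total_frames: int) -> int:
    """Map a 0..1 phasor value onto a movie frame index."""
    phase = phase % 1.0                          # keep stray values inside 0..1
    return min(int(phase * total_frames), total_frames - 1)

# e.g. a 150-frame clip with the phasor halfway through its cycle -> frame 75
print(frame_for_phase(0.5, 150))
```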

Oct 10 2006 | 2:00 am

The problem with this approach is that all the video the buffer gets
is from the app's output display.

This breaks the utility of being able to send video to and from
any portion of any render chain, which is what I'd like to be able to do.

Plus, Snapz Pro does some nifty OpenGL texture/screen scraping and
buffering, but it uses a decent amount of CPU and does heavy
post-processing when saving out a movie.

You'd need applications specifically coded to use some sort of
video loopback device; I'd imagine this isn't as easy as it sounds.

v a d e //

http://www.vade.info
abstrakt.vade.info

Oct 10 2006 | 2:10 am

Wow, that is most definitely nifty! I'll have to work my way through what you just described. Thanks very much for the clever workaround.

