Syphon for Jitter available.


    Nov 03 2010 | 9:59 pm
    Hi.
    I'm really pleased to announce the first public beta of Syphon, which lets various audio/video environments share video in real time. We have implementations for Jitter, Quartz Composer, FreeFrameGL and Unity Pro, and know of some exciting announcements from commercial app developers who are integrating Syphon into new products and designing products around it.
    Syphon for Jitter will allow you to push matrices and jit.gl.textures to any supported Syphon application, or receive textures from any Syphon server on your system. This means Jitter, Unity, Quartz Composer hosts, FFGL hosts, and other apps and implementations as they become available.
    Think of it as a send / receive pair across applications for video, or 'Soundflower for video' if you like.
    Just to give an idea of performance, I am able to share 4K frames (jit.noise 1 4096 2048) to Quartz Composer and test applications (simultaneously) at over 40fps, on the GPU, on my laptop.
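    For anyone wondering what that looks like in a patch, here is a rough sketch of the send and receive chains (the object names are the jit.gl.syphonserver / jit.gl.syphonclient externals from this beta; the context name and server names are placeholders):

```
// send: publish a Jitter texture as a Syphon server
jit.gl.texture ctx @name tex
  "jit_gl_texture tex" --> jit.gl.syphonserver ctx @servername "Max Out"

// receive: pull any running Syphon server into Jitter as a texture
jit.gl.syphonclient ctx @appname "Simple Server" @servername "Simple Server"
  --> jit.gl.videoplane ctx @transform_reset 2
```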
    Thanks.

    • Nov 03 2010 | 10:09 pm
      Thanks for making this available to everyone Anton.
      I've been fortunate enough to test this with Jitter and Quartz Composer; it does exactly what it's supposed to (and very well!)
      Three cheers for the Syphon team!
    • Nov 04 2010 | 12:54 am
      Thank you… it will open up so many new possibilities. It makes my day.
    • Nov 04 2010 | 8:29 pm
      Really great!! thanx a lot vade and Tom.
    • Nov 06 2010 | 4:19 am
      Very exciting! Looking forward to checking this out.
    • Nov 08 2010 | 5:50 pm
      no way
      thanks!!!
    • Nov 08 2010 | 9:06 pm
      We just got a nice OpenFrameworks bridge working too, and Cinder is on the way. :) Should be fun.
    • Nov 11 2010 | 12:19 pm
      thanx vade and syphon team for this great work !
      I have one question and one suggestion.
      The question: is it possible to share a GL output between two separate GPUs on a Mac Pro?
      The suggestion: it would be really great if someone could write a virtual QuickTime video input component that uses the Syphon server content.
      That way we could use every application that accepts a QuickTime video input with a single extension (like Modul8, Resolume Avenue, GrandVJ ...).
      best, JATom
    • Nov 11 2010 | 7:22 pm
      Modul8 works for receiving via Quartz Composer if you use the KinemeCore 'safe' loading hack. Search the Garagecube forums. Resolume Avenue works now with the FFGL plugin. GrandVJ works now with the Quartz Composer plugin, last I checked.
      Multiple GPUs should work; IOSurface (the API underneath Syphon) should handle it auto-magically, assuming both GPUs are active (have displays enabled, known as "online"). Otherwise both host applications need special pixel format specifications to use 'offline renderers'. Have you tried it?
      As far as a QT component goes, I've looked into it; it's messy, and I *suspect* things will be changed in 10.7 anyway. I'm more than happy to let someone else tackle QT 7 video digitizers. Also, the digitizer would circumvent the GPU, rendering the speed advantages of Syphon mostly moot.
    • Nov 17 2010 | 4:58 pm
      Fantastic news!
      Thanks, Vade!
      Marie-Hélène
    • Nov 18 2010 | 12:25 am
      Just what I needed!
      Thank you, Vade!
    • Aug 30 2012 | 4:38 pm
      Hey folks, I noticed that this thread is a little old. I am exploring the possibility of using Cinder and Jitter together. Does Syphon support Cinder? Is there a better way to use Cinder in Max (i.e. a Cinder external)?
    • Aug 30 2012 | 9:44 pm
      By the way: are there plans or any concepts for Windows?
    • Oct 30 2015 | 7:11 am
      where do I download the jitter stuff for Syphon?
    • Mar 23 2016 | 9:04 am
      Hello, could you please tell me how I can change the Syphon resolution?
      Let me explain: I want to use a jit.window as my video preview (to check everything is OK) at 800x600, but I want full resolution for my second screen/video projector, or to simply change its resolution to 1920x1080. How can this be done?
      Thank you very much.
    • Mar 23 2016 | 4:36 pm
      Capture using jit.gl.node @capture 1 @adapt 0 @dim 1920 1080 (or the highest resolution you want to display). For the lower resolution, simply send this through a jit.gl.texture @adapt 0 @dim 800 600 before sending to Syphon.
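      One way to lay that advice out, as a rough patch outline (the context name is a placeholder):

```
jit.gl.render ctx                                    // scene rendered at full resolution
jit.gl.node ctx @capture 1 @adapt 0 @dim 1920 1080   // captures the scene to a texture
  | texture (2nd outlet)
  +--> full-resolution destination (Syphon server / second screen) at 1920x1080
  +--> jit.gl.texture ctx @adapt 0 @dim 800 600      // downscaled copy
         --> jit.gl.videoplane ctx                   // preview in the 800x600 jit.window
```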
    • Mar 23 2016 | 6:10 pm
      Thank you for teaching me the jit.gl.node.
      But what I wanted to do is have everything calculated in HD, while the jit.window is displayed on my first (computer) screen in a smaller box (every time I stretch the box, it resizes the Syphon output, of course).
      My problem is: I use a jit.window on my computer to see what I'm doing, and I send it via Syphon into MadMapper for output (second screen/video projector).
      BUT when I use 1920x1080 (or sometimes 2400x600), I get a huge jit.window and can't see everything I'm doing (of course). So I'd like a smaller box on my first screen, but without changing the HD resolution.
      Hum. Hum. :-D
      Anyway, thanks for the quick reply.
      My best,
      Hieros.
    • Jan 23 2017 | 3:44 pm
      Hi guys!
      I'm not good at this Max/MSP/Jitter stuff. I've been trying to figure this out and I simply can't. I hope you wizards can help me.
      I would like to record the final jit.gl.render output into a .mov file with sound through Syphon Recorder. How do I do this?! I need someone to kindly explain this task to me step by step, because I literally have half a brain.
      I have several questions, so here we go:
      I have the Syphon server/client/recorder ready to go. I've been successful at connecting the jit.gl.node second outlet to a mesh in other examples and seeing it in Syphon Recorder. But this time, I want to record the whole output of jit.gl.render. I am attaching a patch made by Sem Schreuder. I have attached the Syphon object, but it is not connected, because I've tried all the options my brain could think of.
      My questions are:
      1. How do I get a .mov file with sound?
      2. Do I need a jit.gl.slab to flip the image? I do not know Java and do not know how to code.
      3. How do I record a .mov file from jit.gl.render? As far as I understand, the jit.gl.node second outlet only works with 3D objects. But what if the output I want to record from is not a 3D object? This patch is complex (for me), and if I connect the jit.gl.node second outlet to the jit.gl.gridshape, it skips the jit.gl.texture, and I am sure it skips a few other objects.
      This question is not about this patch, but it happens with Syphon:
      4. Sometimes, when I connect the jit.gl.node second outlet to jit.gl.mesh, I lose the output in jit.window, but I can see it in the Syphon output. Why?
      I'm sorry, I know these are a lot of questions but otherwise I won't learn.
      Thank you very much!
    • May 11 2017 | 3:01 pm
      I have some of these and other questions about Syphon as well: the output disappears at times, and in general, adding Syphon to my patch has lowered the resolution / created a jittery image / more jagged lines, etc. (even with a high-res @dim, etc.).