Getting streaming video from the internet into Jitter


    Apr 22 2011 | 9:51 pm
    I would like to open/read video streams from the internet and put them into Jitter.
    I was planning to use Python or Java for this, but I'm having trouble even reading the initial streams, let alone converting them into Max lists or Jitter matrices.
    Does anyone know of libraries for Jitter, Python, or Java that could do this?

    • Apr 23 2011 | 4:24 pm
      The easiest might be to use an RTSP stream and open it with [jit.qt.movie]. With [jit.desktop] you can grab parts of the screen and convert them into Jitter matrices.
      Jan
    • Apr 26 2011 | 7:49 pm
      This method won't quite work for me, since I'm planning to use YouTube, which doesn't use RTSP, and I don't want to have other windows open for [jit.desktop].
      Right now I'm planning to use the Python bindings for VLC Media Player (libvlc) to open YouTube streams, which I then plan to render to a [jit.gl.render] object.
      If anyone thinks this will or won't work, please let me know!
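      Roughly what I have in mind, as a minimal (untested) sketch: it assumes the python-vlc bindings for libvlc are installed (pip install python-vlc) and that the YouTube page URL has already been resolved to a direct media URL (placeholder below). It only opens and plays the stream; getting the decoded frames into a Jitter matrix is the separate, harder step.

        import time
        import vlc  # python-vlc bindings for libvlc

        # placeholder; swap in a direct stream/media URL
        STREAM_URL = "http://example.com/stream.mp4"

        player = vlc.MediaPlayer(STREAM_URL)
        player.play()

        time.sleep(2)              # give libvlc a moment to open the stream
        print(player.get_state())  # e.g. State.Playing if it opened OK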
    • Feb 10 2012 | 3:32 am
      Hey there,
      I'm just starting to look into streaming from YouTube to Max myself. Did this method work for you in the end?
      Any other tips?
      Thanks
      Si.
    • Feb 10 2012 | 1:46 pm
      I did this by downloading the videos with clive from Python inside Max, then transcoding with ffmpeg. The whole thing was just a Python script inside the [py] object. It didn't stream, mind you, just download, transcode, and play. Might work for you.
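      Something along these lines (the URL and filenames are placeholders, the exact clive options vary by version, and the ffmpeg settings depend on what [jit.qt.movie] plays back comfortably):

        import subprocess

        url = "http://www.youtube.com/watch?v=XXXXXXXXXXX"  # placeholder

        # grab the clip with clive (exact options vary by clive version)
        subprocess.check_call(["clive", url])

        # transcode the downloaded file to something QuickTime-friendly
        subprocess.check_call([
            "ffmpeg", "-i", "downloaded.flv",   # whatever clive saved
            "-vcodec", "mjpeg", "-q:v", "3",
            "clip.mov",
        ])
        # then send "read clip.mov" to [jit.qt.movie] from the patch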
    • Sep 05 2013 | 10:33 am
      any updates on this from anyone?
    • Sep 08 2013 | 11:01 am
      an alternative would be to use another computer for the streaming and send its output to a capture card inside (or attached via USB to) your computer, where you can use [jit.qt.grab].
    • Sep 08 2013 | 11:27 am
      or, if you find a way to teach VLC to pass its video texture on to Syphon (if you are on OS X), then you are set.
      Or there's https://github.com/caprica/vlcj, a Java wrapper for libvlc: somehow (I haven't looked closely enough at the API) catch the actual frame and send it via Syphon (the JSyphon implementation, http://code.google.com/p/syphon-implementations/source/browse/#svn%2Ftrunk%2FSyphon%20Implementations%2FJSyphon) to your Max patch. The tricky bit is capturing the VLC stream frame.
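      If the Java/Syphon route is too much work, a cruder Python-side workaround for getting frames out of libvlc is its snapshot call, which just writes the current frame to disk where the patch can pick it up. A rough sketch, assuming python-vlc is installed; the URL and path are placeholders:

        import time
        import vlc

        player = vlc.MediaPlayer("http://example.com/stream.mp4")  # placeholder URL
        player.play()
        time.sleep(2)  # let the stream open

        # width/height of 0 keeps the stream's native size
        player.video_take_snapshot(0, "/tmp/frame.png", 0, 0)
        # then read the file back into Jitter from the patch (e.g. with [jit.qt.movie])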
    • Mar 02 2015 | 4:45 am
      Did anyone get this to work with YouTube? I'm trying to capture multiple live YouTube events into Jitter and then send them out via Syphon to other apps.