Getting streaming video from the internet into Jitter
I would like to open/read video streams from the internet and put them into Jitter.
I was planning to use Python or Java for this, but I'm having trouble reading the initial streams, let alone converting them into Max lists/Jitter matrices.
Does anyone have ideas about libraries for Jitter, or for Python/Java, that could accomplish this?
The easiest approach might be to use RTSP streams and open them with [jit.qt.movie]. With [jit.desktop] you can capture portions of the screen and convert them into Jitter matrices.
This method won’t quite work for me, since I’m planning to use YouTube, which doesn’t use RTSP, and I don’t want to have other windows open for [jit.desktop].
Right now I’m planning to use the Python bindings for VLC Media Player (libvlc) to open YouTube streams, which I then plan to render to a [jit.gl.render] object.
If anyone thinks this will or will not work, please let me know!
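For reference, here is a minimal sketch of the playback side (untested; it assumes the python-vlc bindings are installed, and that the YouTube page URL has already been resolved to a direct media URL by some other tool, since libvlc won't do that by itself):

```python
# Sketch: open a direct stream URL with the python-vlc bindings for
# libvlc.  Getting decoded frames out of the player and into a Jitter
# matrix is a separate (and harder) problem not shown here.
try:
    import vlc  # python-vlc bindings (pip install python-vlc) -- assumed available
except ImportError:
    vlc = None

def play_stream(url):
    """Open `url` in a libvlc media player and start playback."""
    if vlc is None:
        raise RuntimeError("python-vlc is not installed")
    instance = vlc.Instance()
    player = instance.media_player_new()
    player.set_media(instance.media_new(url))
    player.play()
    return player
```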
I’m just starting to look into streaming from YouTube to max myself. Did this method work for you in the end?
Any other tips?
I did this by downloading the videos with clive, from Python inside Max, and then transcoding them with ffmpeg. The whole thing was just a Python script inside the [py] object. It didn't stream, mind you: just download, transcode, and play. Might work for you.
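Roughly, the script looked like this (a reconstructed sketch, not the exact code: clive's output filename and the ffmpeg codec settings below are placeholders, and both tools are assumed to be on your PATH):

```python
import subprocess

def transcode_cmd(src, dst):
    """Build an ffmpeg command that re-encodes `src` into a
    Jitter-friendly file (the MJPEG settings here are illustrative)."""
    return ["ffmpeg", "-y", "-i", src, "-c:v", "mjpeg", "-q:v", "3", "-an", dst]

def download_and_transcode(url, out_path):
    """Fetch a video with clive, then transcode it with ffmpeg."""
    subprocess.check_call(["clive", url])  # downloads into the cwd
    # clive names the file after the video title; a real script would
    # capture that name.  "downloaded.flv" is a placeholder.
    subprocess.check_call(transcode_cmd("downloaded.flv", out_path))
```

In the patch, [py] just ran this on a bang and then told [jit.qt.movie] to read the transcoded file.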
Any updates on this from anyone?
An alternative would be to use another computer for streaming and send its output to a capture card inside (or attached via USB to) your computer, where you can use [jit.qt.grab].
Or, if you find a way to teach VLC to pass its video texture on to Syphon (if you are on OS X), then you are set.
Or use https://github.com/caprica/vlcj, a Java wrapper for libvlc, and somehow (I haven't looked closely enough at the API) catch the actual frame and send it via Syphon (http://code.google.com/p/syphon-implementations/source/browse/#svn%2Ftrunk%2FSyphon%20Implementations%2FJSyphon) to your Max patch. The tricky bit is capturing the VLC stream frame.