Hi! I have a few Raspberry Pi computers with camera boards and I would like to stream the video feeds into Jitter. I can get video into Jitter by streaming the signals via RTSP, but there is significant latency and the image quality is poor. Outside of Jitter I can successfully stream video from a Raspberry Pi using nc piped to MPlayer, and it looks great without much latency:
nc -l 5001 | mplayer -fps 31 -cache 1024 -
I also tried the vipr external, but I believe it's only for encoding to various codecs or for receiving Jitter matrices from another computer running Max; I'm not positive. Maybe there is a tweak I need to make to get it to see the raw stream from nc? Anyway, my question is: is there a way to get a raw video feed from nc into Jitter? I think MPlayer is just using the ffmpeg libraries to display the video. Is there a set of tools for Jitter that could do the same thing, or a raw decoder for Jitter that can take a raw TCP video feed and display it? I even tried udpreceive, but I believe it's only for messaging, not video. I would like to avoid RTSP and QuickTime because of the latency and poor video quality.
Anyway, I'm really hoping there is something simple I have missed! Any help or ideas are appreciated.
I've tried the vipr external, but I could not get it to accept the video stream being sent from the Raspberry Pi via nc. I'm not sure whether vipr is designed for 'raw' video streams or whether its decoder only works for streams sent via jit.net.send. The Raspberry Pi just sends raw H.264 video over nc using its built-in camera program. The command on the Pi is:
raspivid -t 0 -o - | nc 10.0.1.4 5001
I just need something in Jitter that can accept the incoming stream on port 5001. MPlayer can accept and display the stream, so maybe there is a way to locally pipe the video from MPlayer into Jitter without extra latency or re-encoding? Agh! So close.
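In case it helps anyone poking at the same setup: here is a minimal Python sketch (my own stand-in, not a Jitter object; port 5001 and the loopback sender are assumptions for the demo) that accepts the same kind of raw TCP byte stream nc delivers. It doesn't decode anything, but it confirms the H.264 bytes are actually arriving before you start blaming the Jitter side:

```python
# Minimal sketch: accept a raw TCP byte stream on port 5001, the way the
# Pi's "raspivid ... | nc" pipeline delivers it. This is NOT a Jitter
# object, just a sanity check that bytes arrive. The loopback sender
# below stands in for the Raspberry Pi.
import socket
import threading

def read_stream(server_sock, chunks):
    """Accept one connection and collect raw chunks until the sender closes."""
    conn, _ = server_sock.accept()
    while True:
        chunk = conn.recv(4096)
        if not chunk:          # sender closed the connection
            break
        chunks.append(chunk)
    conn.close()

# Listen before starting the reader thread so the connect can't race us.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind(("127.0.0.1", 5001))
srv.listen(1)

chunks = []
reader = threading.Thread(target=read_stream, args=(srv, chunks))
reader.start()

# Stand-in for the Pi: send a few H.264-style NAL start codes and close.
sender = socket.create_connection(("127.0.0.1", 5001))
sender.sendall(b"\x00\x00\x00\x01" * 4)
sender.close()

reader.join()
srv.close()
received = b"".join(chunks)
print(len(received))  # prints 16 (bytes received)
```

On a real network you'd bind to 0.0.0.0 and keep reading until the Pi disconnects; actually decoding the H.264 would still need ffmpeg or similar downstream, which is exactly the piece I'm missing inside Jitter.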