How is a video stream represented in mxj (Java)?
Hi,
I want to develop a Max external that takes in live video from Jitter and broadcasts it using JMF. Can someone guide me on how a live video (the video currently being played) can be used as a DataSource for JMF?
First of all, you could of course try using the broadcast object from Max itself, but that's not your question.
Video is represented as a stream of individual jit.matrix objects, one per frame. Your Java external would have to convert each incoming matrix into whatever format JMF needs (a BufferedImage?) for every frame. The JitterMatrix class, which represents a jit.matrix on the Java side, has a few methods such as copyArrayToVectorPlanar for fast copying of array data (see the sketch below).
Note that uncompressed video can easily become a very heavy stream.
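To make that concrete, here is a rough sketch of the receiving side of such an mxj external. It assumes the usual mxj convention that a method named jit_matrix(String) receives the name of the incoming matrix, that the matrix is a 4-plane ARGB char matrix, and that a copyVectorToArrayPlanar read method exists as the counterpart of the copyArrayToVectorPlanar mentioned above; check the JitterMatrix javadoc for the exact copy-method names and signatures in your Max version. pushFrameToJMF() is a hypothetical hook where the JMF side would go.

```java
import com.cycling74.max.*;
import com.cycling74.jitter.*;
import java.awt.image.BufferedImage;

public class JitterToJMF extends MaxObject {

    // Called by Max for every "jit_matrix <name>" message, i.e. once per frame.
    public void jit_matrix(String matrixName) {
        JitterMatrix jm = new JitterMatrix(matrixName);

        int[] dim = jm.getDim();          // e.g. {320, 240}
        int width = dim[0];
        int height = dim[1];
        int planes = jm.getPlanecount();  // assumed: 4-plane ARGB char matrix

        BufferedImage frame =
                new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);

        // Copy the matrix plane by plane, one row at a time.
        // NOTE: copyVectorToArrayPlanar and its argument order are an assumption;
        // verify against the JitterMatrix javadoc for your Max version.
        int[][] rows = new int[planes][width];
        for (int y = 0; y < height; y++) {
            for (int p = 0; p < planes; p++) {
                jm.copyVectorToArrayPlanar(p, new int[] { 0, y }, rows[p], width, 0);
            }
            for (int x = 0; x < width; x++) {
                int a = rows[0][x], r = rows[1][x], g = rows[2][x], b = rows[3][x];
                frame.setRGB(x, y, (a << 24) | (r << 16) | (g << 8) | b);
            }
        }

        pushFrameToJMF(frame);
    }

    // Hypothetical hook: wrap each frame in a javax.media.Buffer inside a custom
    // PushBufferDataSource so JMF can pull frames from here.
    private void pushFrameToJMF(BufferedImage frame) {
        // ... JMF DataSource / PushBufferStream code goes here ...
    }
}
```

On the JMF side, the usual approach is to implement a PushBufferDataSource whose PushBufferStream hands out these frames; that custom DataSource is what you would then pass to a Processor for encoding and transmission.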
Thanks Florito. Actually, jit.broadcast / jit.qt.broadcast does not stream audio and video together; it only streams video. I want to stream video together with audio, which is what I am trying to achieve. Do you have any idea how that can be done?
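For what it's worth, JMF itself can merge separate audio and video sources into a single multi-track stream via javax.media.Manager.createMergingDataSource(). A minimal sketch, assuming you already have a working video DataSource (e.g. the custom one fed from Jitter) and an audio DataSource (e.g. from a capture device):

```java
import javax.media.Manager;
import javax.media.Processor;
import javax.media.protocol.DataSource;

public class MergedStreamSketch {
    // videoSource and audioSource are assumed to be working DataSources already.
    public static Processor buildProcessor(DataSource videoSource, DataSource audioSource)
            throws Exception {
        // Combine audio and video into one multi-track DataSource.
        DataSource merged = Manager.createMergingDataSource(
                new DataSource[] { videoSource, audioSource });

        // Wrap the merged source in a Processor; its tracks can then be set to
        // RTP-compatible formats and the output sent over the network.
        Processor p = Manager.createProcessor(merged);
        p.configure();  // asynchronous; a real program waits for ConfiguredEvent
        return p;
    }
}
```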