Using Processing.org as a video source within Max/MSP/Jitter

May 29, 2007 at 3:25pm

I have the dream of combining Processing and Max/MSP/Jitter, so that I may use each for what it is best at! :D

I want to use Processing to create “Sketches”, or “Visual Synthesizers”, and then host these in the Max environment, using it to control and combine them.

To achieve this, I want to be able to take the visual output from each sketch and use it as a video source in Jitter. Ideally this would not just be RGB, but would also include a fourth alpha channel, so that I may combine the output of multiple sketches running in parallel through layering.

Has anyone out there done something similar, or perhaps have any suggestions on how this may be achieved?

I am a software engineer, and though many years have passed since I last wrote anything in Java, I will not hesitate to learn it again, if that is what is necessary to achieve this :)

Thank you!

Ilias B.

#32168
May 29, 2007 at 4:09pm

http://www.cycling74.com/forums/index.php?t=msg&goto=105593
good luck


#105336
May 29, 2007 at 4:19pm

Thanks for the link; however, I’ve Seen that thread already.

I have found many mentions of loading Processing sketches from within Max/MSP/Jitter, and also of controlling one with the other.

However, I’ve found no mention of using the visual output of Processing as an RGBA matrix video source in Jitter, which is the main question of my previous post.

#105337
May 29, 2007 at 4:33pm

You asked for something similar.


#105338
May 29, 2007 at 4:43pm

Hi!

Of course I did not intend to sound offensive in my previous post. Seeing it now, after having posted it, I realize that in the rush in which it was written a capitalization slipped in that could easily be misunderstood…

#105339
May 29, 2007 at 5:04pm

There has been some talk of using Processing inside of MXJ over on the
java-dev list. You might try and follow along with this thread and see
what happens:

http://www.cycling74.com/forums/index.php?t=msg&th=26406&start=0&rid=0&S=b8c6683271172e176d62e7f4defa743c

AB

#105340
May 29, 2007 at 5:07pm

Either jit.desktop for one machine…

Or else run Processing on one machine, send analog video out, then digitize it and send it into a second machine running your patch.

-Matt


Matthew Lewis
Advanced Computing Center for the Arts and Design (ACCAD)
The Ohio State University

http://accad.osu.edu/~mlewis

#105341
May 29, 2007 at 5:32pm

I reckon it should be possible to render to a texture either inside or outside Processing, and then apply these textures to videoplanes and just mix them in Jitter?

Andreas.

#105342
May 29, 2007 at 5:39pm

Thanks!

jit.desktop works, but the frame rate is not so great…

As for using the video out: as a last resort it is probably the only workable option, but if it can all be done in software, that would be much preferable!

Regards,

Ilias B.

#105343
May 29, 2007 at 5:44pm

If you can get access to Processing’s pixel buffer as a BufferedImage, there are methods on JitterMatrix to convert BufferedImage data to/from a JitterMatrix. See the JavaDoc and the jitwebimage.java example. This will require some knowledge of Java: wrapping the Processing sketch, getting the rendering as a BufferedImage (which I think is possible as long as you’re not using Processing’s OpenGL renderer), and then converting to the Jitter matrix. This is the strategy for doing everything within a single process, which may or may not be desirable depending on your project.

Alternatively, for the interested Java programmer, I believe a Processing module could be written without too much difficulty to send a BufferedImage over TCP/IP, either as a serialized Java object or as a JitterMatrix in the jit.net.* protocol, which is spec’d out in the Jitter SDK web pages. This way you could run in a different process on the same machine, or send over the network (watch your bandwidth; you might want to use UYVY or GRGB half-chroma data to cut it in half, though that won’t give you an alpha channel). There are possibly other protocols already supported in Processing, or ones more easily cobbled together with existing Java code (serving an HTTP JPEG image stream, perhaps?).

http://www.cycling74.com/twiki/bin/view/ProductDocumentation/JitterSdkNetworkingSpec
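
To make the networked idea concrete, here is a minimal Processing-side sender. It deliberately uses a made-up ad-hoc protocol (width, height, then the raw ARGB pixel ints) rather than the actual jit.net.* wire format from the spec above, so the receiving end would have to be a matching custom mxj object rather than jit.net.recv; the host, port, and drawing code are placeholders.

import java.io.DataOutputStream;
import java.net.Socket;

import processing.core.PApplet;

public class FrameSender extends PApplet {
    Socket sock;
    DataOutputStream out;

    public void setup() {
        size(320, 240);
        try {
            // placeholder address: wherever the receiver is listening
            sock = new Socket("127.0.0.1", 9999);
            out = new DataOutputStream(sock.getOutputStream());
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public void draw() {
        background(0);
        ellipse(frameCount % width, height / 2, 40, 40);  // stand-in visuals
        loadPixels();  // sync pixels[] with what was just drawn
        try {
            out.writeInt(width);   // tiny ad-hoc header...
            out.writeInt(height);
            for (int i = 0; i < pixels.length; i++) {
                out.writeInt(pixels[i]);  // ...then raw ARGB, full chroma
            }
            out.flush();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}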

Best of luck, and if anyone makes an example of this sort of communication, please post it to the forum. I think there are a few people looking to solve this problem in ways other than jit.desktop or digital->analog->digital techniques (although those might work for your particular project).

-Joshua

#105344
May 29, 2007 at 7:18pm

Except you’d have to share memory between processes on the video card, and operating systems, AFAIK, really don’t want to let you do that.


v a d e //

http://www.vade.info
abstrakt.vade.info

#105345
May 29, 2007 at 9:30pm

If I understood this thread correctly, the only memory which operating systems (or at least Mac OS) allow to be shared between processes (to some extent) is the screen buffer (or whatever the correct term is). It reminds me of how, in the very, very early computer days, they used cathode ray tubes as computer memory, because of the afterglow. I always found that a fascinating idea, and here it returns…

ciao,

Joost.


#105346
May 29, 2007 at 10:34pm

Thanks Joost, I did not know that.

http://ed-thelen.org/comp-hist/SEAC-Williams-tube-desc.html

Is that the one?

