question about oscsend and oscreceive

Oct 18, 2011 at 10:08am

I am trying to connect the Kinect to Synapse. I downloaded Synapse, and I can currently see the skeleton in the Synapse window and the video from jit.synapse.
I also downloaded the "osc-route" external and all the other externals from the CNMAT website.

I tried to follow the instructions on this page:

http://synapsekinect.tumblr.com/post/6307752257/maxmsp-jitter

But I can't work out how to get the joint messages from the Kinect.
The Synapse website mentions "oscsend" and "oscreceive", but I can't find any externals with those names.
I searched the cycling74 forum but found no example patches.

Could anyone share an example patch that connects Synapse, the Kinect, and Max/MSP?
Please help. Thank you so much.

Oct 18, 2011 at 10:43am

Hi,

did you try replacing [oscsend] and [oscreceive] with [udpsend] and [udpreceive]? As far as I know, they do exactly the same thing.
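In case it helps to see why they are interchangeable: OSC messages are just UDP datagrams with a null-padded address, a type-tag string, and big-endian arguments, so anything that sends those bytes works. A minimal sketch in Python rather than Max (the `/righthand_trackjointpos` address and Synapse's control port 12346 are assumptions taken from the Synapse docs, not verified here):

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary (OSC 1.0 framing)."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *int_args: int) -> bytes:
    """Build an OSC message whose arguments are all int32."""
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "i" * len(int_args)).encode("ascii"))
    for v in int_args:
        msg += struct.pack(">i", v)  # OSC int32 is big-endian
    return msg

# The same bytes [udpsend localhost 12346] would emit for the Max
# message "/righthand_trackjointpos 1".
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/righthand_trackjointpos", 1), ("127.0.0.1", 12346))
```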

HTH,
Ádám

Oct 18, 2011 at 11:06am

Try this:

– Pasted Max patch –
Oct 18, 2011 at 1:03pm

If you are on Windows, save yourself some time and pain and use my jit.openni external to get Kinect data. It provides everything: skeletons, RGB images in Jitter, IR images in Jitter, user pixel maps, etc.

Full install docs, wiki documentation, etc. are at https://github.com/diablodale/jit.openni

Synapse and OSCeleton are fine, but they add extra failure points and middleware.

If you want to use the older tech of those two, you will need to install their drivers, run their external EXE/app, and then use udpsend and udpreceive to get the data out of that EXE/app. And if you want any image/IR data, you can't get it at all with OSCeleton, while Synapse requires another DLL.

It's kind of funny to me how people keep using those apps on Windows for Max/MSP/Jitter solutions. Oh well, it's their choice for unneeded pain. Below is an example:

– Pasted Max patch –
Oct 18, 2011 at 1:19pm

Ahh, if only someone would port it to OSX…

Oct 18, 2011 at 2:05pm

Thank you all.
I am using a Mac; I will try jit.openni later.

Currently, I only get simple messages, e.g. "torso up", "leftelbow down", etc., from [udpreceive 12345], but I don't get any numbers for X, Y, Z. Am I missing something? Do I need some extra setting to get the coordinates?

Please help. Thank you so much.

Oct 18, 2011 at 8:35pm

From the Synapse page you linked:
"You must tell Synapse every 2-3 seconds that you want to track a joint position to keep it spewing joint positions out." There is an example patch there, too. Are you doing this?
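For anyone reading along outside Max, that keep-alive can be sketched in plain Python. This is a hedged sketch, not Synapse's documented API verbatim: the `_trackjointpos` address, the argument 1 (body-relative coordinates), and port 12346 are assumptions based on the Synapse docs.

```python
import socket
import struct
import time

def osc_int_message(address: str, value: int) -> bytes:
    """One OSC message with a single int32 argument (OSC 1.0 framing)."""
    def pad(b: bytes) -> bytes:
        b += b"\x00"                       # null terminator
        return b + b"\x00" * (-len(b) % 4) # pad to 4-byte boundary
    return pad(address.encode("ascii")) + pad(b",i") + struct.pack(">i", value)

def keep_alive(joints, rounds, interval=2.0, host="127.0.0.1", port=12346):
    """Re-request joint tracking every `interval` seconds so Synapse
    keeps spewing joint positions (it stops after 2-3 s otherwise)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for _ in range(rounds):
        for joint in joints:
            # 1 = body-relative coordinates (2 = world, 3 = screen,
            # per the Synapse docs -- unverified here)
            sock.sendto(osc_int_message(f"/{joint}_trackjointpos", 1),
                        (host, port))
        time.sleep(interval)

# e.g. keep_alive(["righthand", "lefthand", "head"], rounds=10)
```

In Max this is just a [metro 2000] banging the tracking message into [udpsend localhost 12346], as in the example patch on the Synapse page.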

Here's the rest of my patch. I stopped using Synapse and went for OSCeleton instead and never really finished this, so there may be some rough edges!

– Pasted Max patch –
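To show what those joint messages look like off the wire (outside of [udpreceive] + osc-route), here is a minimal OSC decoder sketch in Python. The `/righthand_pos_body` address with three float32 arguments is an assumption based on the Synapse docs; the framing itself is standard OSC 1.0.

```python
import struct

def parse_osc(data: bytes):
    """Decode a flat OSC message into (address, [args]).
    Handles only int32 ('i') and float32 ('f') arguments."""
    def read_string(i):
        end = data.index(b"\x00", i)
        s = data[i:end].decode("ascii")
        i = end + 1
        return s, i + (-i % 4)   # skip padding to the 4-byte boundary
    address, i = read_string(0)
    tags, i = read_string(i)
    args = []
    for t in tags[1:]:           # first tag char is always ','
        fmt = ">f" if t == "f" else ">i"
        args.append(struct.unpack(fmt, data[i:i + 4])[0])
        i += 4
    return address, args

# A hand-built example of what Synapse's joint output might look like:
packet = (b"/righthand_pos_body\x00" + b",fff\x00\x00\x00\x00"
          + struct.pack(">fff", 0.1, -0.2, 1.5))
addr, xyz = parse_osc(packet)  # addr == "/righthand_pos_body", xyz ~ [0.1, -0.2, 1.5]
```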
Oct 19, 2011 at 2:32am

Hi, Woodbury,

Thank you so much. I used your patch and it works; the patch structure is inspiring.
Could you also let me know why you went for OSCeleton?
It seems that Synapse is easier to install.

Oct 19, 2011 at 7:36am

Synapse is great, but I wanted to be able to track multiple users, which it can't do.

There is an easy way to install OSCeleton: use Zigfu to do all the PrimeSense OpenNI setup first. For the record, there are problems with NITE on the Mac: I get segmentation faults when users exit and re-enter the Kinect boundaries, so I am going to have to use Windows.

Also, the MS Kinect SDK (as opposed to the PrimeSense one), which is not available on the Mac, does not require the user to hold the PSI pose for calibration. That could be very useful for my disabled users, though I am told it has a shorter range. I would guess diablodale's external uses the PrimeSense library, as it is called jit.openni.

Oct 19, 2011 at 11:05pm

jit.openni uses the OpenNI APIs, which so far have only been used/tested with the PrimeSense libraries.

There is a "shim" that someone wrote to allow the Microsoft SDK to be used with the OpenNI framework. It didn't work at all after the Microsoft SDK refresh, and the community asked him to update it; I do not know if that happened. I couldn't wait, so…

I have already written a Max external that uses the Microsoft SDK directly. I used it privately for a recent art exhibit of mine. It only supports skeletons at the moment, but I will be adding RGB and depth maps soon for an upcoming exhibition. Once those are added, I'll release it; I suspect that will be late November.
