JMP's jit.freenect and jit.qt.grab mutually exclusive?

Nadav Assor

Hi all, first I want to join in applauding Jean-Marc for the great freenect object; it came just in time for me!
I'm trying to use the Kinect in combination with a regular DV camera hooked up over FireWire, or any other camera such as the onboard MBP iSight for that matter. So far, either jit.qt.grab or jit.freenect can get a color camera image, but not both at the same time. Grabbing the depth-map matrix doesn't seem to be affected by this. Sorry if this has already been posted about; I looked and couldn't find any mention of it.
Any ideas? Has anyone else had this problem, or done this successfully?

UPDATE: I managed to get the DV input + the Kinect video input by using QC to capture the video, then Syphon (thanks, Vade!) to send it into Jitter, while capturing the Kinect through freenect inside Jitter. Still, I'd love to hear if there's a way to do both in Jitter, or if it's just a problem I'm having.
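For anyone wanting to reproduce the workaround, here's a sketch of the signal chain I mean. The Jitter-side object name is from the Syphon for Jitter package as I recall it (jit.gl.syphonclient); double-check the names against the Syphon download you have:

```
Quartz Composer:  Video Input patch (DV/iSight camera) → Syphon Server patch
Jitter:           jit.gl.syphonclient @servername ...  → color camera texture
                  jit.freenect.grab                    → Kinect depth (left) + RGB (middle)
```

Since the other camera never touches jit.qt.grab, the clash doesn't occur.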

My specs: MBP 2.66 GHz Core i7, 4 GB RAM, OS X 10.6.4, QuickTime 10 (114), Max 5.1.6
Thanks!
Nadav

sterlingcrispin@gmail.com

jit.qt.grab should be able to grab from many different camera sources; open up the help file and you can get a device list and pick one.
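The message sequence looks roughly like this (message names written from memory of the jit.qt.grab help file, so confirm them against the reference before relying on them):

```
getvdevlist      → asks jit.qt.grab to dump its video device list out the right outlet
vdevice 1        → selects the second device from that list (indices start at 0)
open, then bang  → opens the selected device and outputs frames on each bang
```

With two cameras you'd use two jit.qt.grab objects, each with a different vdevice index.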

Also, jit.freenect.grab can grab from multiple Kinects and use the depth + color data from each.

I'm not sure what problem you're having; maybe elaborate a little more?

Nadav Assor

Thanks for replying. I guess I wasn't clear enough in my description, though. I'm aware of the multiple-device options (I've been working with Jitter for quite a few years... :), but sadly that doesn't address my problem.

To recap the problem I'm having:
- I can't actually get both jit.qt.grab and jit.freenect.grab to deliver a live color camera image at the same time. (With the freenect object I mean the regular camera output from its middle outlet; the depth matrix always comes out of the left outlet without a problem and works fine alongside jit.qt.grab.) When both objects are banged and sent the open message, only one outputs an actual image; the other (whichever was opened second) delivers black.
- This happens even in the simplest scenario possible: my MBP hooked up to one Kinect, using either the onboard iSight camera or a standard DV camera connected over FireWire.

The replies I'm looking for:
- Someone who can confirm they've successfully used both objects together in Jitter, getting live color input from both the Kinect camera and another camera (this would help confirm it's a local problem on my end)
- Or, someone who can confirm they're having the same problem (meaning it's probably a driver clash between freenect and jit.qt.grab?)
- Or, Jean-Marc's input on this...

As a note, as I mentioned in the update to the first post, I have no trouble capturing the Kinect in one application (for example Jitter) and the other camera in another (for example Quartz Composer), which leads me to suspect some conflict inside Jitter itself.
Any ideas welcome, happy holidays...
n