
kinect fusion on the way

March 8, 2013 | 2:32 pm

It’s really amazing what they accomplished with Kinect Fusion. I am looking forward to seeing it talk to Max 6.


April 21, 2013 | 12:51 pm

Since you brought up the topic… I have considered and looked into a private enhancement of my dp.kinect object that digests the Fusion data in the new Kinect SDK 1.7. That written… I don’t yet have a practical application for it, so I haven’t developed that functionality further.

The Kinect SDK includes a demo app that does a good job of letting you scan an environment and then save that data out in a file format which can eventually be loaded into a jit.gl.model. That work is already done, and it works OK.

What functionality could you imagine if there were a Max external that directly consumed/output Fusion data?


April 26, 2013 | 1:14 pm

Dear Diablodale, I haven’t thought of a clear application for Fusion in Max yet. But I thought it would be great, if possible, to have a 3D environment where I could instantaneously load "objects" from real space through Fusion. While loading them, I imagine one could experiment with several mappings between the Fusion data and sound, video, etc. Since I teach at an Architecture school and work on experiments between Architecture and "new media" (virtual reality, augmented reality, interactive objects/environments, performative architecture, etc.), the possibility I just mentioned sounds interesting. But I must confess I don’t really know much about Fusion’s processes. My ideas are more speculative ones…

