I'm really confused. Can you explain this to me?
My project is to build a fully functional, body-interactive gesture recognition instrument that triggers music and sounds. All I know is that there are different ways to connect everything together.
One is libfreenect, and another is OpenNI? What's the difference?
And what about Synapse?
And I guess there are others too?
Before I start my project, I'd like to know all the possible approaches so I can evaluate which one is best for me before I start writing anything in Max/MSP, C++, Java, or whatever.
Thanks, and sorry for all the newbie questions.