I need ideas for my Kinect + Max/MSP project.
I'm doing a gesture-controlled interface in Max/MSP using the Kinect, and I need someone to help me find an interesting approach to using the Kinect for music. This is for my bachelor thesis.
Here are some of my ideas:
- make a prototype that shows how gesture recognition can make live performance more interesting by triggering effects on sounds/music
- make a prototype that shows how body/hand movements can control effects in a sequencer instead of using your mouse or the knobs on a synth/mixer/MIDI module (a rough sketch of the plumbing for this is below)
- more ideas? (KEYWORDS: kinect + sound + max/msp)
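To give the second idea a concrete starting point, here's a rough sketch of the usual plumbing: a skeleton tracker (e.g. OSCeleton or Synapse) streams joint positions over OSC, a small Python bridge rescales one value, and Max picks it up with [udpreceive]. The OSC address, ports and value ranges below are placeholders, not anything standard, so adjust them to whatever your tracker actually sends.

    # Kinect -> Max bridge sketch (Python 3, python-osc). Assumes a skeleton
    # tracker such as OSCeleton or Synapse is already streaming joint data as
    # OSC; the address pattern, ports and value ranges are placeholders.
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer
    from pythonosc.udp_client import SimpleUDPClient

    to_max = SimpleUDPClient("127.0.0.1", 7400)   # Max side: [udpreceive 7400]

    def right_hand(address, *args):
        # Assume the tracker sends x, y, z floats; take hand height (y),
        # clamp it to 0.0-1.0 and rescale to 0-127 for an effect parameter.
        y = args[1] if len(args) > 1 else 0.0
        value = int(max(0.0, min(1.0, y)) * 127)
        to_max.send_message("/effect/cutoff", value)

    dispatcher = Dispatcher()
    dispatcher.map("/righthand", right_hand)      # placeholder OSC address

    server = BlockingOSCUDPServer(("127.0.0.1", 12345), dispatcher)
    server.serve_forever()

On the Max side that's just [udpreceive 7400] into [route /effect/cutoff] feeding whatever parameter you want to drive; you could equally skip the Python layer and parse the tracker's OSC directly in Max.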
bit weird that you start with the tools and then try to come up with the ideas, no?
anyway, here's my performance project in development, using Kinect motion tracking: http://dietervandoren.net/index.php?/project/integration03/
might inspire you one way or another
Hi
dtr's first comment is fundamental to an interaction design project at this level. Putting the Kinect to one side for a moment will allow you to assess exactly what the demands of the environment are. Ask yourself the following questions first:
- what are the expected abilities and input modalities required by the environment? Coarse or fine gestures, or both?
- what are the anticipated sound (media) events?
- how will the gestures be mapped to those events - directly or indirectly? One-to-one or one-to-many?
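To make that last question concrete, here is a toy sketch (the parameter names and ranges are invented for illustration, not taken from any particular patch): the same normalised hand-height value can drive a single parameter one-to-one, or fan out one-to-many into several coupled parameters.

    # Toy illustration of one-to-one vs one-to-many mapping (parameter names
    # and ranges are made up for the example).
    def one_to_one(hand_height):
        # A single normalised gesture value drives a single parameter.
        return {"cutoff_hz": 200 + hand_height * 4800}

    def one_to_many(hand_height):
        # The same gesture shapes several parameters at once, each through
        # its own curve.
        return {
            "cutoff_hz": 200 + hand_height * 4800,
            "reverb_mix": hand_height ** 2,              # slow onset, strong at the top
            "grain_size_ms": 20 + (1 - hand_height) * 180,
        }

    print(one_to_one(0.5))   # {'cutoff_hz': 2600.0}
    print(one_to_many(0.5))  # cutoff plus reverb and grain size from one gesture

In Max the same fan-out is a handful of [scale] and [expr] objects; the point is the mapping layer itself, and the one-to-many case is what the mapping literature below argues tends to feel more instrument-like.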
No doubt your supervisor will have pointed you towards Wanderley's Trends in Gestural Control of Music (IRCAM 2000) and Hunt/Kirk's papers on The Importance of Parameter Mapping (NIME 2002).
I think your first idea of gesture recognition would be the most fruitful - IMHO. Coupled to a live audio input stream this could be quite radical.
Best regards
Brendan
Your way of thinking is backwards: you start with tools and develop a way to use them. That is not the right approach: you should first establish your goals on an artistic level, and once you have formed your ideas firmly, you can start looking for tools/toys. If you do not establish your goal first you will end up losing a lot of time.
@dtr
ps, very cool application of motion/gesture tracking!
Brendan
@n00b_meister: tanx! the video is of the first experimental version. i'm working hard on making it something that can truly be called an instrument...
I don't agree; I think working with a new piece of kit/tech can inspire or push you to new artistic places. What I do find weird, though, is asking the group for 'ideas'. Maybe a better way to phrase it would be to ask how others have used it so far and use that as a catalyst. Maybe an even better way is to commit to using it in a way that no one (to your knowledge) has used it for so far... then ask the group for help with code/execution etc.
Bill
also consider the limitations of the tools and the environmental conditions.
@dtr: very cool!! loved the use of movement and light in both integration 3 + 4!! Do you have some documentation/blogs where one could get some further insight into your process?
Not at the moment, sorry. There's a snippet in the video on http://www.dietervandoren.net/index.php?/project/integration04/ that gives a tiny glimpse of the Max stuff going on, not more. I might get to work on more documentation in the coming months. A nerdy tech report could be part of it ;)
hey dtr, is there any way I can get this patch?
'this patch' is a modular system of a dozen patches, some python code and a bunch of specific hardware (kinects, sensor gloves, beamers, audio/VGA/DMX interfaces etc). Not likely useful to anyone in its current form. Any element in particular you're interested in?