Hello, I've been using Max/MSP since before Christmas as part of a course I've been doing, but haven't really had much help with it. I successfully developed a sampler which is triggered by a joypad via the "hi" object. I am now working on a project (which needs to be done pretty soon!) in which I would like to trigger samples, and changes to samples, from the feed coming in from a live webcam.
I have found many articles in which this is mentioned, but can't seem to find the solution to what I need. I have managed to get my webcam working fine on my iBook, and I already have a sampler I can try to trigger (I can remove the hi object input and tailor it to fit the outputs from the webcam).
From what I have read, it would appear that I need to use jit.qt.grab or similar (I am on Mac OS), with a kind of grid over the top of it in order to split up the areas in which movement is occurring. E.g. if movement in the top left, trigger sample 1; if movement in the top right, add granular synthesis; if movement in the bottom left, play backwards; etc.
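For what it's worth, my understanding is that in Jitter this grid idea is usually built by taking the absolute difference between successive frames (jit.op @op absdiff), cutting the matrix into regions (e.g. with jit.scissors), and reading the mean value of each region (jit.3m); a region whose mean difference crosses a threshold counts as "moving". Here is that arithmetic as a rough Python sketch, purely to illustrate the logic, not Max itself; the toy frames, threshold value and quadrant names are all made up:

```python
# Sketch of region-based motion detection on two greyscale frames
# (lists of rows of 0-255 ints), mimicking what a Jitter patch would do.

def frame_diff(prev, curr):
    """Per-pixel absolute difference of two equal-sized frames --
    roughly what jit.op @op absdiff produces."""
    return [[abs(a - b) for a, b in zip(r1, r2)]
            for r1, r2 in zip(prev, curr)]

def active_quadrants(diff, threshold):
    """Mean difference per quadrant (TL, TR, BL, BR); True where it
    exceeds the threshold -- roughly what jit.3m would report for
    each sub-matrix after splitting the frame into a 2x2 grid."""
    h, w = len(diff), len(diff[0])
    regions = {
        "TL": (0, h // 2, 0, w // 2),
        "TR": (0, h // 2, w // 2, w),
        "BL": (h // 2, h, 0, w // 2),
        "BR": (h // 2, h, w // 2, w),
    }
    result = {}
    for name, (r0, r1, c0, c1) in regions.items():
        cells = [diff[r][c] for r in range(r0, r1) for c in range(c0, c1)]
        result[name] = sum(cells) / len(cells) > threshold
    return result

# Toy 4x4 frames with movement only in the top-left quadrant.
prev = [[0] * 4 for _ in range(4)]
curr = [[0] * 4 for _ in range(4)]
curr[0][0] = curr[0][1] = curr[1][0] = curr[1][1] = 200

quads = active_quadrants(frame_diff(prev, curr), threshold=50)
print(quads)  # {'TL': True, 'TR': False, 'BL': False, 'BR': False}
```

In the patch, each True region would then bang the corresponding part of the sampler (trigger sample 1, enable granular synthesis, reverse playback, and so on).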
I have been reading tutorials and help files left, right and centre, and have managed to use an object to play a pre-recorded QuickTime file in a Max/MSP window, but not to get the live video feed that I can see elsewhere on my iBook to play in the Max patcher.
If you could help me with this it would be greatly appreciated, although I understand that you all have lives outside of the forum! I would just like to know how to a) get the webcam feed to play inside the patcher, and b) get a grid over it in order to receive data when the webcam's subject moves.
Many thanks to anyone who can help, it's driving me crazy!