I’m attempting to create a touch interface out of cloth. I’ll have a movie rear-projected onto the cloth, and I’d like users to be able to come up and interact with it much like they would a touchscreen.
Does this sound doable with Max/MSP?
My thought is to somehow track the bumps that appear as people press on the cloth and translate those to mouse X and Y coordinates.
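For the coordinate-translation step, here's a minimal sketch (not tied to any particular tracker's API) of mapping a tracked blob position to screen pixels, assuming the vision tracker reports positions normalized to the 0.0–1.0 range, as TUIO-style trackers typically do:

```python
def blob_to_screen(nx, ny, screen_w, screen_h):
    """Map a normalized blob position (0.0-1.0 per axis) to pixel coordinates."""
    # Clamp so a blob detected slightly outside the projection area
    # still lands on the screen edge instead of off-screen.
    nx = min(max(nx, 0.0), 1.0)
    ny = min(max(ny, 0.0), 1.0)
    return round(nx * (screen_w - 1)), round(ny * (screen_h - 1))

# A blob in the middle of the cloth maps to the middle of a 1920x1080 display.
print(blob_to_screen(0.5, 0.5, 1920, 1080))
```

In a Max patch you'd do the same thing with a `scale` object per axis; this just shows the arithmetic. In practice you'd also calibrate for the projection area (the camera usually sees more than the projected image), which adds an offset and scale per corner.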
I think you could totally do this. If it were me, I would go the NUI Group route: use their open-source, C-based vision tracking for the blob detection, then Max to easily build the interface patches.
You also might want to check out Mathieu Chamagne’s MMF (http://www.mathieuchamagne.com), which lets you quickly use Max patches with multi-touch input. I use this combination (open-source computer-vision tracking plus MMF) all the time with my vision-based multi-touch screens and find it very reliable and flexible.