I'm trying to develop an external that lets a jit.window receive touch events from the Windows 7 system, so that Win7-compatible multitouch screens can be used within Jitter. I've tried the TUIO WM_TOUCH solution ( http://nuigroup.com/forums/viewthread/4087/ ), but that approach (which is a sort of hack) does not work for fullscreen applications.
I've used some code from the Win32 MTScratchpadWMTouch sample in the Windows 7 SDK.
Touch messages are only sent to windows that have been registered for them, using the function RegisterTouchWindow(hWnd, 0), where hWnd is the handle of a particular window of the application.
To handle touch messages, I therefore need access to the window handle (in my case, the handle of a jit.window), but I can't get at this handle through the current Max/MSP or Jitter SDK. The touch messages are then delivered through the WndProc callback associated with that window (as WM_TOUCH messages); I don't have access to this callback function either.
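To make the two missing pieces concrete, here is a rough sketch of what the external (or jit.window itself) would need to do, following the pattern of the MTScratchpadWMTouch sample. This assumes the jit.window's native hWnd were somehow obtainable, which is exactly what the current SDK doesn't provide; EnableTouch is just my name for the registration step.

```c
/* Sketch only: what touch support would look like if jit.window's
 * native HWND and WndProc were accessible (they currently are not). */
#include <windows.h>
#include <stdio.h>
#include <stdlib.h>

/* Step 1: register the window so the system sends it raw WM_TOUCH
 * messages instead of synthesized mouse/gesture messages. */
void EnableTouch(HWND hWnd)
{
    RegisterTouchWindow(hWnd, 0);
}

/* Step 2: handle WM_TOUCH in the window's WndProc. */
LRESULT CALLBACK WndProc(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_TOUCH) {
        UINT cInputs = LOWORD(wParam);      /* number of touch points */
        TOUCHINPUT *pInputs = malloc(cInputs * sizeof(TOUCHINPUT));
        if (pInputs &&
            GetTouchInputInfo((HTOUCHINPUT)lParam, cInputs,
                              pInputs, sizeof(TOUCHINPUT))) {
            for (UINT i = 0; i < cInputs; i++) {
                /* TOUCHINPUT coordinates are in hundredths of a pixel,
                 * in screen space; convert to client coordinates. */
                POINT pt = { pInputs[i].x / 100, pInputs[i].y / 100 };
                ScreenToClient(hWnd, &pt);
                if (pInputs[i].dwFlags & TOUCHEVENTF_DOWN)
                    printf("touch %lu down %ld %ld\n",
                           pInputs[i].dwID, pt.x, pt.y);
                else if (pInputs[i].dwFlags & TOUCHEVENTF_MOVE)
                    printf("touch %lu move %ld %ld\n",
                           pInputs[i].dwID, pt.x, pt.y);
                else if (pInputs[i].dwFlags & TOUCHEVENTF_UP)
                    printf("touch %lu up %ld %ld\n",
                           pInputs[i].dwID, pt.x, pt.y);
            }
            CloseTouchInputHandle((HTOUCHINPUT)lParam);
        }
        free(pInputs);
        return 0;
    }
    return DefWindowProc(hWnd, msg, wParam, lParam);
}
```

In a real implementation the printf calls would of course be replaced by whatever sends the events out of the object, but this shows that the Win32 side is only a handful of lines.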
Is there a way to have an access to these elements?
I think the best solution would be to implement these touch features directly in the jit.window object: for example, sending a message (like "touch_enable 1") to the jit.window object to make it receive touch events, and then outputting the touch events through the dumpout outlet (similar to the mouse messages).
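Just to illustrate what I have in mind (this is purely a proposed format, modeled on the existing mouse messages, not anything that exists today):

```
[touch_enable 1]  →  jit.window        ; enable touch reporting

dumpout could then emit, per touch point, something like:
touch <id> down <x> <y>
touch <id> move <x> <y>
touch <id> up <x> <y>
```

The id would distinguish simultaneous fingers, which is the one thing the mouse messages cannot express.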
Adding this feature seems easy (only a few lines of code in the jit.window object) and would be a very simple way to develop multitouch applications with Jitter.
So, this question is for the Cycling '74 people: would it be possible to have this feature soon?
All the best,