Anyway, it's a fun little project, aided very much by the keystrokes mxj, so thanks for that. The next thing I'll probably add is some audio processing of the game sounds, and perhaps a slightly more structured and subtle use of the game-parameter changes.
Interesting idea about getting Jitter to read the screen and perform some sort of auto-aiming. Another idea I had was to read game information by getting Jitter to 'see' what is happening, or perhaps to use some voice-recognition functions to 'hear' when a monster makes a sound and use that to trigger effects.
Wondering about the audio processing: have you been able to get the audio into Max directly? This has been an issue for me in the past, where I had to pipe the audio out physically and then route it back into the adc~ (which requires a full-duplex sound card), since with the game or another app running I couldn't seem to get Max to hear it while it was playing. I'm probably missing something. I hear Soundflower on the Mac can apparently do this, but what about on Windows?
Also, the auto-aim is just a thought. It might be straightforward, or not, but my idea was this: assuming you're standing still, you trigger the analysis, and wherever there is a difference between frames, the mouse would move toward that spot (a kind of successive analysis, choosing only one area at a time). You'd have to run the analysis, decide where the view should move, then turn the analysis off briefly (so the overall scene motion doesn't get analyzed too), move a bit, run it again, and so on. Depending on how far the motion is from the center, your mouse would move ("be moved") a specified amount in that direction. All this could happen really fast, and you could maybe add an optional Fire command once you're on target. Not at all sure what the results would be, though.
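To make that idea a bit more concrete, here's a rough sketch in Java (the mxj language) of the core step: find the centroid of the pixels that changed between two frames, and turn its offset from screen center into a mouse delta. Everything here (class name, method, the `gain` scaling) is just illustration, not any real Max/mxj API, and it assumes grayscale frames already grabbed from the screen.

```java
// Sketch: locate inter-frame motion and derive a mouse-move toward it.
// All names are illustrative; frames are grayscale pixel arrays [row][col].
public class MotionAim {

    // Returns {dx, dy} pointing from the frame center toward the centroid of
    // changed pixels, scaled by `gain`; returns null if nothing changed more
    // than `threshold` (i.e. no motion detected, so don't move the mouse).
    static int[] aimDelta(int[][] prev, int[][] curr, int threshold, double gain) {
        int h = curr.length, w = curr[0].length;
        long sumX = 0, sumY = 0, count = 0;
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                if (Math.abs(curr[y][x] - prev[y][x]) > threshold) {
                    sumX += x;
                    sumY += y;
                    count++;
                }
            }
        }
        if (count == 0) return null;
        double cx = (double) sumX / count;  // centroid of the motion
        double cy = (double) sumY / count;
        int dx = (int) Math.round((cx - w / 2.0) * gain);
        int dy = (int) Math.round((cy - h / 2.0) * gain);
        return new int[]{dx, dy};
    }

    public static void main(String[] args) {
        int[][] prev = new int[8][8];
        int[][] curr = new int[8][8];
        curr[2][6] = 255; // a single changed pixel, right of and above center
        int[] d = aimDelta(prev, curr, 10, 1.0);
        System.out.println(d[0] + "," + d[1]); // prints "2,-2"
    }
}
```

In the "move, pause analysis, move again" loop described above, you'd call this once per step, feed the delta to whatever is moving the mouse (java.awt.Robot can do relative moves), and stop when the delta falls below some dead zone near zero.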
It's interesting to think about, and it would be cool to see in action. However, I like the idea of the psychedelic processing (controlling video parameters, as well as sounds, in real time) better, and it's possibly easier. With a more customizable game this would be even more so... does anyone know of a game with tons of changeable parameters that could be mapped to keystrokes? I was thinking of Second Life, but there are a lot of layers there, and it's not as much an "up-to-the-second" kind of environment as a shooter or similar game seems to be. Some kind of Quake/Unreal explorer has all that built in, ideally without needing to learn too many console commands, though that may be the only way to get at them programmatically.
Interesting about having the MIDI controller too; that would be useful. I originally had a keystroke to trigger the events, but once the game is the active app, those keystrokes go to the game, and my "trigger" key no longer shuts it off. Maybe a special function key would work; as it is, I need to switch apps back to Max to stop it manually. Having midiin or ctlin control of that would bypass the issue (since I don't imagine the game picks up MIDI by default, or at all).
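The kill-switch logic itself is tiny; here's a hedged sketch of how it might look on the Java side, assuming control-change messages arrive the way ctlin delivers them (controller number plus value). The class and method names are made up for illustration, not a real mxj API.

```java
// Sketch of a MIDI-driven on/off switch for the game-control patch.
// Assumes (controller, value) pairs as delivered by something like ctlin.
public class MidiToggle {
    private final int watchCC;       // controller number acting as the switch
    private boolean running = false; // whether the keystroke engine is active

    MidiToggle(int watchCC) {
        this.watchCC = watchCC;
    }

    // Call for each incoming control change. A non-zero value on the watched
    // controller flips the state (button press toggles, release is ignored);
    // everything else passes through untouched. Returns the current state.
    boolean onControlChange(int cc, int value) {
        if (cc == watchCC && value > 0) {
            running = !running;
        }
        return running;
    }

    public static void main(String[] args) {
        MidiToggle t = new MidiToggle(64); // watch CC 64 (e.g. a sustain pedal)
        System.out.println(t.onControlChange(64, 127)); // press: prints "true"
        System.out.println(t.onControlChange(64, 0));   // release: still "true"
        System.out.println(t.onControlChange(64, 127)); // press again: "false"
    }
}
```

Since the game never sees the MIDI port, this works no matter which app has keyboard focus, which is exactly the problem with the keystroke trigger.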
This is a cool thread, and mxj makes all these new ideas possible. Maybe we just need to make our own game in Jitter :)