Somewhere in the last few Jitter updates there was a major behavioral change in jit.window's interp attribute. My initial experience was that "interp 0" was the default, and that enabling "interp 1" caused a slight drop in framerate along with smooth interpolation of the matrix to the window size. The next version changed the behavior so that there was little framerate difference between interp 0 and interp 1, though the visual difference between interpolated and non-interpolated output remained. NOW, with interpolation disabled, the framerate is unusably slow. As an example, Andrew's recent 2D version of his swirl patch does not have interp enabled by default. When I first loaded his patch, it started up at a nearly system-disabling 1.036 FPS. Enabling interpolation produced a radical jump up to a stable 23 FPS.
So, as it stands, it seems something needs to be done so that "interp 0" doesn't tank the system. In the past, I've actually preferred the craggy/sharp look of uninterpolated output when the results are being fed as analog to a projector, which loses sharpness in the process. It also occurs to me that with interpolation disabled by default, new users will be greeted with system-slamming poor performance out of the box. Increasing framerate by enabling interpolation is counterintuitive; it's a bit of arcane magic that new users need to be told about in the documentation, or shielded from by defaulting interp to enabled.
Is my graphics card/particular setup/XP somehow wonky, making this whole thing a personal problem?
I'm running the latest Max/Jitter and the latest graphics card drivers.
Andrew's patch is a prime example of the problem.