I've been reverse-engineering a GPU-based frame delay module from the subtex.3d.maxpat example, and it's working really well. I've been led to believe that all the dimensions need to be powers of 2 (i.e. 32, 64, 128).
I've been experimenting with different resolutions for the xy dimensions (texture resolution) and the z dimension (total frames stored), and at certain points Max just crashes (e.g. when the z dimension stores 128 frames at an xy resolution of 2048).
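For a rough sense of why that particular combination is heavy, here is a back-of-the-envelope sketch of the VRAM footprint of a single 3D texture. The bytes-per-texel values are assumptions (4 bytes for 8-bit RGBA char data, 16 bytes for float32 RGBA); the actual format depends on how the texture is configured in Jitter.

```python
def texture_bytes(width, height, depth, bytes_per_texel=4):
    """Estimate the memory footprint of one 3D texture.

    bytes_per_texel: 4 for 8-bit RGBA (char), 16 for 32-bit float RGBA.
    These are assumed formats, not read from any actual patch.
    """
    return width * height * depth * bytes_per_texel

# The crashing case from above: 2048 x 2048 xy, 128 frames deep.
print(texture_bytes(2048, 2048, 128) / 2**30)       # 2.0 GiB as one contiguous allocation (RGBA8)
print(texture_bytes(2048, 2048, 128, 16) / 2**30)   # 8.0 GiB if stored as float32 RGBA
```

The key point is that this has to be one contiguous allocation, which is a much harder ask of the driver than the same total memory split across several smaller 2D textures.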
Is there a reason for this, or a way to calculate the point at which it will crash? I assumed it had to do with video memory, but I can create multiple texture buffers at high resolutions with no slowdown or crash, yet when a single texture buffer exceeds some threshold it crashes. Is there a resolution limit on individual 3D texture buffers? It looks that way when I open the OpenGL Status window and check the Texture tab (MAX_3D_TEXTURE_SIZE). Is there a way to override or work around that limit? I'm using a pretty hefty video card in my Windows desktop, but the reported maximum seems similar to the one on my OS X laptop.
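For anyone else poking at this: GL_MAX_3D_TEXTURE_SIZE (the value shown in that Texture tab) caps each axis of a 3D texture independently, and it's a hard driver/hardware limit you query with glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE, ...), not something you can override. A minimal sketch of the per-axis check, with 2048 as an example limit (many GPUs report 2048 or 16384; yours may differ):

```python
MAX_3D_TEXTURE_SIZE = 2048  # example value; query your GPU's actual limit via glGetIntegerv

def fits_3d_limit(width, height, depth, limit=MAX_3D_TEXTURE_SIZE):
    """GL_MAX_3D_TEXTURE_SIZE caps each dimension independently,
    not the total texel count."""
    return max(width, height, depth) <= limit

print(fits_3d_limit(2048, 2048, 128))  # True: every axis is within the per-axis cap
print(fits_3d_limit(4096, 4096, 64))   # False on a GPU reporting 2048
```

Note that passing this per-axis check doesn't guarantee success: a 2048x2048x128 texture is legal dimension-wise on a 2048-limit GPU, but the driver can still fail to find one contiguous multi-gigabyte allocation, which would match the behavior where several smaller buffers work but one large one crashes.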
This may not be so much a problem to be solved as curiosity about how video memory and 3D textures work. You can play with the patch I made below to get a sense of what I'm talking about.