How to programmatically detect 'ob3d_draw_end popmatrix: GL Error: invalid value'
I have a large Max/MSP project using extensive OpenGL-in-Javascript programming. The error 'ob3d_draw_end popmatrix: GL Error: invalid value' occurs on some, but not all, machines.
Some details:
I am converting arbitrary texts to Jitter matrices in order to load them into my OpenGL context as textures. This has been necessary because rendering multiple texts into an OpenGL scene using jit.gl.text2d and jit.gl.text3d objects proved to be a significant performance drain. The technique I am using is discussed in detail here.
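For illustration, the upload step looks something like this in my js (the context name, dimensions, and variable names here are placeholders, not my actual code):

var textmat = new JitterMatrix(4, "char", 1024, 256); // holds the rasterized text
var tex = new JitterObject("jit.gl.texture", "myctx"); // texture on render context "myctx"
// ... rasterize the text into textmat here ...
tex.jit_matrix(textmat.name); // upload the matrix as texture data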
The technique works fine and has been tested on several modern iMacs running Lion and Snow Leopard, with 4-8GB of memory and >2.5GHz Intel Core i5 processors, using Max/MSP 5.1.9.
I've now run into problems when running on an older MacBook (4,1) at 2.1GHz with only 2GB of memory (OS: 10.5.8, also Max/MSP 5.1.9). Shorter strings of text still render fine, but with longer strings the generated texture appears to be based on a Jitter matrix with all cells set to [1,1,1,1] (i.e., a rectangle filled with white).
I suspect this is a memory issue. In any case, the Max window on the MacBook displays 'ob3d_draw_end popmatrix: GL Error: invalid value' error messages that I don't see on my higher-spec machines.
The program I am developing will have to run on even lower-spec machines than the MacBook in question. What I am looking for is a way to detect the error condition from within my Javascript. If I detect that generating a large, high-resolution version of a given text fails, I could fall back to using a smaller matrix at lower resolution (see the sketch below).
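Something like this is what I have in mind (makeTextMatrix is a hypothetical helper, and maxTexSize assumes the limit can be queried somehow):

function makeTextMatrix(width, height, maxTexSize) {
    // clamp so neither dimension exceeds the GPU's texture limit
    if (width > maxTexSize || height > maxTexSize) {
        var scale = maxTexSize / Math.max(width, height);
        width = Math.max(1, Math.floor(width * scale));
        height = Math.max(1, Math.floor(height * scale));
    }
    return new JitterMatrix(4, "char", width, height);
}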
So, is there a way to detect this error condition from inside js code?
I can build a small example patch and js to demonstrate the issue, but the project I am developing is quite large (>8MB of .maxpats, >1MB of .js code) and stripping it down will take a while. In the meantime I am hoping that the description above is sufficiently detailed for one of the JS/Jitter brains reading this to offer some comment.
Any ideas?
Thanks,
-- Peter
After further investigation, I was able to narrow down the issue: it's an OpenGL limitation, not memory.
The MacBook uses an Intel GMA X3100, which has a MAX_TEXTURE_SIZE of 2048.
The iMacs use cards with a MAX_TEXTURE_SIZE of 8192 and up.
So, is there a way to programmatically determine the MAX_TEXTURE_SIZE of the OpenGL context inside Max?
I'm not aware of any way to query this from within Javascript, but you can do it easily in C:
#include <OpenGL/gl.h>  /* Apple's OpenGL header */

GLint texSize;
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &texSize);  /* maximum texture dimension supported by the driver */
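Bear in mind that glGetIntegerv() only returns a meaningful value while a GL context is current, so in a Jitter external the call belongs in a method that runs with the context bound, such as the draw method.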
Thanks for the suggestion!
If I don't hear from someone saying there is already a .mxo/.mxe that does this, I suppose I'll have to write one myself next week.
One option: you can use jit.gl.lua for low-level OpenGL programming without having to deal with compiled C objects. The above code would look like this:
function draw()
    -- print the maximum texture size to the Max window
    print(gl.Get(GL.MAX_TEXTURE_SIZE))
end
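Since draw() is called while the render context is bound, the query should return a valid value there, and the result could then be routed back to your js code.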