max texture size

ingo's icon

Hi.

Is there a way to obtain, via Jitter, the maximum texture size the graphics card can handle?

i searched the documentation but couldn't find anything, so i tried some simple code i found on the web in an external, but Max crashes... i guess Jitter does not like things like this? how do i make such GL calls?

here is the code i tried:

void maxgltexsize_bang(t_maxgltexsize *x)
{
    int argc;
    char **argv;

    GLint tex_size;
    argc=1; //i suppose this is 1?
    argv = NULL; //i suppose this is NULL? or the path to max itself??

    glutInit(&argc, argv);
// do i need to initialize GLUT? should be initialized when starting Jitter?

//    glutInitDisplayMode(GLUT_DOUBLE|GLUT_RGB|GLUT_DEPTH);
//    glutCreateWindow("");
//do i need to create a GL context?
// i tried with and without these 2 lines... also creating a jit.window...

    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &tex_size);
    post("GL_MAX_TEXTURE_SIZE %i", (int)tex_size);
}

the crash report shows that Max crashes when calling glGetIntegerv.

thanks for advice...
io

Wesley Smith's icon

You have to have a valid context for any of this code to work.
Depending on your platform, that would be aglGetCurrentContext or
wglGetCurrentContext. Also, don't use GLUT inside Jitter. GLUT is
for standalone applications and does not play nice with others.

You can use jit.gl.sketch with the message "glget max_texture_size" to
query this kind of stuff. Basically, take any glGet enum, lop off the
GL_ prefix, and lowercase the rest; it should work except perhaps for
really obscure ones.
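As a side note, that naming rule can be sketched in plain C. The helper below is purely illustrative (it is not part of the Jitter API, and the function name is made up here); it just shows the GL_MAX_TEXTURE_SIZE -> "glget max_texture_size" transformation Wes describes:

```c
#include <ctype.h>
#include <string.h>

/* Sketch of the naming rule: to build a glget message for jit.gl.sketch,
   lop the GL_ prefix off the GL enum name and lowercase the rest,
   e.g. GL_MAX_TEXTURE_SIZE becomes "max_texture_size". */
static void gl_enum_to_glget_name(const char *gl_name, char *out, size_t outlen)
{
    const char *p = gl_name;
    size_t i;

    if (strncmp(p, "GL_", 3) == 0)   /* drop the GL_ prefix */
        p += 3;
    for (i = 0; p[i] != '\0' && i + 1 < outlen; i++)
        out[i] = (char)tolower((unsigned char)p[i]);  /* lowercase the rest */
    out[i] = '\0';
}
```

So for GL_MAX_TEXTURE_SIZE you would send "glget max_texture_size" to the sketch object.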

wes

On Tue, Aug 5, 2008 at 11:09 AM, ingo randolf wrote:

ingo's icon

Hey wes.
Thanks for this information!

...
Create [jit.window ctx], [jit.gl.render ctx], and [jit.gl.sketch ctx], bang the renderer at least once to create the GL context, and then send the message to the sketch...
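In patch form, a minimal sketch of those steps (the exact outlet and reply format are my assumption; i believe glget results come out the sketch object's dump outlet as "max_texture_size <n>"):

```
[jit.window ctx]                       <- destination named ctx

[bang] -> [jit.gl.render ctx]          <- bang at least once to create the GL context

[glget max_texture_size( -> [jit.gl.sketch ctx]
                                |
                          (dump outlet: max_texture_size <n>)
```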