Projection Matrices of multiple cameras
Hello,
can someone please shed some light on how Jitter / OpenGL handles multiple cameras?
Consider a patch with e.g. 2 different cameras [jit.gl.camera @name camA @lens_angle 30] and [jit.gl.camera @name camB @lens_angle 80].
For a [jit.gl.shader @name shady] that e.g. manipulates the vertices of a mesh [jit.gl.mesh @shader shady], what determines which PROJECTION_MATRIX is applied when calculating the clip-space output position gl_Position? The PROJECTION_MATRIX should be different for the different lens_angles. Is the entire shader executed twice?
The vertex program of Jitter's default shader looks something like this:
<param name="modelViewProjectionMatrix" type="mat4" state="MODELVIEW_PROJECTION_MATRIX" />
<bind param="modelViewProjectionMatrix" program="vp" />
...
gl_Position = modelViewProjectionMatrix * vec4(position, 1.);
Basically I would like to understand how Jitter passes the lens_angle (or rather, the parameters needed for https://www.khronos.org/registry/OpenGL-Refpages/gl2.1/xhtml/gluPerspective.xml) into the calculation of MODELVIEW_PROJECTION_MATRIX, and, in the case of e.g. 2 cameras, whether the shader is executed twice with a different MODELVIEW_PROJECTION_MATRIX each time.
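For reference, here is a small C sketch of my understanding of the math behind that gluPerspective page (the standard formula, not Jitter's actual internals; the assumption is that lens_angle plays the role of the fovy argument, while the aspect ratio and the near/far values would come from the render destination and the camera's clip attributes):

#include <math.h>
#include <stdio.h>

/* Standard gluPerspective construction: fills a column-major 4x4 projection
 * matrix from a field-of-view angle in degrees, an aspect ratio and the
 * near/far clip distances. Assumption: lens_angle corresponds to fovy_deg. */
static void perspective(float m[16], float fovy_deg, float aspect,
                        float znear, float zfar)
{
    float f  = 1.0f / tanf(fovy_deg * 3.14159265f / 360.0f); /* cot(fovy/2) */
    float nf = 1.0f / (znear - zfar);
    for (int i = 0; i < 16; i++) m[i] = 0.0f;
    m[0]  = f / aspect;               /* x scale */
    m[5]  = f;                        /* y scale */
    m[10] = (zfar + znear) * nf;      /* depth remap */
    m[14] = 2.0f * zfar * znear * nf;
    m[11] = -1.0f;                    /* perspective divide by -z */
}

int main(void)
{
    float a[16], b[16];
    perspective(a, 30.0f, 4.0f / 3.0f, 0.1f, 100.0f); /* like camA */
    perspective(b, 80.0f, 4.0f / 3.0f, 0.1f, 100.0f); /* like camB */
    printf("camA y scale %.3f vs. camB y scale %.3f\n", a[5], b[5]);
    return 0;
}

With the two lens_angles above the resulting projection matrices clearly differ, which is exactly why I am wondering when, and how often, the shader sees which one.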
"Is the entire shader executed twice?"
Yes, each active camera redraws the scene using its own matrices, so the shader uniforms are updated appropriately for each camera's draw pass.
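To make that concrete, here is a rough C sketch of what the renderer conceptually does each frame with two active cameras (hypothetical names and types, not Jitter's actual API; it reuses the perspective() helper sketched in the question above):

/* Conceptual per-frame loop over the active cameras (hypothetical types and
 * names). Each camera's pass rebuilds the projection from that camera's own
 * lens_angle, so the same vertex program runs once per pass with a different
 * MODELVIEW_PROJECTION_MATRIX bound to its uniform. */
typedef struct {
    float lens_angle;            /* degrees, e.g. 30 for camA, 80 for camB */
    float near_clip, far_clip;
} cam_t;

static void draw_frame(const cam_t *cams, int ncams, float aspect)
{
    for (int i = 0; i < ncams; i++) {
        float proj[16];
        perspective(proj, cams[i].lens_angle, aspect,
                    cams[i].near_clip, cams[i].far_clip);
        /* Conceptually, the renderer then multiplies in this camera's view
         * (modelview) matrix to form MODELVIEW_PROJECTION_MATRIX, uploads it
         * to the bound shader's modelViewProjectionMatrix uniform, and draws
         * the scene -- so the vertex program executes once per camera pass. */
    }
}

In other words, there is no single projection matrix tied to the shader itself; the MODELVIEW_PROJECTION_MATRIX state named in the JXS param is resolved against whichever camera owns the current draw pass.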