JXS specs: defining an array as a parameter


    Sep 05 2006 | 3:32 pm
    Hi,
    We're working on a shader program that requires an array (of vec2) as a parameter. My first question is: is there a complete description of the JXS format available? And more specifically: is there a way to input an array to a shader in Jitter?
    the complete syntax for the array we wish to initialize is:
    uniform vec2 ParamArray[12];
    Thanks in advance!
    Mattijs

    • Sep 05 2006 | 4:01 pm
      On 9/5/06, Mattijs Kneppers wrote:
      >
      >
      > Hi,
      >
      > We're working on a shader program that requires an array (of vec2) as
      > parameter. My first question is, is there a complete description of the jxs
      > format available? And more specific: is there a way to input an array to a
      > shader in jitter?
      Hoi Mattijs. Afaik JXS is just GLSL, with a little XML to bind the
      attributes. You can't input an arbitrary array as a parameter, but you can
      "abuse" one of the arrays that are meant for other things, like textures,
      or normal arrays.
      If you don't need to access the array in the vertex program, you can use
      textures; otherwise use one of the arrays that are accessible from within
      the vertex program (one of jit.gl.mesh's inputs). All this just came up
      in a recent thread by Ali Momeni. I suggest you check it out.
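      The texture trick above amounts to storing your N values in a small float texture and fetching them by index in the fragment program. A minimal sketch (uniform names `dataTex` and `dataCount` are illustrative, not from the thread; a Jitter slab may hand you a rectangle sampler instead of `sampler2D` depending on texture mode):

      ```glsl
      // N vec2 values packed into the RG channels of an Nx1 float texture
      uniform sampler2D dataTex;
      uniform float dataCount; // N

      vec2 fetchData(float i)
      {
          // sample at texel centers so each index hits exactly one texel
          return texture2D(dataTex, vec2((i + 0.5) / dataCount, 0.5)).rg;
      }
      ```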
      grtz -thijs
    • Sep 05 2006 | 4:26 pm
      On 9/5/06, Mattijs Kneppers wrote:
      >
      > Hi,
      >
      > We're working on a shader program that requires an array (of vec2) as parameter. My first question is, is there a complete description of the jxs format available?
      If you look in Appendix C of the JitterTutorial PDF, there's a good
      breakdown of the JXS spec.
      > uniform vec2 ParamArray[12];
      For this you will need to either bind 12 vec2s or do something clever
      with the mat2, mat3, or mat4 formats. Depending on what you're doing and
      how these params relate to vertex or fragment data, vertex attributes
      or a texture representation of this data might be better.
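      The "something clever" with matrices could look like this: two mat4 uniforms hold up to 16 vec2 values, two per column. A sketch, with illustrative names (`offsetsA`, `offsetsB`); note that indexing matrix columns with a non-constant integer may be restricted on older hardware:

      ```glsl
      uniform mat4 offsetsA; // vec2 values 0-7 (two per vec4 column)
      uniform mat4 offsetsB; // vec2 values 8-11 (last two columns unused)

      vec2 getOffset(int i)
      {
          vec4 col = (i < 8) ? offsetsA[i / 2] : offsetsB[(i - 8) / 2];
          // even indices live in .xy, odd indices in .zw
          if (i - (i / 2) * 2 == 0)
              return col.xy;
          return col.zw;
      }
      ```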
      wes
    • Sep 05 2006 | 9:16 pm
      thijs, wes, thanks for your quick reactions! Appendix C is exactly what I was looking for. I need the info in the fragment shader so both techniques (using mat2-4 or a texture) should work.
      To use a texture I'll have to figure out how to input 3 textures to a slab or shader object, we already use the 2 texture inputs (of a slab, that is). I was looking at 'glsl/jit.gl.shader runs on intel but not on ppc' thread (I assume that's the one you refer to, thijs) and it looks like Ali's patch works with 3 textures too. I'll check it out.
      We're working on a depth of field shader; for the blurring pass we need both the rendered scene and a depth texture (encoded in rgb). The array would be the blur sample x and y offsets.
      Thanks again,
      Mattijs
    • Sep 05 2006 | 9:40 pm
      Hi Mattijs,
      Have you looked at this paper:
      http://www.ati.com/developer/gdc/Scheuermann_DepthOfField.pdf#search=%22ati%20depth%20of%20field%22
      ?
      I implemented it the other day. Here are the steps I did:
      1) render scene to texture, writing not the depth, but the "blur"
      information to the alpha channel
      2) use that texture and render it fullscreen on a quad while applying
      a stochastic blur to the texture based on the amount of blur in the
      alpha channel
      The problem with this method is that it occupies the Alpha channel of
      the rendered scene, so you have to turn blend_enable off.
      A better way to do this is to, as you suggest, grab the depth texture.
      Here are the steps:
      1) render scene to a texture
      2) grab the depth buffer to a texture
      3) texture a fullscreen quad with the rendered scene texture and the
      depth texture
      4) in the shader apply the blur function based on the focal plane
      parameter and the depth texture values
      I don't think you need 3 textures to do this, or such a large number of
      uniform params. What are they for? I implemented the stochastic blur
      with a bunch of precomputed constants that follow a Poisson
      distribution. Otherwise, I only had to pass in 4 DOF params describing
      the camera lens, such as focal distance, near and far focal planes, and
      a blur scaling factor which roughly corresponds to aperture.
      You can get the depth texture by sending jit.gl.render the message
      "depth_grab myDepthTex", so you don't need to encode it in RGB
      channels. The GPU will do it for you.
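      Step 4 of the depth-texture approach could be sketched roughly like this (all uniform and varying names are illustrative, and the tiny 4-tap kernel stands in for a real Poisson-disc set):

      ```glsl
      uniform sampler2D sceneTex;  // rendered scene (step 1)
      uniform sampler2D depthTex;  // grabbed depth buffer (step 2)
      uniform float focalDist;     // depth of the focal plane
      uniform float blurScale;     // aperture-like scaling factor
      varying vec2 texcoord;

      void main()
      {
          float depth = texture2D(depthTex, texcoord).r;
          // circle of confusion grows with distance from the focal plane
          float coc = clamp(abs(depth - focalDist) * blurScale, 0.0, 1.0);

          vec2 taps[4];
          taps[0] = vec2( 1.0,  0.0); taps[1] = vec2(-1.0,  0.0);
          taps[2] = vec2( 0.0,  1.0); taps[3] = vec2( 0.0, -1.0);

          vec4 sum = texture2D(sceneTex, texcoord);
          for (int i = 0; i < 4; i++)
              sum += texture2D(sceneTex, texcoord + taps[i] * coc * 0.01);
          gl_FragColor = sum / 5.0;
      }
      ```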
      My DOF shader patch is mostly done but could use some sprucing up. If
      you want to see it, let me know.
      wes
    • Sep 06 2006 | 10:33 am
      Wes, that sounds interesting. I'll contact you privately about this later today.
      Greets,
      Mattijs
    • Sep 06 2006 | 8:07 pm
      If you want to use an array as a uniform parameter, you can define it as
      follows:
      vec4 myArray[x];
      where x = the number of vectors in your array. When binding the default
      param, make certain to include the exact number of values. An example
      shader follows.
      Andrew B.
      vec4Array Testing
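      The attached example is not reproduced in the thread; a minimal JXS sketch of what Andrew describes might look like the following (the shader name, param name, and default values are illustrative; the default supplies exactly 3 x 4 = 12 numbers to match `vec4 myArray[3]`):

      ```xml
      <jittershader name="vec4ArrayTest">
          <description>sketch: binding a vec4 array uniform with defaults</description>
          <param name="myArray" type="vec4" default="0. 0. 0. 0. 1. 1. 1. 1. 0.5 0.5 0.5 0.5" />
          <language name="glsl" version="1.0">
              <bind param="myArray" program="fp" />
              <program name="vp" type="vertex" source="sh.passthrudim.vp.glsl" />
              <program name="fp" type="fragment">
      <![CDATA[
      uniform vec4 myArray[3];
      void main()
      {
          // trivial use: output the first array element as a color
          gl_FragColor = myArray[0];
      }
      ]]>
              </program>
          </language>
      </jittershader>
      ```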
    • Sep 07 2006 | 11:11 am
      Andrew, thanks!
      How does this translate to the param input of the object? Will it put [param blurkernel x x x x x x] into a vec2 ar[3]? Would it be a lot of work for you to include the jxs syntax for this?
      In the meantime we're getting on pretty well. Currently we have a fixed set of Poisson disc constants and it looks pretty nice when we manually define 36 vec2s, but dynamic changes of sampling size + distribution would be great to play with. Other kernels can be applied to simulate different types of diaphragm, and we would like to use a bigger sampling size for offline rendering than for realtime.
      Also, do you have experience with the following? For some reason the Mac GLSL compiler refuses to accept default values in an array initialization while the PC version does:
      vec2 somearray[2] = {vec2(0,0),vec2(1,1)};
      seems to fail on Mac but is fine on PC.
      Cheers,
      Mattijs
      PS, a correction on my previous post: mat2-4 won't do; we would need bigger-than-4x4 matrices to enable dynamic kernel length and improve the quality of the effect. Texture reads are slower than using an array, so it would be pretty cool to be able to input an array (a list?) to the shader.
    • Sep 07 2006 | 3:32 pm
      >Will it put [param blurkernel x x x x x x] into a vec2 ar[3]?
      Yes. It will accept one long list to fill the array and break it up into
      appropriately sized vectors. Be sure to use exactly the right-sized list.
      FWIW, I've only tested with vec4 arrays, as that is all that is
      documented in the Orange Book. It might be worth testing whether other
      array types even work in GLSL.
      >vec2 somearray[2] = {vec2(0,0),vec2(1,1)};
      I'll look at this later today when I have the book in front of me, but I
      don't think the curly braces will work in this instance. Something like
      this will probably be more likely to succeed:
      vec2 somearray[2];
      somearray[0] = vec2(0.);
      somearray[1] = vec2(1.);
      If it works on Windows, but not Mac, then you probably are doing
      something that doesn't strictly conform to GLSL. The Mac GLSL compiler
      is much more strict with conformance.
      Cheers,
      Andrew B.
    • Sep 08 2006 | 11:09 am
      OK, great, we got the array input working with vec2. Luckily it is possible to define default values in the JXS header, so we can now change the kernel from Max :) The only problem now is that our video card (GeForce 7800 GT) doesn't seem to let us do dynamic loops, so the kernel size is not yet dynamically changeable. I hope we'll find a workaround for that; otherwise we'll have to make a shader file for every different kernel size. Should be possible to generate those in Max ;)
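      One common workaround on hardware without true dynamic loops is to loop to a compile-time maximum and skip the unused taps; a sketch under that assumption (`MAX_SAMPLES`, `kernel`, and `sampleCount` are illustrative names, and whether the inner branch compiles depends on the card and driver):

      ```glsl
      #define MAX_SAMPLES 36
      uniform sampler2D sceneTex;
      uniform vec2 kernel[MAX_SAMPLES];
      uniform int sampleCount; // actual kernel size set from Max, <= MAX_SAMPLES
      varying vec2 texcoord;

      void main()
      {
          vec4 sum = vec4(0.0);
          float used = 0.0;
          for (int i = 0; i < MAX_SAMPLES; i++) { // constant bound: unrollable
              if (i < sampleCount) {              // ignore taps beyond the kernel
                  sum += texture2D(sceneTex, texcoord + kernel[i]);
                  used += 1.0;
              }
          }
          gl_FragColor = sum / used;
      }
      ```

      If the conditional itself is rejected, generating one shader file per kernel size from Max, as you suggest, remains the fallback.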
      Thanks for all the help!
      Mattijs