UV coords in shader?


    Jan 03 2007 | 5:10 pm
    Hi,
    I'm a total GLSL rookie trying to write a shader that gives you individual control over the four corner points of your UV, like the "Perspective Transform" image unit. But I can't get it to work. I realize I need something other than the texture2DRect call, but when I try to use texture2DProj it gives me errors. I guess I need a varying vec4 texcoord somehow, but how?
    Any pointers in the right direction would be great.
    tia

    • Jan 03 2007 | 5:32 pm
      On Jan 3, 2007, at 9:10 AM, Adam Wittsell wrote:
      > I'm a total glsl rookie trying to write a shader that gives you
      > individual control over the four corner points of your UV, like the
      > "Perspective Transform" image unit. But I can't get it to work. I
      > realize I need something other than the texture2DRect call, but
      > when I try to use texture2DProj it gives me errors. I guess I need
      > a varying vec4 texcoord somehow, but how?
      > Any pointers in the right direction would be great.
      Check out any of the texdisplace shaders to see how to manipulate
      texture coordinates. td.resample.jxs is probably a good one to start
      getting a feel for this. td.rota.jxs is more along the lines of
      "Perspective Transform", but the bound mode logic is a little
      confusing. Here's essentially what that would look like if you
      didn't care about the bound modes (*much simpler*):
      // parameter and texture declarations (implied by the snippet, not shown in the original post)
      uniform vec2 zoom;
      uniform float theta;
      uniform vec2 anchor;
      uniform vec2 offset;
      varying vec2 texcoord0;
      varying vec2 texdim0;
      uniform sampler2DRect tex0;
      void main()
      {
      // where is the point?
      vec2 sizea = texdim0;
      vec2 point = texcoord0;
      // transformation matrices
      mat2 sca = mat2(1./zoom.x, 0., 0., 1./zoom.y); // scaling matrix (zoom)
      mat2 rot = mat2(cos(theta), sin(theta), -sin(theta), cos(theta)); // rotation matrix
      // perform transform
      vec2 tc = ((((point-anchor*sizea)*rot)*sca)+anchor*sizea)+offset;
      // sample texture
      gl_FragColor = texture2DRect(tex0,tc);
      }
      For more information regarding shaders, I would recommend checking
      out some of the resources listed at the following link, as well as
      NVidia's GPUGems series:
      -Joshua
    • Jan 03 2007 | 7:18 pm
      Thanks for the tips, Joshua.
      I have tried looking on the net and at the texdisplace shaders, but I still don't get it. Every example I find deals with the transformation of a rectangle and not the individual points. Why can't it be as easy as in Maya? ;)
    • Jan 03 2007 | 7:56 pm
      Hi Adam,
      It sounds like what you need is a little primer on linear algebra.
      Since what you are manipulating (texcoords) are Cartesian coordinates,
      look for ways to transform those coordinates into the shape, size, and
      location you are looking for. For a start, here's a page that discusses
      some basic transformation matrices:
      Good luck!
      Cheers,
      Andrew B.
    • Jan 03 2007 | 8:35 pm
      Are you trying to do projective texture mapping like this:
      http://developer.nvidia.com/object/Projective_Texture_Mapping.html ?
      If so, you don't want to use a 4-component texcoord but a 3-component
      texcoord. The 3 components are the usual U, V in homogeneous
      coordinates, with the 3rd value used for the perspective divide. For
      the 4-component version, the 3rd value holds a depth value; that form
      is used for shadow mapping.
      wes
      Here's a version of td.lumadisplace.jxs, but in GLSL (attached:
      luminance-based texture displacement, with amplitude of displacement
      (x,y) and offset parameters).
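      For illustration (this sketch is not from the thread): a minimal
      fragment program doing the projective lookup described above, assuming
      a hypothetical varying vec3 projcoord supplied by a vertex shader and,
      for simplicity, a normalized sampler2D texture:
      varying vec3 projcoord; // (s, t, q) -- q drives the perspective divide
      uniform sampler2D tex0;
      void main()
      {
      // texture2DProj divides s and t by q before sampling, i.e. this is
      // equivalent to texture2D(tex0, projcoord.xy / projcoord.z)
      gl_FragColor = texture2DProj(tex0, projcoord);
      }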
    • Jan 03 2007 | 9:07 pm
      What I'm trying to achieve is a way to do perspective transformation in the texture space itself, the point being that I could then have several texture layers composited together with slabs, ending up on one videoplane, but still have 3D control over each layer individually.
      So far I have managed to get around my lack of math by using the Jash 3Dmatrix objects and feeding the result into the "Perspective Transform" image unit. The problem is that the effect forbids pointA from having a larger x-value than pointB, so I can only rotate the object ±90 degrees in Y (patch below).
      I guess the best approach would be to have a vertex shader manipulate the slab geometry itself, so I just have to grit my teeth and get down with the matrices.
      Any other suggestions from you pros are very welcome.
    • Jan 03 2007 | 9:20 pm
      If you're just looking to do 4 point interpolation, where the four
      corners of the rectangle can be tied to new coords, this math is very
      simple and doesn't require learning about matrix transformations.
      Otherwise, you'll need to learn a bit of matrix math (or just use
      textured geometry with jit.gl.sketch or jit.gl.mesh instead of using
      slab)
      Here's a rough example of how you'd accomplish the four point
      interpolation in a jitter shader. Warning: coded in an email client, so
      there might be mistakes.
      uniform vec2 bottomleft;
      uniform vec2 bottomright;
      uniform vec2 topleft;
      uniform vec2 topright;
      varying vec2 texcoord0;
      varying vec2 texdim0;
      uniform sampler2DRect tex0;
      void main()
      {
      // normalize our texture coordinate to range 0-1 based on texture dimensions
      vec2 normpoint = texcoord0/texdim0;
      // four point interpolation:
      // interpolate across X axis for top two points
      vec2 toptemp = xfade(topleft,topright,vec2(texcoord0.x));
      // interpolate across X axis for bottom two points
      vec2 bottomtemp = xfade(bottomleft,bottomright,vec2(texcoord0.x));
      // interpolate across Y axis with top/bottom temporaries
      vec2 tc = xfade(bottomtemp,toptemp,vec2(texcoord0.y));
      // if the four corners are specified with normalized texture coordinates, need to scale again
      tc = tc*texdim0;
      //sample texture
      gl_FragColor = texture2DRect(tex0,tc);
      }
      -Joshua
    • Jan 03 2007 | 9:50 pm
      Thanks, Joshua! This looks exactly like what I'm after. It refuses to compile, though. It says "ERROR: 0:19: 'xfade' : no matching overloaded function found". I think the code came through all right.
    • Jan 03 2007 | 9:57 pm
      That's because xfade() is not a valid built-in GLSL function. I think
      JKC meant to use mix() here.
      AB
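      For reference (this note is not part of the original reply), the GLSL
      built-in computes a componentwise crossfade:
      // mix(a, b, t) == a*(1.0 - t) + b*t
      vec2 example = mix(vec2(0.0), vec2(1.0), vec2(0.25)); // yields vec2(0.25)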
    • Jan 03 2007 | 10:45 pm
      On Jan 3, 2007, at 1:57 PM, Andrew Benson wrote:
      > That's because xfade() is not a valid built-in GLSL function. I think
      > JKC meant to use mix() here.
      Ja. Sorry about that. Was just typing real quick in an email client,
      and spaced out on the GLSL mix call.
      -Joshua
    • Jan 03 2007 | 10:53 pm
      I got it up and running now, but it seems that the transformation is nonlinear. I modified your code to use the normpoint you declared but never used; it made no difference, though. It's a very cool effect, but not quite what I was after ;).
      {
      vec2 normpoint = texcoord0/texdim0;
      vec2 toptemp = mix(topleft,topright,vec2(normpoint.x));
      vec2 bottomtemp = mix(bottomleft,bottomright,vec2(normpoint.x));
      vec2 tc = mix(bottomtemp,toptemp,vec2(normpoint.y));
      tc = tc*texdim0;
      gl_FragColor = texture2DRect(tex0,tc);
      }
      (Screenshot of the result attached.)
    • Jan 04 2007 | 9:41 pm
      Hi Adam,
      Could you send us a copy of the complete shader that you are using?
      It's very strange that you are getting these non-linear results.
      Cheers,
      Andrew B.
    • Jan 04 2007 | 9:48 pm
      Here it is.
      I probably did something wrong when I tried to change it.
      (Attached: the modified shader, described as "shader for performing
      srcdim/dstdim operations", with parameters topleft, topright,
      bottomright, bottomleft.)
    • Jan 05 2007 | 2:11 am
      On Jan 3, 2007, at 2:53 PM, Adam Wittsell wrote:
      > I got it up and running now, but it seems that the transformation
      > is nonlinear. I modified your code to use the normpoint you
      > declared but never used; it made no difference, though. It's a very
      > cool effect, but not quite what I was after ;).
      Technically bilinear interpolation isn't a linear operation. Sorry
      for this false solution to your problem. I thought that this would
      give you roughly the results you were looking for, but obviously not.
      What you'll need to do to get the exact same solution as the
      "Perspective Transform" image unit is solve for the rectangular plane
      in 3d space which projects to the four screen points specified. This
      will definitely require some linear algebra.
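      As a rough sketch of what the fragment-program side of such a solution
      might look like (this code is not from the thread; the 3x3 matrix is
      assumed to be solved for elsewhere, e.g. on the CPU, from the four
      corner points):
      uniform mat3 homography; // maps (u, v, 1) to homogeneous source coordinates
      varying vec2 texcoord0;
      varying vec2 texdim0;
      uniform sampler2DRect tex0;
      void main()
      {
      // normalize to 0-1, apply the projective transform, then divide by the
      // homogeneous coordinate and scale back to pixel coordinates
      vec2 norm = texcoord0/texdim0;
      vec3 h = homography*vec3(norm, 1.0);
      vec2 tc = (h.xy/h.z)*texdim0;
      gl_FragColor = texture2DRect(tex0, tc);
      }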
      If you don't need the exact perspective solve and are happy to simply
      position a rectangle in 3D space (instead of specifying an arbitrary
      quadrilateral in 2D space), the problem is a little bit simpler, and
      you can easily accomplish it with sketch and glvertex/gltexture
      instead of using jit.gl.slab, letting OpenGL do the rest for you.
      Finally, another thing to mention for people interested in applying
      linear transforms to texture coordinates without using shaders: keep
      in mind that you can use the tex_plane_s and tex_plane_t attributes of
      jit.gl.* objects in tex_mode 1 and 3 (search the archives for some
      examples), or use gltranslate/glscale/glrotate with glmatrixmode
      texture in sketch. These approaches of course can invalidate
      assumptions made by the passthrudim vertex shader, which grabs the
      diagonal texture matrix values to establish pixel width/height for
      rectangular textures.
      Probably more detail with less usable information than you were
      looking for, but hopefully this points you in the right direction if
      you want to get your hands dirty.
      -Joshua
    • Jan 06 2007 | 2:03 pm
      Thanks for the tips, guys. Will try to get down with the matrices.
      Cheers
    • Jan 06 2007 | 4:43 pm
      A late follow-up...
      Here are some great free video lectures for all who missed them in school:
      They helped me realize that the jit.la object names are not referencing
      Los Angeles...
      > It sounds like what you need is a little primer on linear algebra.