Forums > Jitter

iterative texture sampling within fragment program

April 19, 2007 | 4:03 pm

yes, full of questions, aren’t I?
as an exercise, perhaps with useful outcome, I’m looking to condense the jit.gl.slab.gauss6x-example.pat patch into a single shader. Of course, I’ve hit a barrier: how do I pass a texture from one operation to another? All the examples simply process a texture once. For example, we sample the incoming texture:
uniform sampler2DRect image;
and often somewhere in the fp, there will be a texture2DRect operation that displaces a texture (faked from gaussian shader):
vec4 sampleM = texture2DRect(image, texcoord);
vec4 sampleB0 = texture2DRect(image, texcoord - width);
vec4 gaussed = 0.1752 * sampleM + 0.1658 * sampleB0;
and we can output the texture in the end:
gl_FragColor=gaussed;
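Put together, the snippets above amount to something like this minimal single-pass sketch (the `width` uniform, the `texcoord` varying, and the mirrored upper tap are assumptions for illustration, not copied from the actual gauss6x patch):

```glsl
uniform sampler2DRect image;   // input texture, bound by jit.gl.slab
uniform vec2 width;            // assumed: per-texel offset, e.g. (1.0, 0.0)
varying vec2 texcoord;         // assumed: passed in from the vertex program

void main()
{
    vec4 sampleM  = texture2DRect(image, texcoord);
    vec4 sampleB0 = texture2DRect(image, texcoord - width);
    vec4 sampleT0 = texture2DRect(image, texcoord + width);
    // weights taken from the excerpt above; note they don't sum to 1.0
    // here, so a real shader would renormalize them
    vec4 gaussed = 0.1752 * sampleM + 0.1658 * (sampleB0 + sampleT0);
    gl_FragColor = gaussed;
}
```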

But what if I want to iterate that "gaussed" result again?
vec4 sampleM = texture2DRect(gaussed, texcoord);
vec4 sampleB0 = texture2DRect(gaussed, texcoord - 2.0*width);
doesn’t work, because "gaussed" isn’t a sampler, which, according to the Orange Book, texture2DRect requires as its first argument.
So how do I displace a texture without using the texture2DRect function? I don’t see any examples of this or any discussion of this in the OB, so I’m stumped.

Perhaps there’s also a gap in my understanding of how to manipulate the convolution calculation to get the same nice blur effect, but I’m still curious about how I might iteratively process a texture in a fragment program…
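One way to sidestep the iteration entirely: two successive convolutions are equivalent to a single convolution with the composed, wider kernel, so the "second pass" can be folded into extra taps on the original sampler rather than taps on the intermediate result. A hedged sketch, using illustrative binomial weights ([1 2 1]/4 convolved with itself gives [1 4 6 4 1]/16) rather than the patch’s actual coefficients:

```glsl
uniform sampler2DRect image;   // original input texture
uniform vec2 width;            // assumed: per-texel offset
varying vec2 texcoord;         // assumed: from the vertex program

void main()
{
    // sampling the *original* texture at wider offsets stands in for
    // blurring the already-blurred result (the kernel composed with itself)
    vec4 c0 = texture2DRect(image, texcoord);
    vec4 c1 = texture2DRect(image, texcoord - width)
            + texture2DRect(image, texcoord + width);
    vec4 c2 = texture2DRect(image, texcoord - 2.0 * width)
            + texture2DRect(image, texcoord + 2.0 * width);
    // binomial weights 1-4-6-4-1, normalized by their sum (16)
    gl_FragColor = (6.0 * c0 + 4.0 * c1 + 1.0 * c2) / 16.0;
}
```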

Peter.


April 19, 2007 | 4:29 pm

celebrating 666 unread messages in the jitter list. if you do get that
condensed gauss6x shader working, please share.
and your q&a on writing shaders is great



April 19, 2007 | 4:42 pm

ok, glad it’s appreciated, cuz there’s lots more coming!
yes, I’ll definitely share it if this operation turns out to be practical. part of the shader mystique seems to be knowing if something is reasonable or not…see pvs posts on "if/then" branching in shaders…


April 19, 2007 | 4:51 pm

The gauss6x example works by cascading many slabs together.

As for this:
> But what if I want to iterate that "gaussed" result again?
> vec4 sampleM = texture2DRect(gaussed, texcoord);
> vec4 sampleB0 = texture2DRect(gaussed, texcoord - 2.0*width);
> doesn’t work, because "gaussed" isn’t the result of a sampler
> operation, which according to the Orange Book, texture2dRect requires.

I don’t quite get what you’re doing here, but if I understand
correctly, you’re trying to call texture2DRect on a value that is not
a texture. That doesn’t really make sense: those functions require
textures and can’t be used on anything else. What precisely do you
mean by "displace a texture" with the texture2DRect function?

wes



April 19, 2007 | 5:11 pm

well, the idea is to *not* cascade slabs together, but to contain it all in one shader. it seems a bit tidier. like I said, it’s sort of an exercise, to see if such a thing is possible.

>I don’t quite get what you’re doing here, but if I understand
>correctly, you’re trying to call sampler2DRect on a value that is
>not a texture

right, that was my point. As far as I can tell, texture2DRect(texture, coordinate) assigns a coordinate to a texture sample, so by taking the coordinates, manipulating them, and then using the result in texture2DRect, you displace the original texture’s fragments from their original coordinates, right? That’s how you get useful manipulations like repos, cartopol, lumadisplace, etc.
The idea I had was to do that, then take the result of that, and instead of passing it to the output gl_FragColor, I’d pass it to another process to change it more.
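For readers following along, coordinate displacement in this sense looks something like the following minimal repos-style sketch (the displacement-map sampler name and the scaling factor are assumptions for illustration, not taken from any Jitter patch):

```glsl
uniform sampler2DRect image;    // texture to displace
uniform sampler2DRect dispmap;  // assumed: second texture driving displacement
varying vec2 texcoord;          // assumed: from the vertex program

void main()
{
    // read an offset from the displacement map, recentered around zero;
    // the 20.0 scale is arbitrary, chosen to make the effect visible
    vec2 offset = (texture2DRect(dispmap, texcoord).rg - 0.5) * 20.0;
    // sample the image at the displaced coordinate (lumadisplace-style)
    gl_FragColor = texture2DRect(image, texcoord + offset);
}
```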

P.


April 19, 2007 | 6:00 pm

Hi Peter,
I think you may be confused here.

A shader functions by processing pixel-by-pixel in a massively parallel
way. What this means is that you don’t have access to the output of
operations done on other pixels within the context of a single shader.
This type of effect is not possible without using cascaded slabs.

To clarify, the texture2DRect function takes 2 arguments:

–texture name – this is the texture that is sampled

–texture coordinates – this is the coordinate of the sample that you
are taking for that specific pixel output. Manipulating this variable
will give you repos-like effects.

Basically, you can’t sample a texture that doesn’t exist yet. That’s
why this patch uses multiple slab instances.

Hope this helps you to understand what is going on. Feel free to keep
asking questions. These sorts of things are confusing, but they are
vital to getting more familiar with shader writing.

Best,
Andrew B.


April 19, 2007 | 7:55 pm

>you don’t have access to the output of
>operations done on other pixels within the
>context of a single shader

this is sort of what I suspected once I realized that it wasn’t obvious how to do what I wanted.
Thanks for the clarity. More questions will be coming! :)

Peter.


Viewing 7 posts - 1 through 7 (of 7 total)