shader writing woes (how to draw a new texture from another one's colour values)
hello everybody,
i want to write a shader that scans an incoming texture for its colour values, and uses these colour values as coordinates to draw a new texture.
i am a total beginner in this area, and over the last few weeks i've looked into the subject and am slowly starting to understand the concepts of glsl.
as far as i understand it, to reach my goal i have to do this "construction process" of the texture in a vertex program.
however, while fragment programs are quite easy to understand, and there are a lot of nice introductions to the matter, the same can't be said for vertex programs.
there is a wonderful, simple introduction here on this site to shaders in general, and fragment programs in particular, but unfortunately there is no comparable introduction to working with vertices.
i couldn't find any example code that comes close to what i want to achieve.
this is how i think it should work:
- set up some uniform sampler2DRect (map texture)
- set up some varying kind of array or new "result" texture (with the dimensions of the range of max. colour values)
// this is the most difficult part for me to understand, as i don't really understand the types of arrays/buffers/etc and how they are set up/accessed/written to
- set all cells in "result" to 0 as default
vp:
- read the incoming "map" texture
- iterate through each of the cells of the "map" texture and read out the colour values
- set each cell that occurs in this lookup process to 1 in the "result" array // i know colours have to be vec4 - this is only a simplification
- pass the "result" coordinates to the fragment processor to use as texture coordinates
fp:
- read the data the "result" array is holding at the current position
- paint the current cell according to this data
-> output
can anyone please give me some advice on this matter, or point me to some comparable glsl code that i could modify?
i tried to do a "poor man's version" of what i intended, doing all the lookup in the fragment shader (running through all the pixels of the "map" matrix at each fragment position).
i can't get it to work. can somebody please give me advice on this matter?
error message: jit.gl.shader: unable to bind GLSL uniform param: 'tex0'
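just to illustrate the brute-force idea in glsl (this is only a sketch, not the actual code from the patch - the tex0/tex1 names, the texcoord0 varying, and the hard-coded 255 scale for char data are all assumptions, and older cards may refuse loops whose bounds come from a uniform):

// fragment program: paint white if any map texel names this output cell
uniform sampler2DRect tex0;   // black texture with the output dimensions
uniform sampler2DRect tex1;   // the small "map" texture
uniform vec2 mapdim;          // dimensions of the map, e.g. 2. 2.
varying vec2 texcoord0;       // rectangle coordinates from the vertex stage

void main()
{
    // tex0 is the black texture, so this starts black (and keeps the tex0 uniform in use)
    vec4 outcolor = texture2DRect(tex0, texcoord0);
    vec2 cell = floor(texcoord0);   // current output cell
    for (float y = 0.0; y < mapdim.y; y += 1.0) {
        for (float x = 0.0; x < mapdim.x; x += 1.0) {
            // read one map texel and scale its 0..1 colour back to cell indices (+ 0.5 to round)
            vec2 target = floor(texture2DRect(tex1, vec2(x, y) + 0.5).rg * 255.0 + 0.5);
            if (all(equal(target, cell)))
                outcolor = vec4(1.0);
        }
    }
    gl_FragColor = outcolor;
}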
---------------------
I'm not sure I understand what you're asking 100%, but I think you might want to look at the td.repos.jxs shader, and/or the other texture displacement shader. See examples/jitter-examples/render/slab-helpers/texdisplace.
In general, for per-pixel operations, you'll need to do them in the fragment program, rather than the vertex program, but it sounds like you may have already figured this out.
Good luck!
-Joshua
hello joshua,
thanks for your reply - i feel kind of lost right now.
i did check out td.repos.jxs, and actually it was one of my first hopes when i started on this. still, it's too far removed from my current problem.
here's what i want to do:
(taken from an earlier post of mine from another thread):
--
let's say i have a matrix with dim 2x2 char and 2 planes
the values are
cell 0/0 - 4; 7
cell 0/1 - 3; 8
cell 1/0 - 1; 2
cell 1/1 - 5; 0
(according to cell x/y - plane1 value; plane2 value)
now with the values of the planes i want to fill a new matrix, let's say it's dim 8x8 char with 1 plane.
i want to draw a point at every location specified by the contents of the first matrix.
so i draw white points:
cell 4/7 - 255
cell 3/8 - 255
cell 1/2 - 255
cell 5/0 - 255
the rest of the pixels stay black.
--
so what i'm trying to do in the code above is
feed the shader 2 matrices,
one black with the desired output dimensions,
one with the "map".
at each fragment location i iterate through _all_ of the pixels of the map matrix.
if one of the pixels holds colour values that correspond to the current fragment position (r=x, g=y), i draw the current fragment white and break the iteration loop,
else it stays black.
question: does the output of a shader _always_ correspond to the first matrix fed in (tex0 in this case)?
the above code does not work however.
i built it inside apple's OpenGL Shader Builder, and it compiled without errors.
in max however i get the following error message:
jit.gl.shader: unable to bind GLSL uniform param: 'tex0'
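as an aside on that error message, one common cause - only a guess here, since the shader itself isn't pasted in the thread - is that the glsl compiler silently drops any uniform that is declared but never actually used, after which jit.gl.shader has nothing left to bind tex0 to. something along these lines keeps both samplers alive:

// the declaration alone isn't enough - if a sampler is never used,
// the compiler strips it and jit.gl.shader can't bind it any more
uniform sampler2DRect tex0;
uniform sampler2DRect tex1;
varying vec2 texcoord0;
varying vec2 texcoord1;

void main()
{
    // touching both samplers keeps both uniforms active
    vec4 a = texture2DRect(tex0, texcoord0);
    vec4 b = texture2DRect(tex1, texcoord1);
    gl_FragColor = max(a, b);
}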
btw. here's the corresponding test patch i used:
about the vertex approach:
i figured out it is possible to read textures in the vertex program.
so it should theoretically be possible to "construct" a texture, in the form of some array that is then filled with white or black in the fragment processor at a later stage.
i don't know where to start with this though.
what kind of array would i have to set up for using this?
While it's possible to read values from a texture in the vertex program (depending on your hardware), you can only do so per vertex, and you won't be able to write to a texture from the vertex program. You will need to set up a varying variable to pass the values to the fragment.
Here's where you might be getting into trouble though: The fragment program runs once per-pixel of the output buffer, and it is unaware of what the results are for any other pixel. This means that it's impossible to do setcell-like operations within the shader program.
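A bare-bones illustration of that varying hand-off (just a sketch - tex0 and pointcolor are placeholder names, and the vertex-stage fetch depends on hardware support):

// vertex program: one texture fetch per vertex, result handed on as a varying
uniform sampler2DRect tex0;
varying vec4 pointcolor;

void main()
{
    vec2 tc = vec2(gl_TextureMatrix[0] * gl_MultiTexCoord0);
    pointcolor = texture2DRect(tex0, tc);
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

// fragment program: just output the interpolated per-vertex value
varying vec4 pointcolor;

void main()
{
    gl_FragColor = pointcolor;
}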
Based on what I'm hearing, you might be able to do something like this:
- create a mesh with a vertex for each point in your texture, using points as the draw_mode
- in the vertex program, reposition each vertex according to the values in the associated texture point. You will need to rescale the texture values to be signed values, normalized to the y-axis (gl_Position = ...; see the sketch after this list).
- Now just make sure that the fragment program is coloring your points appropriately (not really clear what you want to do with that)
- capture all this to a texture. You might want to adjust the point_size for your mesh as well.
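A rough sketch of that vertex step, continuing with the placeholder names from the sketch above (tex0 for the map, and assuming char data so colours arrive as 0..1 and need scaling back up by 255 - all of these constants are assumptions to adjust):

// vertex program: move each point vertex to the cell named by its map texel
uniform sampler2DRect tex0;   // the "map" texture (needs vertex texture fetch)
uniform vec2 outdim;          // dimensions of the output space, e.g. 8. 8.
varying vec4 pointcolor;

void main()
{
    // one nearest-neighbour fetch per vertex
    vec2 tc = vec2(gl_TextureMatrix[0] * gl_MultiTexCoord0);
    vec4 map = texture2DRect(tex0, tc);

    // colour (0..1) -> cell index -> signed, normalized position (-1..1)
    vec2 cell = floor(map.rg * 255.0 + 0.5);
    vec2 pos = (cell + 0.5) / outdim * 2.0 - 1.0;

    gl_Position = gl_ModelViewProjectionMatrix * vec4(pos, 0.0, 1.0);
    pointcolor = vec4(1.0);   // the fragment program can simply output this (white)
}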
Give this a go and post what you come up with.
hi andrew,
/ Here's where you might be getting into trouble though: The fragment program runs once per-pixel of the output
/ buffer, and it is unaware of what the results are for any other pixel. This means that it's impossible to do setcell-
/ like operations within the shader program.
i know that, but i tried to overcome that problem with the code above:
the fragment processor can read out cell values from input textures - that's how the repos shader works.
so i thought i'd read through ALL cells of the input texture for each fragment that i'm processing.
then compare the input texture cell's colour values with the current fragment.
i know that's probably way too complicated - i just want to get that running.
anyway, maybe you can tell me what's wrong with the above code?
i'd be very happy for any help.
your vertex solution sounds good, thanks, but i read somewhere that it's possible to write textures in the vertex processor as well! (some nvidia webpage, can't find it right now).
i just thought one could "misuse" a geometry vertex as a coordinate system for a texture.
i'm lacking understanding of the whole vertex thing i guess.
@cycling: please do write a nice introduction to vertex programs like this one:
https://cycling74.com/tutorials/your-first-shader/
@andrew: sorry, i just noticed that tutorial was written by YOU! it's great - it helped me understand the basics, but a similar one for basics of vertex programs would be awesome!
:)
You can't *write* to textures in the vertex stage. The only way you can *write* to a texture is via a few OpenGL mechanisms like updating the texture or portions of it via glTexImage or glTexSubImage and friends, doing a glCopyTexture from a read buffer, or rendering into an FBO or PBuffer that has a texture attachment (which is how jit.gl.slab works). A vertex shader literally only spits out transformed vertices (i.e., geometry). That is all it can *ever* do. You can go back and forth between vertices and textures, but this requires setting up additional buffers and using the above-mentioned "render to FBO" and then copying to an additional buffer, which is slower and can be cumbersome.
Now you might be confused by Vertex Texture Fetch, which allows you to *sample* textures with nearest-neighbor filtering in the vertex shader, useful for displacement mapping or for terrains. It won't allow you to render out a *new* texture.
As for reading every texel in the fragment shader, that is *a lot* of operations per pixel, and not a good approach. It's suggested that you try to do convolution-style operations, which means you read in pixel values at, or around, the texel at the current texture coordinate, and then do calculations to write out the fragment color you want.
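For comparison, a convolution-style read stays local to the current texel - here's a minimal 3x3 box average over a rectangular texture, assuming the usual tex0/texcoord0 names:

// fragment program: average the current texel and its eight neighbours
uniform sampler2DRect tex0;
varying vec2 texcoord0;

void main()
{
    vec4 sum = vec4(0.0);
    for (int y = -1; y <= 1; y++) {
        for (int x = -1; x <= 1; x++) {
            sum += texture2DRect(tex0, texcoord0 + vec2(float(x), float(y)));
        }
    }
    gl_FragColor = sum / 9.0;
}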
Is that helpful?
I personally would stay away from creating loops that iterate over every pixel of the texture for each vertex. Sampling a texture in GLSL is one of the more expensive operations, so you'll want to minimize instances where that has to occur. While it might be fine, I would feel superstitious about that part. In the case of your code, what are you using as a fragment program?
Attached is a simple shader program that offsets each vertex based on RGB texture input. You might be able to glean some info from that.