Linearized Depth Buffer Values for Depth of Field
I’ve built a patch for a depth-of-field post-processing effect and got it working, but I still have a problem: the depth buffer is non-linear. There is more precision close to the camera and less precision far from it. For some purposes, such as depth-of-field simulation, I think that is a problem. You can’t use very small near_clip values such as 0.1 without the depth image becoming almost all white (values at or near 1.0), among other things… With near = 0.1 and far = 100, for example, a fragment just one unit from the camera already stores a depth of about 0.9.
So I was looking for a way to linearize the depth buffer values and came across the following article on the Geeks3D website:
There’s a good explanation of the problem in that post, as well as a solution to my problem in the form of GLSL code for a post-processing shader. Unfortunately, I’m having trouble adapting the code to a Jitter .jxs file. Can anyone take a look at it? I guess it would be useful for a lot of the community…
Thanks in advance.
Here’s the GLSL shader code from the link in my last post. Is it doable as a GLSL Jitter shader?
Any GLSL guru out there? Thanks
// Vertex program
void main(void)
{
    gl_Position = ftransform();
    gl_TexCoord[0] = gl_MultiTexCoord0;
}

// Fragment program
uniform sampler2D sceneSampler; // texture unit 0
uniform sampler2D depthSampler; // texture unit 1

float LinearizeDepth(vec2 uv)
{
    float n = 1.0;   // camera z near
    float f = 100.0; // camera z far
    float z = texture2D(depthSampler, uv).x;
    return (2.0 * n) / (f + n - z * (f - n));
}

void main(void)
{
    vec2 uv = gl_TexCoord[0].xy;
    //vec4 sceneTexel = texture2D(sceneSampler, uv);
    float d;
    if (uv.x < 0.5) // left part: linearized depth
        d = LinearizeDepth(uv);
    else // right part: raw depth
        d = texture2D(depthSampler, uv).x;
    gl_FragColor = vec4(d, d, d, 1.0);
}
I cannot test the shader right now, but looking at the source I would suggest changing texture2D to texture2DRect and sampler2D to sampler2DRect, since Jitter uses rectangle textures by default.
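Applied to the fragment program, that suggestion might look like the untested sketch below. With rectangle textures the coordinates are in pixels rather than 0..1, so the left/right comparison also needs the texture width; the texdim1 uniform (the depth texture's dimensions) is my assumption and would need to be declared and bound in the .jxs file:

```glsl
// Fragment program, rectangle-texture version (untested sketch)
uniform sampler2DRect depthSampler; // texture unit 1
uniform vec2 texdim1;               // depth texture size in pixels (assumed, bound in the .jxs)

float LinearizeDepth(vec2 uv)
{
    float n = 1.0;   // camera z near
    float f = 100.0; // camera z far
    float z = texture2DRect(depthSampler, uv).x; // uv in pixel units, not 0..1
    return (2.0 * n) / (f + n - z * (f - n));
}

void main(void)
{
    vec2 uv = gl_TexCoord[0].xy; // pixel coordinates for rectangle textures
    float d;
    if (uv.x < 0.5 * texdim1.x) // left half: linearized depth
        d = LinearizeDepth(uv);
    else                        // right half: raw depth
        d = texture2DRect(depthSampler, uv).x;
    gl_FragColor = vec4(d, d, d, 1.0);
}
```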
let me know how it goes!
Thanks for the reply, Emmanuel. Unfortunately, as I don’t have any GLSL skills, I wasn’t able to make it work. I hope someone is interested in simulating depth of field while solving the inherent problems of the depth buffer (near and far clip values that are very difficult to work with…).
Here’s an example patch where I implement depth of field using Andrew Benson’s luminance-based blur shader (http://cycling74.com/forums/topic.php?id=18001).
Your patch is making my Max crash.
OK, I’ve spent some more time and was finally able to implement what I was looking for. On Windows 7 the patch didn’t crash, but when I tried it on my MacBook Pro it did (although a previous version didn’t…).
I guess the crash was related to the use of OpenGL’s built-in depth buffer. Instead of using it, I built a shader to do the depth capture. Now it’s more stable, faster, and I no longer have the initial problem (non-linear depth buffer values).
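For anyone curious how a depth-capture pass like this can work: instead of reading the built-in depth buffer, you render the scene with a shader that writes eye-space depth out as color, which is linear by construction. This is my own reconstruction of the general technique, not the poster's actual shader; the near/far uniform names are assumptions:

```glsl
// Vertex program: pass eye-space depth to the fragment stage (sketch)
varying float eyeDepth;

void main(void)
{
    vec4 eyePos = gl_ModelViewMatrix * gl_Vertex;
    eyeDepth = -eyePos.z; // eye space looks down -z, so negate to get positive depth
    gl_Position = ftransform();
}

// Fragment program: write normalized linear depth as grayscale (sketch)
varying float eyeDepth;
uniform float near; // assumed uniform names, set from the patch
uniform float far;

void main(void)
{
    float d = clamp((eyeDepth - near) / (far - near), 0.0, 1.0);
    gl_FragColor = vec4(d, d, d, 1.0);
}
```

Because the value written is eye-space distance scaled between near and far, precision is spread evenly over the whole range, which sidesteps the non-linearity problem entirely.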
Now everything works as I intended, although I’m not a GLSL expert. So, if anyone is interested, take a look at it and maybe suggest some optimizations. I hope someone finds it useful.
Thank you for sharing this, really interesting and so few resources about DOF here …
Glad I could help.
I would love it if one day Jitter offered things like depth of field, global illumination, shadows, or motion blur in a more straightforward way (a checkbox in the jit.gl.render properties?), as most game engines do nowadays. Of course, without taking away users’ ability to build their own implementations if they so desire.