Forums > Jitter

why does jit.sobel look so much better than cf.sobel.jxs?

March 31, 2010 | 6:08 pm

I've recently been porting a project from pure Jitter to OpenGL, and discovered that the rendering quality is much poorer when using OpenGL shaders, specifically for edge detection (jit.sobel vs. cf.sobel.jxs).

The OpenGL output looks very noisy and full of artifacts, with lower vertical resolution.

Here's a patch and a screenshot



Attachments:
  1. jitter_vs_opengl.jpg

March 31, 2010 | 8:17 pm

try adjusting the width parameter of the sobel shader in the slab
( "param width $1" in a message box).

Values less than the default of 1 seem to get it closer to the CPU version.
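For reference, here's a sketch (in NumPy, not Jitter code) of the kind of 3x3 Sobel convolution the CPU object applies per pixel. The GPU shader instead samples neighboring texels at offsets scaled by its width parameter, so width values other than an exact one-texel spacing will sample between pixels and blur or alias the result, which is plausibly why the two never match exactly. The function name and test image below are illustrative, not part of either implementation.

```python
# Illustrative CPU-style Sobel edge detection, similar in spirit to what
# jit.sobel computes per pixel. Not actual Jitter or shader code.
import numpy as np

SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T  # vertical-gradient kernel is the transpose

def sobel_magnitude(img):
    """Edge magnitude of a 2D grayscale array via a 3x3 Sobel convolution."""
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2]
            gx = np.sum(SOBEL_X * patch)  # horizontal gradient
            gy = np.sum(SOBEL_Y * patch)  # vertical gradient
            out[y, x] = np.hypot(gx, gy)  # gradient magnitude
    return out

# A vertical step edge should give a strong response along the boundary
# and zero response in the flat regions.
img = np.zeros((5, 5))
img[:, 3:] = 1.0
edges = sobel_magnitude(img)
print(edges[2, 2], edges[2, 0])
```

The GPU version effectively multiplies the one-texel sampling offsets by width; shrinking width below 1 pulls the samples closer together, which is why it moves the shader's output toward the CPU result.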


March 31, 2010 | 8:59 pm

That seems to help, though the results are never quite identical.

Is there any way of defining the default shader parameters in the jit.gl.slab object (e.g. @param width 0.4)? It looks like param is a message, not an attribute, so it doesn't seem to work.


April 1, 2010 | 5:00 pm

try typing exactly what you wrote into jit.gl.slab.

;)

