Forums > Jitter

why does jit.sobel look so much better than cf.sobel.jxs?

Mar 31 2010 | 6:08 pm

I've recently been porting a project from all-Jitter (CPU) processing to OpenGL, and discovered that the rendering quality is much poorer when using OpenGL shaders, specifically for edge detection (jit.sobel vs. cf.sobel.jxs).

The OpenGL version looks very noisy and full of artifacts, and has lower vertical resolution.

Here's a patch and a screenshot

[Max patch attached]


Attachment: jitter_vs_opengl.jpg


Mar 31 2010 | 8:17 pm

Try adjusting the width parameter of the sobel shader in the slab
("param width $1" in a message box).

Values less than the default of 1 seem to get it closer to the CPU version.
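A minimal wiring sketch (the render context name "ctx" is just a placeholder, not from the original patch):

    [param width $1]   <- message box, fed by a float number box
          |
    [jit.gl.slab ctx @file cf.sobel.jxs]

Since the message box resends "param width" on every change, you can scrub the number box until the GPU output matches the jit.sobel reference.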

Mar 31 2010 | 8:59 pm

That seems to help, though the results are never quite identical.

Is there any way of defining the default shader parameters in the object (e.g. @param width 0.4)? It looks like param is a message, not an attribute, so it doesn't seem to work.

Apr 01 2010 | 5:00 pm

try typing exactly what you wrote into the object box.
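That would make the full object box read something like this (again, "ctx" is a placeholder context name):

    [jit.gl.slab ctx @file cf.sobel.jxs @param width 0.4]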


