why does jit.sobel look so much better than cf.sobel.jxs?

Mar 31, 2010 at 6:08pm


I've recently been porting a project from all-Jitter to OpenGL, and discovered that the rendering quality is much poorer when using OpenGL shaders, specifically for edge detection (jit.sobel vs cf.sobel.jxs).

The OpenGL looks very noisy and artifact-y, and with lower vertical resolution.

Here's a patch and a screenshot



Attachment: jitter_vs_opengl.jpg
Mar 31, 2010 at 8:17pm

Try adjusting the width parameter of the sobel shader in the slab
("param width $1" in a message box).

Values less than the default of 1 seemed to get it closer to the CPU version.
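As a rough intuition for why width matters (a sketch, not the actual shader code — the exact sampling that cf.sobel.jxs does is an assumption here): a Sobel slab presumably samples the eight neighbours at texture offsets scaled by width, with bilinear interpolation for fractional offsets, so width < 1 pulls the taps in toward the centre pixel and gets closer to the fixed one-pixel 3x3 kernel that jit.sobel uses on the CPU. In numpy terms:

```python
import numpy as np

# The standard 3x3 Sobel kernels (what a CPU sobel uses with one-pixel spacing)
GX = np.array([[-1, 0, 1],
               [-2, 0, 2],
               [-1, 0, 1]], dtype=float)
GY = GX.T

def sobel_scaled(img, width=1.0):
    """Sobel magnitude with neighbour taps at offsets scaled by `width`,
    using bilinear interpolation for fractional offsets — an assumption
    about what the slab's width param effectively does."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            # tap position, scaled by width and clamped to the image
            sy = np.clip(ys + dy * width, 0, h - 1)
            sx = np.clip(xs + dx * width, 0, w - 1)
            # bilinear sample at (sy, sx)
            y0 = np.floor(sy).astype(int)
            x0 = np.floor(sx).astype(int)
            y1 = np.clip(y0 + 1, 0, h - 1)
            x1 = np.clip(x0 + 1, 0, w - 1)
            fy, fx = sy - y0, sx - x0
            s = (img[y0, x0] * (1 - fy) * (1 - fx)
                 + img[y0, x1] * (1 - fy) * fx
                 + img[y1, x0] * fy * (1 - fx)
                 + img[y1, x1] * fy * fx)
            gx += GX[dy + 1, dx + 1] * s
            gy += GY[dy + 1, dx + 1] * s
    return np.hypot(gx, gy)
```

With width exactly 1 the taps land on whole pixels and this reduces to the classic CPU Sobel; a fractional width mixes in interpolated values, which changes the edge response — so there's no single width value that makes the two outputs bit-identical.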

Mar 31, 2010 at 8:59pm

That seems to help, though the results are never quite identical.

Is there any way of defining the default shader parameters in the jit.gl.slab object (e.g. @param width 0.4)? It looks like param is a message, not an attribute, so it doesn't seem to work.

Apr 1, 2010 at 5:00pm

Try typing exactly what you wrote into the jit.gl.slab object box.


