## Cel-shading code, can it work in Jitter?

Apr 10 2007 | 3:45 pm
I've been reading about cel-shading, and came across this, complete with source code: http://www.humus.ca/index.php?page=3D&ID=58
Would it be possible to implement code like that in Jitter by programming a shader somehow? I'm just now learning about rendering, but even if it takes me a year to learn I'd like to eventually get that effect.
I searched the archives, but only found one post about cel-shading saying that as of 2004 Jitter couldn't handle vertex programs.

• Apr 10 2007 | 4:05 pm
Yes, but they're cheating a little bit IMHO. They calculate the outline on the CPU, while this is typically done on the GPU. That said, their shaders would be straightforward to put in a JXS file.
wes
On 4/10/07, Aaron Faulstich wrote:
> I've been reading about cel-shading, and came across this, complete with source code: http://www.humus.ca/index.php?page=3D&ID=58
>
> Would it be possible to implement code like that in Jitter by programming a shader somehow? I'm just now learning about rendering, but even if it takes me a year to learn I'd like to eventually get that effect.
>
> I searched the archives, but only found one post about cel-shading saying that as of 2004 Jitter couldn't handle vertex programs.
>
> Any help/advice is appreciated!
• Apr 10 2007 | 11:25 pm
How can the CPU code for outlining be changed to run on the GPU?
• Apr 11 2007 | 3:25 am
I should qualify my original statement a bit. If your geometry is static, as theirs is, it's probably best to precalculate things as is done in their C code. If your model is dynamic, that precalculation becomes costly. To do it on the GPU, you need some metric that defines an edge. The two most widely used are depth discontinuities and normal discontinuities. If the dot product of a normal with its neighbor's is close to or below zero, the two surfaces face different directions, so you can treat that location as an edge and color it black to draw a line. Similarly, if the depth at a location differs from its neighbor's by more than some Z threshold, it can also be considered an edge.
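To make the two metrics concrete, here is a minimal sketch in plain Python of the per-pixel test a fragment shader would run. The scene data, threshold values, and 1D "scanline" layout are all made up for illustration; in a real shader you would sample neighboring texels instead of indexing a list.

```python
import math

def dot(a, b):
    """Dot product of two 3-component vectors."""
    return sum(x * y for x, y in zip(a, b))

# Toy scanline of per-pixel data, standing in for the normal and depth
# textures a shader would sample. The first two pixels face +Z, the last
# two face +X, and the final pixel is much farther away.
normals = [(0, 0, 1), (0, 0, 1), (1, 0, 0), (1, 0, 0)]
depths  = [1.0, 1.0, 1.0, 5.0]

NORMAL_THRESH = 0.2  # dot product below this => normal discontinuity
DEPTH_THRESH  = 0.5  # |dz| above this => depth discontinuity

edges = []
for i in range(len(normals) - 1):
    # Edge if the neighboring normals diverge...
    normal_edge = dot(normals[i], normals[i + 1]) < NORMAL_THRESH
    # ...or if the neighboring depths jump.
    depth_edge = abs(depths[i] - depths[i + 1]) > DEPTH_THRESH
    edges.append(normal_edge or depth_edge)

print(edges)  # [False, True, True]
```

The middle comparison fires on the normal test (the dot product of +Z and +X is zero) and the last one fires on the depth test, so both kinds of silhouette get a black line.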
Unfortunately, neighboring depth and normal information is not readily available in either a vertex or fragment shader, so you have to supply it via textures. For the normals, compress the range of values into [0, 1] with the formula texval = (normval+1)/2 and unpack in the shader. For depth values, you'll have to snag the depth texture.
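The pack/unpack step is just a linear remap, sketched here in Python (the function names are mine; a GLSL shader would inline the same arithmetic):

```python
def pack(normal):
    # Compress each [-1, 1] component into [0, 1] for texture storage:
    # texval = (normval + 1) / 2
    return tuple((c + 1.0) / 2.0 for c in normal)

def unpack(texel):
    # Invert the remap in the shader: normval = texval * 2 - 1
    return tuple(c * 2.0 - 1.0 for c in texel)

n = (0.0, -1.0, 0.5)
print(pack(n))          # (0.5, 0.0, 0.75) -- all components in [0, 1]
print(unpack(pack(n)))  # (0.0, -1.0, 0.5) -- round-trips exactly
```

The remap is exact in floating point for these values, though in an 8-bit-per-channel texture you would of course lose some precision to quantization.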
HTH, wes
• Apr 11 2007 | 11:52 pm
Thanks for your posts. I understand what you're saying conceptually, but sadly I don't know how to go about implementing it (I don't even understand the shader code in the make-your-own-slab tutorial yet).
Hopefully when I have some more free time I can get a better grasp on all this.