jit.gl.texture behaving incorrectly
So I basically wanted to texture a block with a .png, using its alpha channel for alpha blending. Curiously, the only way I can reach the expected result is by manually editing the texture’s attribute list, regardless of whether the values in it actually change (i.e., retyping the same values it already holds).
Attached is a montage showing the changes. The yellow one on the left shows the state when the patch is loaded; to the right of that, the state after the first attribute-list "change"; and finally the result after two changes. Further changes make it bounce between states 2 and 3. Even if I’m not going about the desired effect the right way, this behavior is clearly wrong, so I guess this is a bug report. These are the attribute lists for the objects in question:
jit.gl.texture ren @apply replace @flip 0 @colormode argb @name wonker @file wonkerman2.png
jit.gl.gridshape ren @shape cube @scale 0.05 0.8 0.35 @position 0. 0.8 0. @dim 8 8 @lighting_enable 1 @smooth_shading 0 @cull_face 1 @blend_enable 1 @depth_enable 1 @fog 1 @enable 1 @texture wonker @name zombie
I’ve tried all sorts of things, such as deferring the texture assignment, banging the texture, etc., but only the actions described above produce this effect (which incidentally means I can’t find a way to do what I originally wanted).
Tried ‘automatic 0’?
Not before, just tried it and I got the same behavior.
I ran the patch on a Mac in class today (Radeon X1600XT IIRC), and I got the expected/desired behavior. At home I’m running Windows XP and a GeForce 8800GT.
Yeah – the differences in alpha-channel behavior in jit.gl between Windows and Mac need attention from the Cycling ’74 folks.
I have had similar problems on Windows, know others that have had problems, and would definitely call this a bug.
I guess on some weird level I’m glad it might not be an issue of Radeon vs. NVIDIA, since it was suggested to me that an NVIDIA card might behave more predictably than my Radeon on Windows.
And to come completely clean, I know Baiame and tried his patch on my Mac and PC and can substantiate his claim:
on Windows, the third instantiation of a single jit.gl.texture creates a working alpha channel, whereas on Mac it works as soon as the patch is launched.
if someone can provide a simple patch with steps to reproduce and system specs, someone will take a look.
Figured out how to fix it: just load the texture after rendering has begun for the first time. However, I can’t reproduce the exact problem in another patch (it works there the way the original patch did on Mac), despite the three objects (render, texture, and gridshape) having the same parameters except the texture name and filename.
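For anyone hitting the same thing, the workaround (start rendering first, then load the file into the texture) can be sketched as a patch outline. This is a hypothetical reconstruction, not the actual patch from this thread; the 500 ms delay is an arbitrary value chosen to land comfortably after the first rendered frame, and the object/message names follow the attribute lists posted above:

    [qmetro 20] --> [t b erase] --> [jit.gl.render ren]
    [loadbang] --> [delay 500] --> [read wonkerman2.png( --> [jit.gl.texture ren @name wonker @colormode argb @apply replace]
    [jit.gl.gridshape ren @shape cube @texture wonker @blend_enable 1 @depth_enable 1]

The key point is that the texture object is created without @file, and the read message arrives only after jit.gl.render has drawn at least one frame.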
One of these things is not like the other on OS X vs. Windows. We rely on you, or maybe that’s vice versa???
thanks for the example. I’ll take a look.