Forums > Jitter

readback from gpu just for a 0 or 1?

February 20, 2013 | 11:14 pm

i am wondering…:
i have a vast chain of slabs doing all sorts of complicated things very nicely… but now i need to know just wether the luminance of one of these textures in between is above or below a certain threshhold. i searched for quite a bit but it seems i really have to readback to cpu for this… do i?

i attached what i currently do below… as soon as i connect it, i get a 20 fps frame drop – even with this tiny texture.
if i really have to read back, can this be done more efficiently?

thanks!
k

– Pasted Max Patch –

February 21, 2013 | 9:08 pm

don’t bother – i solved this for the time being by grabbing the signal before uploading it to the gpu and then recreating the whole slab chain in cpu-land. not pretty, but since i can work with a tiny, tiny matrix it does not affect performance…
k
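[editor's note: the pasted patch is not shown, so here is a minimal numpy sketch of the workaround described above – run the chain on a tiny matrix on the cpu and threshold its mean luminance. the function name, matrix size, and threshold value are all made up for illustration.]

```python
import numpy as np

def luminance_above_threshold(frame_rgb, threshold=0.5):
    """Return True if the mean luminance of a (tiny) RGB frame
    exceeds `threshold`. Pixel values are assumed to be floats in 0..1."""
    # Rec. 601 luma weights, a common choice for luminance
    luma = frame_rgb @ np.array([0.299, 0.587, 0.114])
    return bool(luma.mean() > threshold)

# a tiny 8x8 stand-in for the downsampled matrix
dark = np.zeros((8, 8, 3))
bright = np.ones((8, 8, 3))
print(luminance_above_threshold(dark))    # False
print(luminance_above_threshold(bright))  # True
```

because the matrix is tiny, this per-frame mean-and-compare costs almost nothing on the cpu, which matches the "does not affect performance" observation.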


March 14, 2013 | 12:08 am

one last remark in case someone else stumbles across this problem: i did not try this in any way, but i am just browsing through the filters of jit.gl.imageunit ( https://developer.apple.com/library/mac/#documentation/GraphicsImaging/Reference/CoreImageFilterReference/Reference/reference.html ) and there are a bunch of reduction filters which return accumulated values as a single pixel, which could probably be helpful for this sort of thing… to get a value (not a pixel) you would still need to read back, though.
k
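[editor's note: the reduction filters mentioned above (e.g. CIAreaAverage in the linked Core Image reference) collapse a whole region to a single pixel on the gpu, so only that one pixel ever needs reading back. the scheme can be illustrated on the cpu as repeated 2×2 mean-pooling – this is a sketch of the idea, not Jitter or Core Image code.]

```python
import numpy as np

def area_average(img):
    """Reduce a square, power-of-two grayscale image to a single
    value by repeated 2x2 mean-pooling - the kind of log-step
    reduction a GPU area-average filter performs, written out on
    the CPU for clarity."""
    while img.shape[0] > 1:
        h, w = img.shape
        # average each non-overlapping 2x2 block into one pixel
        img = img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return float(img[0, 0])

# a 4x4 test image, half black and half white: average is 0.5
img = np.zeros((4, 4))
img[:, 2:] = 1.0
print(area_average(img))         # 0.5
print(area_average(img) > 0.25)  # True
```

after such a reduction, the readback is a single pixel instead of a full texture, which is exactly the 0-or-1 answer the original question was after.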
