fft & ifft on GPU - glsl shader approach?

vade:

Hello

I have been doing some reading on convolution and blurring
techniques, specifically here: http://www.jhlabs.com/ip/blurring.html

My goal is to recreate, in a Jitter shader, the triangle and lens blur
filters shown on that page.

It requires (at least, I'm told; perhaps there is another method I'm
unaware of) doing an FFT on the incoming image, blurring it just
enough that high-frequency noise is removed, then running an
inverse FFT and outputting the result as the texture.
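For reference, that pipeline (forward FFT, attenuate high frequencies, inverse FFT) can be sketched on the CPU with NumPy. This is only an illustrative sketch, not shader or Jitter code; the function name and `sigma` parameter are my own:

```python
import numpy as np

def fft_blur(img, sigma=3.0):
    """Blur a 2D image by multiplying its spectrum with a Gaussian
    low-pass filter, then inverse-transforming (illustrative sketch)."""
    h, w = img.shape
    # Frequency coordinate of each FFT bin, in cycles per pixel
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    # Gaussian transfer function: 1.0 at DC, falls off at high frequencies
    lowpass = np.exp(-2.0 * (np.pi * sigma) ** 2 * (fx ** 2 + fy ** 2))
    spectrum = np.fft.fft2(img)
    # Imaginary residue is only float round-off, so take the real part
    return np.real(np.fft.ifft2(spectrum * lowpass))
```

Because the filter is 1.0 at the DC bin, the image's overall brightness is preserved while high-frequency detail is suppressed.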

I have researched some GPGPU techniques for running FFTs on the GPU,
but have not seen any usable GLSL code that I can put into a shader.

Has anyone attempted this? Am I insane?

I've found a few implementations of GPGPU FFTs, namely the Brook,
GPUFFTW, and CUDA methods. However, they all seem to require special
compilers and odds and ends. Anyone want to point me in the right
direction? I don't need a finished solution, just something I can get
working in a shader myself.

Perhaps attempting an FFT is wrong/foolhardy and there is a convolution
method for approximating those blurs? I'd be happy to work with
someone as long as the results are open source (I plan on releasing
the shaders I am working on eventually :) )

Let me know!

v a d e //

www.vade.info
abstrakt.vade.info

daniel e mcanulty:

Did this ever go anywhere? I'm looking into the same thing. Specifically, it would be nice to be able to change the kernel size without affecting the processing time, which seems like a job best done with an FFT, but my first experiments so far seem to be having trouble. In the meantime, jit.convolve is nice.
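The idea rests on the convolution theorem: convolution in image space equals pointwise multiplication in frequency space, and the FFT route's cost does not depend on the kernel footprint. A small CPU-side NumPy check of that equivalence, assuming circular (wrap-around) boundaries; all names here are illustrative:

```python
import numpy as np

def conv_direct(img, kern):
    """Circular 2D convolution by brute-force summation (slow: cost
    grows with the kernel footprint)."""
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            s = 0.0
            for ky in range(h):
                for kx in range(w):
                    s += img[ky, kx] * kern[(y - ky) % h, (x - kx) % w]
            out[y, x] = s
    return out

def conv_fft(img, kern):
    """Same circular convolution via the convolution theorem:
    FFT both, multiply spectra, inverse FFT (kernel-size independent)."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(kern)))

img = np.random.rand(8, 8)
kern = np.zeros((8, 8))
kern[:3, :3] = 1.0 / 9.0  # 3x3 box blur, zero-padded to image size
```

Both paths should agree to float precision; only the FFT path keeps the same cost as the kernel grows.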

vade:

No, I never really got too far with it. It seems unnecessary in the end: you can do shaped blurs and various trickery faster than doing FFT, processing, then iFFT. Doing shaped blurs and math in normal image space is doable and ends up being fine.

daniel e mcanulty:

I'm still hacking at it out of sheer bloody-mindedness. I put it down for a while to learn Python and am now trying to do it bit by bit in SciPy so I can figure out what's going wrong with my Jitter implementation. I'll post if I get anything good together.
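One quick cross-check for that kind of debugging, assuming you can dump the Jitter side's output to a matrix for comparison: NumPy's own FFT round trip reproduces the input to float precision, which gives a known-good reference to diff a suspect implementation against.

```python
import numpy as np

# Known-good reference: forward then inverse FFT should return the
# original image up to floating-point round-off.
img = np.random.rand(16, 16).astype(np.float32)
spectrum = np.fft.fft2(img)                   # forward transform
roundtrip = np.real(np.fft.ifft2(spectrum))   # inverse transform
```

If a custom FFT's round trip diverges from this, comparing intermediate spectra stage by stage usually localizes the bug (scaling, bit-reversal ordering, or twiddle-factor sign are the common culprits).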