I'm currently experimenting with jit.gl.pix to run a particle system that stays entirely on the GPU. I'm pleased with what I'm seeing so far, but I've hit a dead end and have been banging my head against a brick wall trying to get past it, so I'm hoping somebody with more jit.gl.pix or GLSL experience can help me out...
The principle appears to work fine, and I've been able to make nice-looking particle systems with well over 500,000 points running at 30fps on my iMac, with a single texture holding each particle's x, y, age and opacity. However, I now want to use the particle x and y positions to look up data from another texture, and this is where I'm hitting trouble.

The attached patch illustrates the problem: I take a camera input, pass it into a texture, and then want each particle to take its opacity value from the nearest corresponding pixel in the camera image. I can achieve the effect I want when the particles are arranged in a neat grid (with coordinates generated by a jit.gencoord), but when I use noise to generate the particle positions, something goes wrong in the sampling process. In the attached patch, the image from the camera should remain readable even as the particle positions become randomly spread: the jit.gl.pix is supposed to sample the camera texture at each particle's x and y position. Try crossfading from the noise to the grid and see what happens.
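Roughly, the per-particle lookup I'm trying to do is something like the GLSL sketch below. This is illustrative only, not my actual patch: the posTex/camTex names, the varying, and the assumption that particle positions sit in a signed [-1, 1] range are just for this example.

```glsl
// Sketch of one state-update pass for a GPU particle system:
// read this particle's state, sample the camera image at the
// particle's stored position, and write the opacity back.

uniform sampler2D posTex;   // RGBA state texture: x, y, age, opacity
uniform sampler2D camTex;   // camera image

varying vec2 uv;            // this particle's cell in the state texture

void main() {
    vec4 state = texture2D(posTex, uv);

    // Assumption: positions are stored in signed [-1, 1] space,
    // but texture lookups want normalized [0, 1], so remap first.
    vec2 camUV = state.xy * 0.5 + 0.5;

    // Use the red channel of the camera image as the new opacity.
    float opacity = texture2D(camTex, camUV).r;

    gl_FragColor = vec4(state.xy, state.z, opacity);
}
```

That remap from [-1, 1] into [0, 1] before sampling is exactly the kind of detail I suspect I'm getting wrong somewhere between the grid and noise coordinate sources.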
Any input on this would be very much appreciated! I'm planning to combine this with a Stam fluid implementation on the GPU, and I'll share it here if I get it all working!