How do I downsample a live video stream from camera with high quality?
I've been looking through all the old threads on downsampling/upsampling, but still haven't found any conclusive advice.
What I want to do is get a high-quality stream at 40x60, but without all the noise that "just" connecting a high-resolution jit.matrix to a low-resolution one produces. (As an aside, what downsampling algorithm is used when that happens? Averaging? Nearest neighbour? ...)
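To illustrate why the choice of algorithm matters for noise (this is a generic sketch, not a statement about what Jitter actually does internally when you patch matrix to matrix), here is a small numpy comparison of the two candidates mentioned above. Nearest-neighbour decimation keeps one pixel per block, so every bit of sensor noise survives; block averaging acts as a box low-pass filter and cuts the noise variance by roughly the block area:

```python
import numpy as np

def downsample_nearest(img, k):
    # Keep every k-th pixel: fast, but aliases fine detail
    # and passes the full per-pixel sensor noise through.
    return img[::k, ::k]

def downsample_average(img, k):
    # Average each k x k block: a box low-pass filter that
    # reduces noise variance by ~k^2 (std dev by ~k).
    h, w = img.shape
    h, w = h - h % k, w - w % k          # crop so dimensions divide evenly
    blocks = img[:h, :w].reshape(h // k, k, w // k, k)
    return blocks.mean(axis=(1, 3))

# Simulate a noisy greyscale frame; 240x360 divided by 6 gives 40x60.
rng = np.random.default_rng(0)
frame = rng.normal(0.0, 1.0, (240, 360))

nn  = downsample_nearest(frame, 6)   # shape (40, 60), noise untouched
avg = downsample_average(frame, 6)   # shape (40, 60), noise heavily reduced
```

Comparing `nn.std()` against `avg.std()` shows the averaged version is far less noisy, which is exactly the difference you see between a naive decimating connection and a proper filtered downsample.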
The best approach seems to be jit.gl.slab, as discussed in the following thread:
The patch in there is the one below, which I've altered only slightly to make it work.
However, on my PC, which has a modern OpenGL 2.0 card (NVIDIA Quadro FX 770M), I get a heap of errors from jit.gl.slab, no matter what settings I try in the NVIDIA control panel:
jit.gl.readback: unable to create framebuffer: pbuffers not supported!!
jit.gl.readback: error initializing framebuffer: framebuffer objects not supported!
jit.gl.readback: unable to create framebuffer: pbuffers not supported!!
jit.gl.readback: unable to create framebuffer: pbuffers not supported!!
jit.gl.texture: error creating readback mechanism for capture!
jit.gl: invalid extension called
jit.gl: invalid extension called
jit.gl: invalid extension called
jit.gl: invalid extension called
Could anyone help me with alternative suggestions for how to get a high-quality downsampled video stream?
And in the meantime, I'll post back whatever I come up with of course :)
Thanks!
Ilias B.
you may want to try out jit.gl.asyncread to render to a matrix, instead of using the software renderer.
check the help file for more info.