cv.jit.label mismatch type
I am trying to use Kinect depth data as the input to cv.jit.label, and I get a "mismatch type" error. The helpfile for this object says the input can be any type of matrix. I have been able to get around the error by coercing the data to type char with jit.coerce and then running it through jit.rgb2luma, but that introduces severe artefacts. Is there any way around this?
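For anyone wondering where those artefacts come from: as I understand it, jit.coerce reinterprets the raw bytes of the matrix as a new type rather than converting the values, so each 4-byte float32 depth cell becomes four unrelated char cells. A rough sketch of the difference in plain Python (the depth values here are made up for illustration):

```python
import struct

depth = [0.0, 1.0, 2.5, 4.0]  # toy float32 depth readings in metres

# jit.coerce-style: reinterpret the raw float32 bytes as chars (uint8).
# Four floats become sixteen garbage-looking byte values.
raw = struct.pack('<4f', *depth)
coerced = list(raw)

# Value conversion instead: rescale into the 0..255 char range, then cast.
# This preserves the shape of the data.
converted = [int(d / 4.0 * 255) for d in depth]
```

A value conversion like the second one (which is what copying into a char jit.matrix does) keeps one output cell per input cell, which is presumably why it avoids the artefacts.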
Ah: found a fix.
It turns out that an odd workaround is effective: I put an extra jit.matrix between the input data and the cv.jit.label object. It works if the intermediate matrix is type char; it does not if it is type float32.
I don't know why.
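For reference, here is a minimal sketch of what cv.jit.label does once it has a usable char matrix: 4-connected component labelling, where each contiguous blob of nonzero cells gets its own integer label. This is a toy stand-in in plain Python, not cv.jit.label's actual implementation:

```python
from collections import deque

def label(grid):
    """Label 4-connected blobs of nonzero cells in a 2-D list of char
    (0..255) values. Returns the label matrix and the blob count."""
    h, w = len(grid), len(grid[0])
    labels = [[0] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if grid[y][x] and not labels[y][x]:
                count += 1                      # start a new blob
                labels[y][x] = count
                q = deque([(y, x)])
                while q:                        # flood-fill the blob
                    cy, cx = q.popleft()
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and grid[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = count
                            q.append((ny, nx))
    return labels, count

blobs = [[0, 255, 0],
         [0, 255, 0],
         [0, 0, 128]]
labels, count = label(blobs)  # two separate blobs
```

This also hints at why the input type matters: labelling wants discrete cell values, so a char matrix is the natural input format.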
Here is the working patch:
Hey mate, this isn't working for me. I tried your fix, but I'm still getting the "mismatch type" error when I feed the freenect.grab stream into it.