Capturing 10-bit Video (without truncating to 8-bit)?

    Feb 21 2014 | 4:27 am
    I'm trying to capture a 10-bit HDMI signal from a Blackmagic Pocket Cinema Camera, hoping to preserve the entire 0-1023 dynamic range in each channel.
    Has anybody done this before?
    My capture box (Matrox MXO2 Mini) has a 10-bit option in the input list that matches my resolution and frame rate, but jit.grab sometimes crashes when I choose it (it works fine if I select the 8-bit variant).
    Assuming I can overcome that hurdle (driver update, voodoo), do I need to tell jit.grab to preserve the 10-bit range somehow? (I understand colormodes, but I don't see any way to declare bit depth.)
    If jit.grab adopts the color depth of its input signal, do I just feed the output to a matrix of type long or float32? Are there more considerations if I'm going to process the matrix on the GPU?
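    For what it's worth, here's a rough sketch (in Python/NumPy rather than a Jitter patch) of the sanity check I have in mind, *assuming* the driver delivers 10-bit samples in a 16-bit container, which is a common convention but not something I've confirmed for this hardware:

    ```python
    import numpy as np

    # Simulated frame: 10-bit values (0-1023) stored in uint16, the way many
    # capture drivers pack them. Real data would come from the capture API.
    frame = np.random.randint(0, 1024, size=(1080, 1920), dtype=np.uint16)

    # If any pixel exceeds 255, the signal clearly wasn't truncated to 8-bit.
    print("exceeds 8-bit range:", frame.max() > 255)

    # Normalize 0-1023 to 0.0-1.0 float32 -- conceptually what feeding the
    # grabber output into a float32 matrix would need to do.
    normalized = frame.astype(np.float32) / 1023.0
    ```

    If something like this holds, then a float32 matrix downstream should keep the full range; whether jit.grab actually does that conversion for you is exactly what I'm asking.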