Measuring noise/variation in an image
I'm wondering if there's some way to measure the amount of noise/variation in an image (or video frame). I'm not sure exactly how to describe what I want to measure, which is why I'm having trouble measuring it! Put simply, I want the image equivalent of measuring the "noisiness" of an audio sample -- that is, measuring its spectral flatness. My idea is that a "noisy" image would have lots of variation between neighboring pixels, while "flat" images would have little variation between neighboring pixels.
Is there something like this, or am I desiring something that just doesn't make sense?
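For what it's worth, the spectral-flatness idea carries over directly to 2-D: take the image's power spectrum and divide its geometric mean by its arithmetic mean. Here's a minimal sketch outside Max using NumPy (the function name and the small floor constant are my own choices, not anything from Jitter):

```python
import numpy as np

def spectral_flatness(img):
    """Spectral flatness of a 2-D grayscale image, in (0, 1].

    Ratio of the geometric mean to the arithmetic mean of the power
    spectrum: closer to 1 for noise-like images, near 0 for images
    whose energy is concentrated in a few frequencies.
    """
    power = np.abs(np.fft.fft2(img)) ** 2
    power = power.ravel() + 1e-12          # small floor avoids log(0)
    geometric = np.exp(np.mean(np.log(power)))
    arithmetic = np.mean(power)
    return geometric / arithmetic

rng = np.random.default_rng(0)
noisy = rng.random((64, 64))                    # white noise
ramp = np.tile(np.linspace(0, 1, 64), (64, 1))  # smooth gradient
print(spectral_flatness(noisy))  # noticeably higher...
print(spectral_flatness(ramp))   # ...than for the smooth image
```

In a Max patch the equivalent would be something built around jit.fft, but the math is the same.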
Not sure, but maybe jit.histogram or jit.fft might get you started?
Thanks for the tip. It looks like combining jit.histogram with cv.jit.stddev gets close to estimating what I want, even if it's not exactly right.
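The neighbor-difference intuition from the first post can also be computed directly: take the standard deviation of the differences between adjacent pixels. A NumPy sketch of that idea (this is my own illustration, not what cv.jit.stddev does internally):

```python
import numpy as np

def neighbor_variation(img):
    """Standard deviation of differences between adjacent pixels.

    High when neighboring pixels vary a lot (a noisy image), near
    zero when the image is flat or changes slowly.
    """
    dx = np.diff(img, axis=1)   # horizontal neighbor differences
    dy = np.diff(img, axis=0)   # vertical neighbor differences
    return np.std(np.concatenate([dx.ravel(), dy.ravel()]))

rng = np.random.default_rng(0)
noisy = rng.random((64, 64))    # white noise: large variation
flat = np.full((64, 64), 0.5)   # constant image: zero variation
print(neighbor_variation(noisy))
print(neighbor_variation(flat))  # prints 0.0
```

This measures local pixel-to-pixel variation, which complements the histogram/stddev approach: a global stddev can be high for a smooth gradient too, while the neighbor-difference measure stays low unless the variation is actually local.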