-3dBm vs -6dBu
Hoping some of you electronics gurus can help me understand something here...
I've heard for years that a drop of 3dB cuts a signal in half. I know that's referring to dBm (signal power). However, it seems that mixer faders are calibrated in volts (dBu, dBV), so dropping the fader by 6dB (based on the markings alongside the fader) cuts the amplitude of the signal in half.

I realize that due to the differences between power quantities and field quantities, a 3dB drop cuts the power in half, while a 6dB drop cuts the voltage in half. But don't most mixers these days use volts instead of power for their fader markings? Shouldn't we be saying to drop it by 6dB, instead of 3dB, to cut it in half? Reducing the amplitude of a signal to 50% with the [gain~] object produces a drop of 6dB. Am I completely misunderstanding something?
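For what it's worth, the arithmetic is easy to check. Power quantities use 10·log10(ratio) and field quantities (voltage, amplitude) use 20·log10(ratio), so halving each gives roughly -3dB and -6dB respectively. A quick Python sketch (just the standard math module, nothing mixer-specific):

```python
import math

# Power quantity: dB = 10 * log10(ratio)
power_db = 10 * math.log10(0.5)    # halving the power

# Field quantity (voltage/amplitude): dB = 20 * log10(ratio)
voltage_db = 20 * math.log10(0.5)  # halving the voltage

print(f"halving power:   {power_db:.2f} dB")   # -3.01 dB
print(f"halving voltage: {voltage_db:.2f} dB") # -6.02 dB
```

The 20 comes from the fact that power is proportional to voltage squared, and the exponent pulls out of the log as a factor of 2. That's why halving the voltage on a fader reads as -6dB while the power delivered drops by the same physical factor either way.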