In the radio-frequency (RF)/microwave test-and-measurement world, engineers often deal with the power-measurement unit of dBm instead of wattage (W). When entering the audio measurement arena, however, they also need to understand the unit dBu, which is decibels (dB) relative to 1 mW into 600 Ω.

The decibel, in and of itself, is an often misunderstood unit of measurement. The “bel” in “decibel” derives from the name of Alexander Graham Bell, who was interested in how the human ear responds to sound intensity. Bell used a logarithmic scale to express this sound intensity: its range from the softest sound to the loudest (threshold of pain) sound is one to a trillion (10^12), or zero to 12 bels. The decibel is one-tenth of a bel and is abbreviated as dB.

Using dB can be beneficial on two key fronts. First, it concisely expresses very large or very small ratios; for example, +63 dB to –153 dB is more compact than 2 × 10^6 to 0.5 × 10^-15. Second, dB simplifies the math when working through the gains and losses of several cascaded devices: addition replaces multiplication of numeric gains, and subtraction replaces division by numeric attenuations.
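As a quick sketch of that bookkeeping, the hypothetical cascade below (gain values chosen purely for illustration) shows that summing stage gains in dB matches multiplying the linear power ratios:

```python
import math

# Hypothetical cascade: +20-dB amplifier, -6-dB attenuator, +10-dB amplifier
gains_db = [20, -6, 10]

# Addition replaces multiplication
total_db = sum(gains_db)

# Cross-check against the product of the linear power ratios
linear = math.prod(10 ** (g / 10) for g in gains_db)
print(total_db, round(10 * math.log10(linear), 2))  # 24 24.0
```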

Defining The Decibel

A logarithmic unit, dB expresses the ratio of two quantities. In power measurement, the relative power is defined as:

dB = 10 log10 (P1/P2)

In voltage measurement, the relative voltage is defined as:

dB = 20 log10 (V1/V2)
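The two definitions translate directly into code; a minimal sketch (function names are my own):

```python
import math

def power_db(p1, p2):
    """Power ratio in dB: 10 * log10(P1/P2)."""
    return 10 * math.log10(p1 / p2)

def voltage_db(v1, v2):
    """Voltage ratio in dB: 20 * log10(V1/V2)."""
    return 20 * math.log10(v1 / v2)

# Doubling power is about +3 dB; doubling voltage is about +6 dB
print(round(power_db(2, 1), 2))    # 3.01
print(round(voltage_db(2, 1), 2))  # 6.02
```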

To describe dB as an absolute value, a reference point must be known. A number of different reference points are possible:

  • dBm represents the power level P1 with reference to 1 mW
  • dBW represents the power level P1 with reference to 1 W
  • dBV represents the voltage level V1 with reference to 1 V rms
  • dBmV represents the voltage level V1 with reference to 1 mV rms
  • dBµV represents the voltage level V1 with reference to 1 µV rms

The most commonly used unit in power measurement is dBm. In a known industry-standard environment, the test-system impedance usually equals 50 Ω in RF engineering, 75 Ω in television engineering, and 600 Ω in audio engineering. For a known impedance, a simple conversion formula lets engineers convert a power measurement in dBm to dBV, dBmV, or dBµV.

For a 50-Ω system:

  • dBV = dBm – 13 dB
  • dBmV = dBm + 47 dB
  • dBuV = dBm + 107 dB

For a 75-Ω system:

  • dBV = dBm – 11.25 dB
  • dBmV = dBm + 48.75 dB
  • dBuV = dBm + 108.75 dB

For a 600-Ω system:

  • dBV = dBm – 2.22 dB
  • dBmV = dBm + 57.78 dB
  • dBuV = dBm + 117.78 dB
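All three tables above follow from one relationship: 0 dBm into an impedance Z corresponds to √(1 mW × Z) volts rms. A short sketch that reproduces the offsets (rounded to two places; the 50-Ω table quotes –13 dB for what computes as –13.01 dB):

```python
import math

def dbm_to_dbv_offset(z_ohms):
    """Offset to add to a dBm reading to get dBV, for impedance Z."""
    v_rms = math.sqrt(0.001 * z_ohms)  # rms voltage of 1 mW into Z
    return 20 * math.log10(v_rms)

for z in (50, 75, 600):
    off = dbm_to_dbv_offset(z)
    # dBmV adds 60 dB to dBV; dBuV adds 120 dB
    print(f"{z} ohms: dBV {off:+.2f}, dBmV {off + 60:+.2f}, dBuV {off + 120:+.2f}")
```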

What Is dBu?

For most traditional test equipment, the source impedance is 50 Ω. However, audio test applications typically employ a 600-Ω source impedance, and audio test uses another decibel unit for voltage measurement: dBu. It’s defined as dB relative to 1 mW into 600 Ω. This logarithmic unit expresses a relative voltage measurement with reference to 0.7746 V rms (the voltage across 600 Ω that results in 1 mW of power).

The dBm unit is defined as:

dBm = 10 log10 (P1/1 mW)

If a 600-Ω load dissipates 0 dBm (1 mW), the voltage across it is:

V = √(1 mW × 600 Ω) = 0.7746 V rms

Therefore:

dBu = 20 log10 (V1/0.7746 V)

The “u” in dBu represents the word “unloaded.” It implies that the load is unterminated, or that the load impedance is unspecified and will likely be high. Thus, the 0.7746-V rms reference is an open-circuit source voltage.
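Under that definition, converting between volts rms and dBu is a one-liner in each direction; a minimal sketch (names are my own):

```python
import math

V_REF_DBU = math.sqrt(0.001 * 600)  # 0.7746 V rms: 1 mW into 600 ohms

def volts_to_dbu(v_rms):
    """Express an rms voltage in dBu."""
    return 20 * math.log10(v_rms / V_REF_DBU)

def dbu_to_volts(dbu):
    """Recover the rms voltage from a dBu level."""
    return V_REF_DBU * 10 ** (dbu / 20)

print(round(volts_to_dbu(V_REF_DBU), 2))  # 0.0 (0 dBu by definition)
print(round(dbu_to_volts(0), 4))          # 0.7746
```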

Maximum Output Power

As mentioned earlier, 50 Ω is the most commonly used source impedance. For a given source voltage, a 50-Ω source impedance can deliver a higher short-circuit current, as well as roughly 10 times the frequency response over a given length of cable, compared with a 600-Ω source impedance.

For example, Figures 1a-1d illustrate the maximum power transfer delivered by Agilent’s U8903A audio analyzer into various load-impedance scenarios using a source impedance of 50 Ω or 600 Ω. The U8903A has an 8-V maximum voltage source for unbalanced output (VS).

1. Maximum power transfer for four source/load combinations: (a) In scenario 1, both source and load impedance are 50 Ω. (b) Scenario 2 shows a 50-Ω source and a 600-Ω load impedance. (c) In scenario 3, both source and load impedance are 600 Ω. (d) In scenario 4, the source is 600 Ω and the load impedance is 50 Ω.
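Since the figures themselves aren’t reproduced here, the four scenarios can be checked numerically from the voltage-divider relation, PL = (VS × RL/(RS + RL))²/RL, using the 8-V source mentioned in the text:

```python
def load_power_mw(vs, rs, rl):
    """Power (mW) delivered to load RL by source VS with output impedance RS."""
    vl = vs * rl / (rs + rl)        # voltage divider
    return 1000 * vl * vl / rl

VS = 8.0  # U8903A maximum unbalanced output voltage, per the text
scenarios = [(50, 50), (50, 600), (600, 600), (600, 50)]
for n, (rs, rl) in enumerate(scenarios, 1):
    print(f"Scenario {n}: Rs={rs} ohms, Rl={rl} ohms -> {load_power_mw(VS, rs, rl):.1f} mW")
```

Note that the matched 50-Ω case (scenario 1, 320 mW) delivers the most power of the four.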

Ins And Outs Of Output Voltage

Thanks to recent advances in DSP-based RF test equipment, some RF engineers are able to measure audio on RF instruments and then correlate the test results with other audio instruments. Sometimes, though, engineers encounter problems with their RF signal analyzer when measuring two supply sources that are identical in stimulus setup, such as in output frequency (FL) and output voltage (VL). The RF signal analyzer returns divergent measurement results showing the two inputs as unequal in amplitude or bandwidth.

If an engineer sets the voltage output of an audio generator that comes with a fixed source impedance of 50 Ω (VS = 2 V, or 8.24 dBu), half of that voltage will drop across a 50-Ω load impedance (VL = 1 V) (Fig. 2). Thus:

VL = VS × RL/(RS + RL) = 2 V × 50/(50 + 50) = 1 V


2. The signal generator combines an audio signal and RF carrier with 50-Ω output source impedance.

The engineer then may set up another audio-generator output with a source impedance of 600 Ω. To achieve output performance similar to the previous 50-Ω system, a higher output voltage of V’S = 13 V (24.5 dBu) must be set. It, too, will then deliver V’L = 1 V to the same 50-Ω load impedance (Fig. 3). Thus:

V’L = V’S × RL/(R’S + RL) = 13 V × 50/(600 + 50) = 1 V

so V’L = VL = 1 V.
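Both setups are the same voltage divider solved for the source voltage; a quick sketch confirming the 2-V and 13-V settings:

```python
def source_voltage_for(v_load, rs, rl):
    """Source voltage needed so v_load appears across load RL (voltage divider)."""
    return v_load * (rs + rl) / rl

print(source_voltage_for(1.0, 50, 50))   # 2.0  (VS for the 50-ohm source)
print(source_voltage_for(1.0, 600, 50))  # 13.0 (V'S for the 600-ohm source)
```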

3. Here, an RF modulator with 50-Ω impedance modulates the audio generator with 600-Ω output impedance.

Converting dBu (in 50 Ω) to dBu (in 600 Ω) is a technique for verifying and confirming that the audio analyzer’s source impedance is the cause of the divergent measurement results in the RF signal analyzer. As a rule of thumb, for a 50-Ω load, dBu (in 600 Ω) = dBu (in 50 Ω) + 16.26 dB.
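The 16.26-dB figure falls out of the divider ratios: both sources must deliver the same load voltage, so their settings differ by the ratio of their (RS + RL)/RL terms. A quick numerical check:

```python
import math

# Ratio of source settings that produce equal voltage across a 50-ohm load
offset_db = 20 * math.log10((600 + 50) / (50 + 50))
print(round(offset_db, 2))  # 16.26

# Equivalently, from the voltages in the text: 13 V (24.5 dBu) vs. 2 V (8.24 dBu)
print(round(20 * math.log10(13 / 2), 2))  # 16.26
```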

The U8903A comes with a switchable source impedance of 50 Ω or 600 Ω. After verifying and confirming the root cause, all that’s required is a modification of the output source’s voltage setting to obtain the appropriate output reading.