LEVEL DETECTOR
The SSM2167 incorporates a full-wave rectifier and a true rms
level detector circuit whose averaging time constant is set by an
external capacitor (CAVG) connected to the AVG CAP (Pin 6).
For optimal low frequency operation of the level detector down
to 10 Hz, the value of the capacitor should be 2.2 μF. Some experimentation with larger values for CAVG may be necessary to reduce
the effects of excessive low frequency ambient background noise.
The value of the averaging capacitor affects sound quality: too
small a value for this capacitor may cause a pumping effect for
some signals, whereas too large a value can result in slow response
times to signal dynamics. Electrolytic capacitors are recommended
here for lowest cost and should be in the range of 2 μF to 22 μF.
The rms detector filter time constant is approximately 10 × CAVG milliseconds, where CAVG is in μF. This time constant controls both the steady-state averaging in the rms detector and the release time for compression, that is, the time it takes for the system gain to increase after a decrease in input signal. The
attack time, the time it takes for the gain to be reduced because
of a sudden increase in input level, is controlled mainly by internal
circuitry that speeds up the attack for large level changes. In most
cases, this limits overload time to less than 35 ms.
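As an illustration (added here, not part of the original data sheet text), the following Python sketch evaluates the 10 × CAVG approximation above for candidate capacitor values; the function name and structure are arbitrary, and only the formula and the 2 μF to 22 μF range come from this section.

    def avg_time_constant_ms(c_avg_uf: float) -> float:
        """Approximate rms detector filter time constant, in milliseconds,
        from the 10 x CAVG relationship quoted above (CAVG in uF)."""
        # The data sheet recommends electrolytic capacitors in the
        # 2 uF to 22 uF range; 2.2 uF supports operation down to 10 Hz.
        return 10.0 * c_avg_uf

    # Example: 2.2 uF gives roughly a 22 ms time constant; 22 uF gives ~220 ms.
    print(avg_time_constant_ms(2.2))   # -> 22.0
    print(avg_time_constant_ms(22.0))  # -> 220.0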
The performance of the rms level detector is illustrated by its response to a test input (not shown): a series of tone bursts in six successive 10 dB steps, ranging from −66 dBV (0.5 mV rms) to −6 dBV (0.5 V rms). In this test, the attack time of the rms level detector depends only on CAVG, but the release times are linear ramps whose decay times depend on both CAVG and the input signal step size. The rate of release is approximately 240 dB/s for a CAVG of 2.2 μF, and 12 dB/s for a CAVG of 22 μF.
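As a rough design aid (an added illustration, not data sheet content), the release rates quoted above can be converted into gain recovery time estimates. The sketch below treats the two measured (CAVG, rate) pairs purely as a lookup; no general rate-versus-CAVG law is implied.

    # Release rates quoted above, used as a simple lookup (uF -> dB/s).
    RELEASE_RATE_DB_PER_S = {2.2: 240.0, 22.0: 12.0}

    def release_time_s(gain_recovery_db: float, c_avg_uf: float) -> float:
        """Approximate time for the system gain to ramp back up by the
        given number of dB after the input level drops."""
        return gain_recovery_db / RELEASE_RATE_DB_PER_S[c_avg_uf]

    # Example: recovering 30 dB of gain takes about 0.125 s with
    # CAVG = 2.2 uF, or about 2.5 s with CAVG = 22 uF.
    print(release_time_s(30.0, 2.2))   # -> 0.125
    print(release_time_s(30.0, 22.0))  # -> 2.5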
CONTROL CIRCUITRY
The output of the rms level detector is a signal proportional to
the log of the true rms value of the buffer output with an added
dc offset. The control circuitry subtracts a dc voltage from this
signal, scales it, and sends the result to the VCA to control the
gain. The gain control of the VCA is logarithmic—a linear change
in control signal causes a dB change in gain. It is this control
law that allows linear processing of the log rms signal to provide
the flat compression characteristic seen in the input/output plot of Figure 17.
Figure 17. Effect of Varying the Compression Ratio (OUTPUT (dB) vs. INPUT (dB) for compression ratios of 1:1, 2:1, 5:1, and 10:1, with VRP, VDE, and VCA GAIN indicated)
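To make the control law concrete, the sketch below models the compression region of Figure 17 as a straight line of slope 1/r (in dB) passing through the rotation point VRP. This linear-in-dB model and the example rotation point are assumptions for illustration only; behavior below VDE and above VRP is not modeled, and the exact characteristic is defined by the figure, not by this sketch.

    def output_level_db(input_db: float, ratio: float, v_rp_db: float) -> float:
        """Illustrative output level in the compression region: a line of
        slope 1/ratio (in dB) passing through the rotation point v_rp_db."""
        return v_rp_db + (input_db - v_rp_db) / ratio

    # Example: with a 10:1 ratio and a hypothetical rotation point of -30 dB,
    # a 20 dB drop in input level changes the output by only 2 dB.
    print(output_level_db(-50.0, 10.0, -30.0))  # -> -32.0
    print(output_level_db(-30.0, 10.0, -30.0))  # -> -30.0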
SETTING THE COMPRESSION RATIO
Changing the scaling of the control signal fed to the VCA causes
a change in the circuit compression ratio, r. This effect is shown
in Figure 17. Connecting a resistor (RCOMP) between Pin 8 and VDD sets the compression ratio. Lowering RCOMP gives smaller
compression ratios as indicated
in Table 4. AGC performance is
achieved with compression ratios between 2:1 and 10:1, and is
dependent on the application. Shorting RCOMP disables the AGC
function, setting the compression equal to 1:1. If using a compres-
sion resistor, using a value greater than 5 kΩ is recommended.
If a value lower than 5 kΩ is used, the device may interpret this
as a short, 0 Ω.
Table 4. Setting Compression Ratio
Compression Ratio    Value of RCOMP
1:1                  0 Ω (short to V+)
2:1                  15 kΩ
3:1                  35 kΩ
5:1                  75 kΩ
10:1                 175 kΩ
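For convenience, Table 4 can be captured in a small lookup, as in the sketch below; only the listed ratio/resistor pairs come from the table, and intermediate ratios should be characterized on the bench rather than interpolated from this sketch.

    # Table 4 values: compression ratio -> RCOMP from Pin 8 to VDD (ohms).
    RCOMP_OHMS = {1: 0, 2: 15_000, 3: 35_000, 5: 75_000, 10: 175_000}

    def rcomp_for_ratio(ratio: int) -> int:
        """Return the Table 4 RCOMP value for a listed compression ratio."""
        r_ohms = RCOMP_OHMS[ratio]
        # Per the text above, a nonzero RCOMP below 5 kOhm may be read
        # by the device as a short (1:1 compression).
        if 0 < r_ohms < 5_000:
            raise ValueError("RCOMP below 5 kOhm may be interpreted as 0 Ohm")
        return r_ohms

    print(rcomp_for_ratio(10))  # -> 175000 (10:1 compression)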