The first people to come up with interface standards were in the broadcast market. Chaining products together to create a finished program required a standard to make sure the output of one product wouldn't overdrive the input of the next, and that the ideal amount of gain or attenuation could be applied to a signal with ease.
Traditional interface levels were set at +4 dBu and -10 dBV (that's 1.23 Vrms and 0.316 Vrms for those without a calculator or the will to do the calculations). For many years, this was what differentiated consumer from professional equipment. Take any standalone CD player from the early 1990s and you'll quickly see the standard I'm talking about.
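For readers who'd like to check those numbers: dBu is referenced to 0.775 Vrms (1 mW into 600 Ω) and dBV to 1 Vrms. A minimal sketch in Python (the function names are mine, purely illustrative):

```python
DBU_REF = 0.775  # Vrms reference for dBu (1 mW into 600 ohms)
DBV_REF = 1.0    # Vrms reference for dBV

def dbu_to_vrms(level_dbu):
    """Convert a level in dBu to its RMS voltage."""
    return DBU_REF * 10 ** (level_dbu / 20)

def dbv_to_vrms(level_dbv):
    """Convert a level in dBV to its RMS voltage."""
    return DBV_REF * 10 ** (level_dbv / 20)

print(round(dbu_to_vrms(4), 2))    # professional level: 1.23 Vrms
print(round(dbv_to_vrms(-10), 3))  # consumer level: 0.316 Vrms
```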
In the late 1980s and early 1990s, people got a little pickier about what they connected to their television. The days of just hooking up the VHS machine via the coaxial aerial connection were slowly coming to an end. With the advent of higher-quality audio sources (NICAM stereo, Laserdisc, etc.), audio started being transported across stereo RCA jacks - just like your Hi-Fi did.
Around the same time, SCART connectors started to become popular in Europe, too (Figure 1). SCART gave manufacturers a whole new playground of standards to play with. Not only were the connectors new, but the signal levels that designers could play with increased significantly.
Figure 1 - SCART Connector.
As with any interconnect, the larger the signal you push through, the greater the dynamic range you can achieve at the receiver, because a receiver's noise floor is essentially fixed. Along with the new connector came standards for voltage swing and for the impedance a receiver should present to the driver. The SCART standard eventually grew to 2 Vrms (5.6 Vpp) with an input impedance of 10 kΩ.
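As a sanity check on those figures: for a sine wave, Vpp = 2√2 × Vrms, so 2 Vrms works out to roughly 5.66 Vpp, and the step up from the old -10 dBV consumer level buys about 16 dB of extra signal range. A quick sketch (the function name is mine):

```python
import math

def vrms_to_vpp(vrms):
    """Peak-to-peak voltage of a sine wave with the given RMS value."""
    return 2 * math.sqrt(2) * vrms

scart_vpp = vrms_to_vpp(2.0)             # ~5.66 Vpp, quoted as 5.6 Vpp
extra_db = 20 * math.log10(2.0 / 0.316)  # ~16 dB above the -10 dBV level
print(round(scart_vpp, 2), round(extra_db, 1))
```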
A ground-biased 5.6-Vpp signal is tougher to generate than one may at first think. Consider for a moment the products we connect in a home entertainment environment: DVD players, set-top boxes, gaming consoles, AV receivers, etc. The output of a typical 5-V DAC is around 4 Vpp, biased around 2.5 V. DACs running from lower supply voltages have even less signal swing, biased around a correspondingly lower voltage. Getting this signal out into the world requires both gain and buffering.
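To put numbers on that: taking a 4-Vpp, 2.5-V-biased DAC output to a ground-centred 5.6-Vpp SCART signal needs a voltage gain of about 1.4 (roughly 3 dB) on top of removing the DC bias. A quick check, with illustrative values:

```python
import math

dac_vpp = 4.0    # typical swing from a 5-V DAC, biased at 2.5 V
scart_vpp = 5.6  # SCART target, 2 Vrms quoted as 5.6 Vpp

gain_needed = scart_vpp / dac_vpp       # ~1.4 (linear gain)
gain_db = 20 * math.log10(gain_needed)  # ~2.9 dB
print(round(gain_needed, 2), round(gain_db, 1))
```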
This nicely brings us to output topologies for taking DAC outputs to the real world. There are three main methods of doing this:
- bipolar-supply operational amplifier (op amp) with a split rail power supply;
- single-supply op amp with a DC bias of VDD/2, and a DC blocking capacitor on the output;
- single-supply op amp with a built-in charge pump to generate a negative supply rail, giving enough swing and a ground-biased output.
Each of these topologies has its own advantages and disadvantages.