Bill Whitlock, president and chief engineer of Jensen Transformers, had this to say about jitter and the early ADCs used in the recording industry:
You quoted an e-mail from a reader: "When many CD recordings were done from master tapes in the 1980s, they often suffered from poor A/D conversion, probably because the monolithic ADCs available in recording instrumentation were 8-bit and also suffered significant DR loss in implementation. This would limit dynamic range to about 48 dB at best. The higher-DR delta-sigma ADCs for the recording industry didn't come about until the late '80s, but by then the recording industry didn't want to redo the recordings from the original master tapes."
Since I was Manager of Electronic Development Engineering for Capitol Records from 1981 to 1988 and participated in the introduction of the CD medium, I would like to add my comment on this subject. Early-'80s A/D converters were invariably successive-approximation (SAR) types and were generally monotonic at 16 bits. But there were a few notable recordings made (some by prestigious European labels) with defective converters that were monotonic to only about 12 bits. I know of NO recordings ever made at 8 bits of resolution. However, the early digital recorders (the Sony "U-matic" video-cassette-based ones were by far the most popular) used A/D converters that sampled at "baseband," or 44.1 kHz. This required that they be preceded by "brick wall" anti-alias filters (flat to 20 kHz and >90 dB attenuation at 22.05 kHz). The time-domain response of the early Sony recorders, because of their anti-alias filter design, was so awful that any novice listener could hear the damage done to high-frequency transients such as cymbal crashes! I believe this is primarily what gave early CDs a terrible reputation, especially among audiophiles. Of course, when oversampling (non-baseband) A/D converters appeared later in the '80s, anti-alias filters became an essentially trivial design and their time-domain response became near-perfect.
The reader who claimed that jitter is a big problem is, I believe, absolutely right. In a CD player, jitter from the raw disc data stream is removed by a FIFO memory, and the further processed/error-corrected data is clocked into the D/A by a local crystal oscillator. Designers of CD players seem to forget that crystals are piezoelectric (duh) and that their oscillation can easily be perturbed (i.e., jittered) by acoustic vibration coupled to them. I've always thought that these oscillators should be enclosed in a shock-mounted box to isolate them from vibration caused by the listener's speakers.
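For context on the reader's 48 dB figure: the ideal dynamic range of an N-bit converter is commonly estimated as 6.02·N + 1.76 dB, before any implementation loss. A quick sketch of that arithmetic, using the bit depths mentioned above:

```python
# Ideal dynamic range of an N-bit converter: DR ≈ 6.02*N + 1.76 dB
# (standard quantization-noise estimate; real converters lose a few dB to implementation).
for bits in (8, 12, 16):
    dr_db = 6.02 * bits + 1.76
    print(f"{bits:2d}-bit ideal DR ≈ {dr_db:.1f} dB")

# Output:
#  8-bit ideal DR ≈ 49.9 dB   (roughly the reader's "about 48 dB at best")
# 12-bit ideal DR ≈ 74.0 dB   (the defective converters Whitlock mentions)
# 16-bit ideal DR ≈ 98.1 dB   (nominal CD resolution)
```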
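Whitlock's point about brick-wall filters can also be made concrete with a filter-order estimate. The sketch below uses SciPy's Butterworth order calculator to compare a baseband design (flat to 20 kHz, >90 dB down at 22.05 kHz) against the much gentler analog filter an oversampled converter would need; the 8x oversampling figure and the Butterworth approximation are illustrative assumptions of mine, not details from his letter:

```python
import numpy as np
from scipy import signal

def aa_filter_order(f_pass_hz, f_stop_hz, ripple_db=0.1, atten_db=90.0):
    """Minimum analog Butterworth order meeting the given pass/stop band spec."""
    order, _ = signal.buttord(2 * np.pi * f_pass_hz, 2 * np.pi * f_stop_hz,
                              ripple_db, atten_db, analog=True)
    return order

# Baseband 44.1 kHz sampling: the stopband must start at Nyquist, 22.05 kHz.
print("baseband:", aa_filter_order(20_000, 22_050))          # huge (well over 100th order)

# 8x oversampling (352.8 kHz): the first frequency that can alias back into
# the audio band is ~332.8 kHz, so a low-order analog filter suffices and the
# steep, linear-phase filtering moves into the digital decimator.
print("8x oversampled:", aa_filter_order(20_000, 352_800 - 20_000))  # around 5th order
```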
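On the jitter point, a standard back-of-envelope limit for sampling-clock jitter is SNR ≈ -20·log10(2π·f·t_j), where f is the signal frequency and t_j the RMS jitter. A quick sketch; the jitter values are arbitrary examples, not measurements of any particular player:

```python
import math

def jitter_snr_db(f_hz, jitter_s):
    """Best-case SNR of a sampler whose clock has the given RMS jitter."""
    return -20 * math.log10(2 * math.pi * f_hz * jitter_s)

# A 20 kHz tone -- the worst case within the audio band.
for tj in (10e-9, 1e-9, 100e-12):
    print(f"{tj * 1e9:5.1f} ns RMS jitter -> SNR limit ≈ {jitter_snr_db(20_000, tj):.0f} dB")
```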
Another reader wondered about the mechanics of perfect pitch, and if it might be related to the perception of upper-frequency harmonics and how (or if) this played a role in the differences between vinyl and CD:
I didn't see, among the comments you chose to make public, anyone arguing about the loss of harmonics in digital versus analog. I have a relative by marriage who has "perfect pitch," and he has consistently held that he "hears" in digital the loss of what exists in analog. I haven't been able to do so with my very excellent equipment or with his, but I don't have "perfect pitch" hearing. The technical aspect of this, he and I agree, involves the fact that every tone, when it is sounded, has an infinite number of harmonics associated with it. Though human hearing stops at some frequency, somewhere around 15 kHz give or take, the harmonics above what is heard still exist; we just are not aware of them. Some other animals, like dogs, hear higher ranges and would have the capability to hear harmonics beyond human limits. In analog, all of the harmonics exist, at least theoretically; in a live performance, they certainly do.
As I am not an expert in the field of sound or acoustics, I can only speculate on whether harmonics, especially ones that can't be heard in the traditional sense, are in some way processed along with the audible portions of a tone in those individuals we say have the gift of perfect pitch, and whether that is the key mechanism that gives them this ability. Further, I do not know at what point vinyl or CD/DVD/digital reproductions cut off the harmonics, if vinyl, being analog, cuts them off at all. As neither of us is professionally accomplished in this area, and until we hear something definitive, this has been our best guess at the mechanics of perfect pitch.
I also wonder if anyone has ever attempted to "record" the upper frequencies and mix them back into the signal sent to the speakers along with the regular recorded audible track(s), then play the result for an audience of experts to get their feedback on any perceived improvement over the sound without the higher frequencies mixed back in.
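To put the readers' harmonics question in concrete terms: a CD cannot carry anything above its Nyquist frequency of 22.05 kHz, so any overtone above that is simply absent from the disc; whether anyone can perceive that absence is exactly what is in dispute. A hypothetical illustration using the top note of a piano (C8, roughly 4186 Hz), with a 20 kHz hearing limit assumed for the comparison:

```python
FUNDAMENTAL_HZ = 4186.0   # piano C8, chosen purely as an example
CD_NYQUIST_HZ = 22_050.0  # half the 44.1 kHz CD sampling rate
HEARING_HZ = 20_000.0     # rough upper limit of young adult hearing (assumption)

for n in range(1, 8):
    f = n * FUNDAMENTAL_HZ
    label = "fundamental" if n == 1 else f"harmonic {n}"
    audible = "audible" if f <= HEARING_HZ else "beyond typical hearing"
    on_cd = "on the CD" if f <= CD_NYQUIST_HZ else "removed by the anti-alias filter"
    print(f"{label:12s} {f:8.0f} Hz  ({audible}, {on_cd})")
```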
And finally, using the pricey sound system in a Lexus as an example, another reader offered a reminder on the subjective nature of how we may perceive audio quality:
[The Lexus' sound system] is okay when listening to FM radio, but when I played some CDs I was really disappointed. These were the same CDs I originally listened to many times on the factory radio/player in my Chevy Blazer, and they sounded great in my Blazer.
This got me to thinking: If I had been originally listening to these same CDs in the Lexus and then played them in my Blazer, would I have thought they sounded better in the Blazer? Or would it have been the other way around?
The same question comes about with vinyl vs. CD. If I had been listening to vinyl records all along and then played those same songs on a CD, what would I have thought? Well, obviously, if the vinyl records were old and scratched (like all records become eventually), I would think the CD sounded better. What if the CD player were cheap and had lousy speakers? The conditions under which we listen to music have a great effect on how we perceive its quality.