I find that interesting and would like to know more detail about your measurement setup. When comparing an RTO1024 to the Agilent MSO7104B numbers and procedure outlined in Agilent's app note, the RTO not only outperformed the MSO7K at a 1 GHz bandwidth setting, it outperformed it at the full 2 GHz bandwidth setting. That is even without correcting for the fact that Agilent overlooks that 8-division and 10-division scopes set to the same V/div range are not looking at the same full-scale digitizer range. Even Tek noticed this about the measurements: there are two places in Agilent's 1 GHz RMS noise tables where the Agilent appears to outperform the Tek MSO4104 but in fact underperforms when comparing noise as a percentage of full-scale voltage for that range. So I am fascinated by your observations and very interested in how you came to those conclusions.
ENOB comparisons can be tricky, especially when they come from competing vendors that optimise the performance of their own instrument while glossing over those same procedures on rented instruments from their competitors. If you want to know what a vendor truly does, get their representative to actually show you. Agilent starts its claims from V/div, which is faulty logic when decades-old instrumentation practice expects equivalent full-scale comparisons. Even their app note on this overlooks the difference between its 8 vertical divisions and a competitor's 10 vertical divisions. Vertical noise on competitors' scopes can be manipulated in vendor comparisons to dispute their own findings.

Scopes like the R&S RTO instruments at 2 GHz and below don't even interleave digitizers the way Agilent and Tektronix do, so they are indeed quieter instruments, especially when a self-alignment or calibration has not been done. The number of scope users who are not even aware of self-calibration and what it does is astounding, so the chance of any scope actually reaching its claimed performance in day-to-day operation is minimal. For precise small measurements, the user is more likely to be plagued by DC offset errors than by noise errors.

Also, look at whether your scope vendor drops bandwidth, or loses digitizer range, by zooming in on a signal to "fake" a smaller vertical range. Think about the big picture of performance and capability; don't let a vendor get you hung up on small details while glossing over how the instrument addresses your application. The field personnel present to you what their marketing staff sell to them. Make sure they're thinking when they present to you, and not just echoing selective marketing hype.
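The full-scale point made above can be sketched numerically: two scopes set to the same V/div are not digitizing the same full-scale range if one grid has 8 vertical divisions and the other 10, so comparing raw RMS noise in volts favors the smaller-grid scope. Normalizing to percent of full scale puts them on equal footing. The noise figures below are made-up illustrations, not vendor data.

```python
def noise_pct_of_fs(noise_vrms, v_per_div, divisions):
    """RMS noise expressed as a percentage of the digitizer's full-scale range."""
    full_scale = v_per_div * divisions
    return 100.0 * noise_vrms / full_scale

# Hypothetical comparison: both scopes at 10 mV/div. Scope A (8 div) shows
# lower absolute noise in volts than scope B (10 div)...
a = noise_pct_of_fs(0.00040, 0.010, 8)   # 0.40 mV rms over an 80 mV range
b = noise_pct_of_fs(0.00045, 0.010, 10)  # 0.45 mV rms over a 100 mV range

# ...but as a fraction of what the digitizer actually spans, B is quieter.
print(f"Scope A (8 div):  {a:.3f}% of full scale")   # 0.500%
print(f"Scope B (10 div): {b:.3f}% of full scale")   # 0.450%
```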
Excellent article; I didn't know that scope manufacturers designed their own ADC solutions. I wonder if ADC technology will reach a point where they no longer have to expend that effort, but then again, with the ever-increasing bandwidths design engineers will need from their scopes, the scope makers need to stay one step ahead of where the technology is.
The R&S claim of 7 ENOB is for the ADC by itself, not for the ADC in a scope. Since R&S doesn't sell the ADC by itself, users will never experience 7 effective bits on the RTO scope. The RTO series has a measured ENOB that is much less: just under 6 bits at 500 MHz, just over 6 bits at 1 GHz, and just under 6 bits at 2 GHz. The Agilent 9000 is nearly identical (slightly higher than R&S at 2 GHz, the same at 1 GHz, and about 5% lower at 500 MHz).
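For readers unfamiliar with how ENOB figures like these are derived: they typically come from a SINAD (signal-to-noise-and-distortion) measurement on a full-scale sine input, converted via the standard relation ENOB = (SINAD_dB − 1.76) / 6.02. A rough sketch of what the bit counts above imply in dB terms (the specific values here are illustrative, not measured data):

```python
def enob_from_sinad(sinad_db):
    """Effective number of bits from measured SINAD (dB), full-scale sine input."""
    return (sinad_db - 1.76) / 6.02

def sinad_from_enob(enob):
    """Inverse: SINAD (dB) implied by a given effective bit count."""
    return 6.02 * enob + 1.76

# A scope measuring roughly 6 effective bits corresponds to about:
print(f"6.0 ENOB -> {sinad_from_enob(6.0):.1f} dB SINAD")   # ~37.9 dB
# An ideal 8-bit converter, for comparison, would measure:
print(f"8.0 ENOB -> {sinad_from_enob(8.0):.1f} dB SINAD")   # ~49.9 dB
```

The gap between the ideal 8-bit figure and the measured ~6-bit figures is the point of the comparison: front-end noise, interleave errors, and distortion in the full signal path cost roughly two bits before the signal ever reaches the user.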
Actually, there is an application note on the R&S web site, www.rohde-schwarz.us (1ER03), that describes in detail how the ENOB is measured and shows measured ENOB values for both 1 GHz and 2 GHz scopes.