State-of-the-art Direct-Digital Measurement System for Phase-Amplitude Noise and Allan Deviation
Marco Pomponio, National Institute of Standards and Technology (NIST), University of Colorado, Boulder; Archita Hati, and Craig W. Nelson, NIST
Location: Seaview A/B
Date/Time: Wednesday, Jan. 29, 5:30 p.m.
The performance of radar and telecommunication systems is often limited by phase and amplitude noise in local oscillators and internal components. Measuring residual and absolute phase and amplitude noise is therefore critical to the overall capability of these systems, and different techniques have been developed and perfected over the years, especially in the analog domain.
Direct digital phase-amplitude noise measurement, however, offers many advantages over its analog counterparts: simplified close-to-the-carrier measurements, no need for sensitivity calibration, and support for asynchronous reference and DUT frequencies, which removes the need for a phase-locked-loop scheme. Since the first commercial introduction of this technique [Grove et al., 2004], several other products have been developed over the past 20 years. To the best of our knowledge, however, the best commercially available instrument (the DNA series from NoiseXT) specifies a single-sideband phase noise floor of L(1 Hz) = -137 dBc/Hz and L(1 MHz) = -190 dBc/Hz for a 10 MHz signal after 15 minutes of cross-correlation averaging, which translates to a frequency stability of about sigma(1 s) = 7×10^-15 with a 0.5 Hz bandwidth. Without cross-correlation, this instrument specifies L(1 Hz) = -134 dBc/Hz.
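The quoted conversion from the flicker phase-noise floor to frequency stability can be reproduced with the standard flicker-PM relation sigma_y^2(tau) = h1 [1.038 + 3 ln(2 pi fh tau)] / (2 pi tau)^2, with h1 = S_phi(1 Hz) / nu0^2 and S_phi(1 Hz) = 2 × 10^(L(1 Hz)/10). A minimal sketch (the function name and parameterization are ours, not taken from any instrument documentation):

```python
import math

def flicker_pm_adev(L_1Hz_dBc, nu0_Hz, tau_s, fh_Hz):
    """Approximate Allan deviation for a flicker-PM-limited noise floor.

    Standard conversion for flicker phase modulation noise:
      sigma_y^2(tau) = h1 * [1.038 + 3*ln(2*pi*fh*tau)] / (2*pi*tau)^2
    with h1 = S_phi(1 Hz) / nu0^2 and S_phi(1 Hz) = 2 * 10^(L/10).
    """
    b_minus1 = 2.0 * 10.0 ** (L_1Hz_dBc / 10.0)      # S_phi(1 Hz), rad^2
    h1 = b_minus1 / nu0_Hz ** 2                      # S_y(f) = h1 * f
    factor = 1.038 + 3.0 * math.log(2.0 * math.pi * fh_Hz * tau_s)
    sigma2 = h1 * factor / (2.0 * math.pi * tau_s) ** 2
    return math.sqrt(sigma2)

# L(1 Hz) = -137 dBc/Hz at a 10 MHz carrier, 0.5 Hz bandwidth, tau = 1 s
print(flicker_pm_adev(-137.0, 10e6, 1.0, 0.5))  # ~7e-15
```

With the spec-sheet numbers above this yields roughly 7×10^-15 at tau = 1 s, consistent with the stability figure quoted in the text.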
With this abstract, we present a new state-of-the-art instrument with superior performance, capable of measuring phase noise, amplitude noise, and Allan deviation simultaneously, with and without cross-correlation. A cost-effective single-FPGA solution based on the ZCU102 evaluation board performs all data processing with 100% sample utilization to maximize averaging throughput. A high-performance analog-to-digital converter (ADC) has been carefully selected, and its clocking, supplies, and front-end have been meticulously designed to minimize the introduction of additional noise while keeping the noise uncorrelated between the components involved in cross-correlation. In the digital processing, a custom design of the digital down-conversion and filters, together with an innovative cross-correlation alignment scheme, enables the fastest possible averaging, surpassing every other commercially available instrument in performance and noise level. The measurement system is controlled and monitored through a web interface.
Using modified evaluation boards, our instrument achieves L(1 Hz) = -143 dBc/Hz and a white noise floor of -155 dBc/Hz for a 10 MHz full-scale input signal; with cross-correlation, the white noise floor improves to -185 dBc/Hz after a few minutes, and L(1 Hz) reaches -160 dBc/Hz in less than 2 days. For residual Allan deviation measurements, the single channels exhibit a frequency stability of sigma(1 s) = 3.2 × 10^-15 with a 0.5 Hz bandwidth, which improves to below sigma(1 s) = 5 × 10^-16 after 4 days of Groslambert co-deviation averaging. Our instrument performs about 8-9 dB better than the DNA series in the flicker region, where averaging is the most time-consuming.
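The floor reduction obtained from cross-correlation can be illustrated with a toy two-channel cross-spectrum estimate: the instrument noise, independent in each channel, averages down in the cross-spectrum roughly as 1/sqrt(N) (about 5·log10(N) dB) after N averages, while any signal common to both channels remains. This is a simplified sketch of the general principle, not the instrument's actual processing chain:

```python
import numpy as np

rng = np.random.default_rng(1)
n_fft, n_avg = 1024, 400

# Two channels observing the same (here: absent) DUT signal, each with
# its own independent, unit-variance instrument noise.
cross = np.zeros(n_fft, dtype=complex)
auto = np.zeros(n_fft)
for _ in range(n_avg):
    a = np.fft.fft(rng.standard_normal(n_fft))  # channel 1 spectrum
    b = np.fft.fft(rng.standard_normal(n_fft))  # channel 2 spectrum
    cross += a * np.conj(b)                     # accumulate cross-spectrum
    auto += np.abs(a) ** 2                      # single-channel spectrum

single_floor = auto.mean() / n_avg           # single-channel noise floor
cross_floor = np.abs(cross / n_avg).mean()   # cross-correlated floor

# Uncorrelated noise has random phase in each cross-spectrum product,
# so the accumulated estimate shrinks roughly as 1/sqrt(n_avg).
print(single_floor / cross_floor)  # roughly sqrt(n_avg)-fold reduction
```

The same 1/sqrt(N) behavior is why the deep noise floors quoted above require minutes to days of averaging, and why the flicker region, where fewer spectral samples are available per unit time, is the slowest to average down.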
Moreover, a new multichannel prototype currently under development at NIST shows a preliminary single-channel residual frequency stability of about sigma(1 s) = 3 × 10^-16 with a 0.5 Hz bandwidth for a 100 MHz full-scale input signal, which can be improved further with cross-correlation. This result is an order of magnitude better than that of the instrument presented here, and the lowest ever reported for an instrument of this class.