Enhancing Nutt-Based Time-to-Digital Converter Performance with Internal Systematic Averaging

J.-P. Jansson, P. Keranen, S. Jahromi, J. Kostamovaara
2019 IEEE Transactions on Instrumentation and Measurement  
A time-to-digital converter (TDC) often consists of sophisticated, multilevel, subgate delay structures when time intervals need to be measured precisely. The resolution improvement is rewarding until integral nonlinearity (INL) and random jitter begin to limit the measurement performance. INL can then be minimized with calibration techniques and result postprocessing. The TDC architecture based on a counter and timing-signal interpolation (the Nutt method) makes it possible to measure long time intervals precisely. It also offers an effective means of improving precision by averaging. Traditional averaging, however, demands several successive measurements, which increases the measurement time and power consumption. It is shown here that by using several interpolators that are sampled homogeneously over the clock period, the effects of limited resolution, interpolation nonlinearities, and random noise can be markedly reduced. The designed CMOS TDC, which utilizes this internal systematic sampling technique, achieves 3.0-ps root mean square (RMS) single-shot precision without any additional calibration or nonlinearity correction.

Index Terms: Averaging, CMOS, delay-locked loop (DLL), integral nonlinearity (INL), jitter, Nutt method, quantization error, time interval measurement, time-to-digital converter (TDC).
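The benefit of sampling several interpolators homogeneously over one resolution step can be illustrated with a simple quantization model. The sketch below is not the authors' circuit; it is a hypothetical numerical experiment in which K ideal quantizers with evenly spaced phase offsets are averaged, and the LSB size, K, and trial count are assumed values chosen only for illustration.

```python
import random

LSB = 10.0   # single-interpolator resolution in ps (assumed value)
K = 16       # number of parallel interpolators (assumed value)
N = 20000    # number of simulated measurements

def quantize(t, offset):
    # Ideal mid-tread quantizer whose levels are shifted by a fixed offset.
    return round((t - offset) / LSB) * LSB + offset

random.seed(1)
err_single, err_avg = [], []
for _ in range(N):
    t = random.uniform(0.0, 1000.0)  # true time interval, ps
    # One interpolator: quantization error up to ±LSB/2.
    err_single.append(quantize(t, 0.0) - t)
    # K interpolators with offsets spread homogeneously over one LSB,
    # averaged into a single result (systematic averaging).
    est = sum(quantize(t, k * LSB / K) for k in range(K)) / K
    err_avg.append(est - t)

def rms(errors):
    return (sum(e * e for e in errors) / len(errors)) ** 0.5

rms_single = rms(err_single)  # about LSB/sqrt(12) for a uniform error
rms_avg = rms(err_avg)        # markedly smaller with homogeneous offsets
print(rms_single, rms_avg)
```

In this toy model the averaged estimate behaves like a quantizer with an effective step of LSB/K, so its RMS error shrinks accordingly; the paper's point is that the same averaging gain is obtained inside a single shot, without repeating the measurement.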
doi:10.1109/tim.2019.2932156