Voltage-to-time and current-to-time converters have been used in many recent works as voltage-to-digital converters for artificial-intelligence applications. Most previous designs rely on the current-starved technique or on a capacitor-based delay unit, both of which are non-linear, costly, and area-intensive. In this paper, we propose a highly linear current-to-digital converter. We also propose an optimization method that generates the optimal converter design with the smallest number of PMOS transistors and of sensitive circuits such as differential amplifiers, which makes our design more stable and more robust against negative bias temperature instability (NBTI) and process variation. The proposed converter circuit implements a point-wise current-to-time conversion and can be used directly in a variety of applications, such as the analog-to-digital converters (ADCs) used in built-in computational random-access memory (C-RAM). The conversion gain of the proposed circuit is 3.86 ms/A, which is 52 times greater than the conversion gains of state-of-the-art designs. Finally, various time-to-digital converter (TDC) circuits are reviewed for use with the proposed current-to-time converter, and one circuit is recommended for a complete ADC design.
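As a rough illustration of the reported conversion gain (not part of the paper itself), the point-wise current-to-time mapping followed by a counter-based TDC can be modelled as a minimal behavioural sketch; the reference-clock frequency and the example input currents below are assumptions chosen only for this illustration, while the gain value matches the 3.86 ms/A figure stated above.

```python
# Behavioural sketch: ideal linear current-to-time converter feeding a
# counter-based TDC. Gain matches the abstract (3.86 ms/A); the clock
# frequency and input currents are illustrative assumptions.

GAIN_S_PER_A = 3.86e-3      # conversion gain: 3.86 ms/A
F_CLK_HZ = 1.0e9            # assumed TDC reference clock (1 GHz)

def current_to_time(i_in_a: float) -> float:
    """Ideal point-wise conversion: pulse width proportional to input current."""
    return GAIN_S_PER_A * i_in_a

def time_to_digital(pulse_width_s: float) -> int:
    """Counter-style TDC: count reference-clock cycles spanned by the pulse."""
    return int(pulse_width_s * F_CLK_HZ)

if __name__ == "__main__":
    for i_ua in (10, 50, 100):                 # example input currents in microamperes
        t = current_to_time(i_ua * 1e-6)
        print(f"{i_ua:>4} uA -> {t * 1e9:7.1f} ns -> code {time_to_digital(t)}")
```

For example, a 100 µA input yields a pulse of 3.86 ms/A × 100 µA = 386 ns, which the assumed 1 GHz counter quantizes to a code of 386.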