
ADC performance evolution: Low-voltage operation – part 2


VDD evolution over time (scatterplot)

Fig. 1. Supply voltages used for scientifically reported CMOS ADCs over time. Data points representing the evolution of the low-voltage state-of-the-art have been highlighted. Trend line fitted to 1985–2007 data.

LOW-VOLTAGE EVOLUTION TRENDS: In part 1 of this series on low-voltage ADCs, we observed the trends for supply voltage (VDD) vs. process scaling (L). In this second post we will complete the picture by looking at VDD trends over time. The timing of new process technology introductions, and the nominal supply voltage for future nodes, are reasonably well defined through the continuously updated International Technology Roadmap for Semiconductors (ITRS) [1], but at least two ADC-related aspects are not controlled by the ITRS scaling roadmap:

  • The rate at which mainstream ADC research activities will migrate to newer CMOS technology.
  • To what extent the ADC research community will attempt to push the envelope with respect to ultra-low voltage operation.

Regarding the former, it was observed in a previous post that the number of early adopters of each node is very small. In any given year, the vast majority of experimental ADCs have so far been implemented in technology 2–5 generations behind the scaling front. How the “mainstream” will behave in the future is next to impossible to predict, as it is influenced by future industrial needs, research grant policies, research community group dynamics, journal and conference publication targets, and many other hard and soft parameters of which we know very little today.

The latter depends on a handful of pioneers choosing to explore the outer limits of ultra-low-voltage ADC operation. As seen in part 1, there have been rather few attempts to push in this direction; historically, only a few groups have chosen this focus. If no one decides to take a shot at the current world record – the 0.2 V, VCO-based ∑∆ modulator presented by Wismar et al. in [2] – we may never see it nudged.

It is therefore very difficult to predict future VDD trends for analog-to-digital converters, both with respect to the ultra-low-voltage state-of-the-art and the mainstream supply voltage. What we can do, however, is to observe historical trends and use them as a reference.

Observation of ADC supply voltage trends

Figure 1 shows the supply voltages reported for CMOS A/D-converters in scientific publications until Q1-2012. The graph shows the highest supply voltage applied to each circuit. This means that, if a circuit used several independent supplies, then VDD = max(VDD1, VDD2, …, VDDn), so that only genuinely low-voltage operation is promoted. The evolution of the low-voltage state-of-the-art has been highlighted.
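
As a minimal sketch of this data-reduction rule (the design names and supply lists below are made up purely for illustration), the plotted VDD of a multi-supply design is simply the maximum over its reported supplies:

```python
# Hypothetical per-design supply lists; the plotted VDD is the highest
# supply voltage applied anywhere in the circuit.
supplies_per_design = {
    "adc_a": [1.2],        # single supply
    "adc_b": [0.5, 1.0],   # e.g. separate core and clock/reference supplies
}

plotted_vdd = {name: max(vdds) for name, vdds in supplies_per_design.items()}
print(plotted_vdd)  # {'adc_a': 1.2, 'adc_b': 1.0}
```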

A similar graph in [3] shows data for all ADCs (CMOS, as well as bipolar and BiCMOS), but only up to Q1-2010. Focusing on CMOS, and adding two more years of empirical data, yields a different scatter. Nevertheless, the low-voltage state-of-the-art sequence here is nearly identical to that in [3], because the global state-of-the-art almost completely coincides with CMOS ADCs, and has not improved since 2006. As observed in [3], the lowest reported VDD remained unchanged at 5 V until 1985, after which it started to follow a noisy but distinct scaling trend for 20 years. Fitting to the state-of-the-art data from 1985–2007 shows that the lowest reported VDD scaled down by ~2× every five years during this period.
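
This factor follows directly from the trend fit given at the end of this post: the fitted slope is –0.061011 decades per year, so over five years the lowest reported VDD changes by a factor of

10^{0.061011 \times 5} = 10^{0.305} \approx 2.0

i.e., it is roughly halved every five years.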

Figure 2 shows the distribution of scientific ADC implementations over supply voltage and publication year as a contour plot. The state-of-the-art data points and trend fit from Fig. 1 have been superimposed for reference. Just as in part 1, manually selected bin centers and non-linear contour levels have been used in order to render a meaningful and readable (but simplified) plot. The main purpose is to illustrate the difference between mainstream VDD and state-of-the-art low-voltage operation each year. It is observed that:

  • Mainstream focus remained at 5 V for over 20 years.
  • The state-of-the-art VDD scaling front started to go below 5 V around 12–15 years before any noticeable change in the mainstream focus.
  • The low-voltage scaling front appears to be approximately 5–6× below, and 10–15 years ahead of, the mainstream VDD in any given year.
  • Supply voltages from 5 V and down seem to have an extremely long lifetime in publications.

What do you observe?

Distribution of VDD over time (contour plot)

Figure 2. Voltage supplies used for scientific ADCs over time. Color represents number of publications. The low-voltage state-of-the-art data points are superimposed along with a scaling trend estimated from 1985–2007 data.

Future VDD scaling for ADCs

I’m very aware that there are good reasons why ultra-low VDD scaling may not be able to go much further, so please note that I’m not saying here that it will. Perhaps it is physically impossible, or functionally meaningless, to go significantly below the 200 mV operation achieved by Wismar et al. On the other hand, I’m old enough to have heard one “hard” limit after another being suggested for MOST scaling, and we’re still scaling them. So, let’s just see where we would end up if it should turn out to be possible for the supply voltage as well:

If the current trend for ultra-low voltage ADCs should be maintained, the low-voltage pioneers would have to publish ADCs according to the following approximate schedule:

Year VDD
2015 73 mV
2020 36 mV
2025 18 mV

Again, I’m not saying that it will happen. But I still found it interesting to see what kind of supplies the historical trend is projecting towards. Does anyone dare to predict a hard limit for A/D-converter supply voltage? Do you believe we will ever see an ADC operating at 73 mV? Is 36 mV impossible? What are the possibilities in the context of the impossibilities?

In case anyone wishes to make their own projections, the trend fit expression is:

VDD = 10^{-0.061011 \times year + 121.7998}
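
For anyone who prefers to evaluate it in code, here is a minimal Python sketch of the same expression (the function name is mine); it reproduces the table above:

```python
# Fitted low-voltage trend: VDD = 10^(-0.061011*year + 121.7998)
def projected_vdd(year):
    """Projected state-of-the-art supply voltage in volts."""
    return 10 ** (-0.061011 * year + 121.7998)

for year in (2015, 2020, 2025):
    print(f"{year}: {projected_vdd(year) * 1e3:.0f} mV")
# 2015: 73 mV, 2020: 36 mV, 2025: 18 mV
```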

See also …

ADC performance evolution: Low-voltage operation – part 1

ADC research trends: CMOS node adoption

ADC research trends: Migration to CMOS

ADC Survey Data

References

[1] International Technology Roadmap for Semiconductors (ITRS), 2011 Edition [Online]. Available: http://www.itrs.net

[2] U. Wismar, D. Wisland, and P. Andreani, “A 0.2V 0.44µW 20 kHz analog to digital ∑∆ modulator with 57 fJ/conversion FoM,” Proc. of Eur. Solid-State Circ. Conf. (ESSCIRC), Montreux, Switzerland, pp. 187–190, Sept., 2006.

[3] B. E. Jonsson, “A survey of A/D-converter performance evolution,” Proc. of IEEE Int. Conf. Electronics Circ. Syst. (ICECS), Athens, Greece, pp. 768–771, Dec., 2010.

ADC performance evolution: Low-voltage operation – part 1


Figure 1. Two-dimensional view of CMOS scaling: Channel length and VDD.

EMBRACING LOW VOLTAGE OPERATION: As analog-to-digital converter implementations migrate to scaled-down CMOS technologies, they also face the inevitable downscaling of supply voltages (VDD), and hence of signal swing [1]. A signal chain with a weak signal is more likely to suffer from noise than one with a strong signal. The scaling trends for VDD are therefore important as a background to the A/D-converter noise performance trends that will be treated in a few upcoming posts. There are also other reasons for a circuit designer to keep an eye on the evolution of supply voltage, such as the considerable challenges of high-gain op-amp design or sampling linearity under low-voltage operation, so I hope you’ll find this post useful even on its own.

Voltage scaling: The stragglers, the mainstream, and the pioneers

Unlike the minimum channel length (L), the supply voltage can be chosen lower than the nominal value specified for a process. Possibly at our own risk, but at least it can be done. It has therefore been possible to scale VDD “ahead” of the node you actually use. Unless you have direct access to a semiconductor fab, you can’t do that with respect to L. This degree of freedom – at least for experimental ADC designs – has led to the situation illustrated by Fig. 1:

  • Some (or as we shall see below, most) designs use the nominal supply voltage recommended for any given CMOS node.
  • Others may use the same node, but the design is not “fully scaled”: it relies on a higher-than-nominal VDD, and possibly also on optional process steps that effectively recreate older, less scaled device technology.
  • A third category not only embraces the full scaling, but actually uses an even more aggressive scaling of supply voltages. These are the low-voltage pioneers.

This post will observe how supply voltages distribute over CMOS nodes for scientifically reported ADCs, and attempt to extract evolution trends and trajectories for VDD vs. L.

ADC supply voltage vs. CMOS node

As pointed out in [2], the reported supply voltage can vary by as much as one order of magnitude within the same node for scientific A/D-converters. This is illustrated by the scatter plot of {L, VDD} for the entire CMOS ADC data set in Fig. 2. The VDD used in the plot is the highest supply voltage applied to each ADC, and the evolution of the low-voltage state-of-the-art over CMOS nodes has been highlighted. Wismar et al. reported a 90 nm VCO-based ∑∆ modulator implementation running at a 0.2 V supply voltage (operational at 0.18 V), which is the lowest VDD published to date [3].

Note that Fig. 2 differs from a similar graph in [2] in that the graph here is based on two more years of empirical data, and the plot in [2] shows the lowest VDD applied to each design instead of the highest. Also, the trajectory of the de facto nominal supply voltage vs. CMOS node is overlaid in Fig. 2. This is not necessarily the “official” VDD; instead, it was derived from the supply used by the majority of designs reported in each node. For nearly all nodes the choice was abundantly clear. In 65 and 90 nm, however, there were significant subsets of designs using 1 V instead of the 1.2 V used by the majority.
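
As a minimal sketch of how such a majority-based nominal VDD can be derived (the column names and the tiny data set below are hypothetical, not the actual survey data):

```python
import pandas as pd

# Hypothetical extract of the survey: one row per reported design,
# with its CMOS node (nm) and the highest supply voltage used (V).
designs = pd.DataFrame({
    "node_nm": [130, 130, 130, 90, 90, 90, 65, 65, 65],
    "vdd":     [1.2, 1.2, 1.5, 1.2, 1.0, 1.2, 1.2, 1.2, 1.0],
})

# De facto nominal VDD per node = the supply used by the majority of designs.
de_facto_nominal = designs.groupby("node_nm")["vdd"].agg(
    lambda v: v.value_counts().idxmax()
)
print(de_facto_nominal)  # each node resolves to 1.2 V in this toy data set
```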

Starting at 1.2 µm, the ultra-low-voltage state-of-the-art appears to have followed a distinct trend of tracking approximately one fifth of the nominal VDD. Because of the slowdown in nominal VDD scaling, that trend still holds in relation to the 0.2 V reported by Wismar.
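
As a quick sanity check against the 90 nm supplies mentioned above:

1.0 V / 5 = 0.20 V and 1.2 V / 5 = 0.24 V

both of which land close to the 0.2 V record.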

VDD vs. CMOS node (scatter plot)

Figure 2. Supply voltages used for scientific ADCs vs. CMOS node. The low-voltage state-of-the-art data points have been highlighted, and the nominal/majority VDD trajectory superimposed.

Although the scatter in Fig. 2 shows all reported combinations of {L, VDD}, it does not reveal the distribution across scientific ADC papers. This is done in Fig. 3, where color contours represent the number of papers falling into a given two-dimensional histogram bin. The bins used in this plot have been selected manually in order to create a meaningful yet readable plot, so that bin centers align with major nodes on the L axis, and with the most frequently used or otherwise interesting values on the VDD axis. Furthermore, a non-linear, truncated, ad hoc mapping of contour levels was applied to handle the steep peaks at certain bins while still retaining visibility of all non-zero bins. The contours thus yield a simplified view of the actual distribution, and cannot be used to derive the actual bin counts or exact distribution. For completeness, Fig. 4 shows the full distribution of all unique combinations of {L, VDD} reported for scientific ADC implementations until Q1-2012.
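
For readers who want to build this kind of plot from their own data, here is an illustrative sketch (not the actual script behind Fig. 3, and using randomly generated stand-in data) of a 2-D histogram with hand-picked bin edges and non-linear contour levels:

```python
import numpy as np
import matplotlib.pyplot as plt

# Stand-in data: random {L, VDD} pairs, only to illustrate the plotting approach.
rng = np.random.default_rng(0)
L = rng.choice([0.8, 0.5, 0.35, 0.25, 0.18, 0.13, 0.09, 0.065], size=500)  # um
vdd = np.clip(L * 10 * rng.uniform(0.3, 1.2, size=500), 0.2, 5.0)          # V

# Hand-picked bin edges so that bin centers align with major nodes and
# commonly used supply voltages.
L_edges = [0.05, 0.08, 0.11, 0.15, 0.21, 0.30, 0.42, 0.65, 1.0]
vdd_edges = [0.1, 0.4, 0.8, 1.1, 1.4, 2.0, 2.8, 4.0, 5.5]

H, _, _ = np.histogram2d(L, vdd, bins=[L_edges, vdd_edges])
L_centers = 0.5 * (np.array(L_edges[:-1]) + np.array(L_edges[1:]))
vdd_centers = 0.5 * (np.array(vdd_edges[:-1]) + np.array(vdd_edges[1:]))

# Non-linear (log-spaced) contour levels, so a few very tall bins do not
# hide the sparsely populated ones.
levels = np.unique(np.round(np.geomspace(1, H.max(), 8)))

plt.contourf(L_centers, vdd_centers, H.T, levels=levels, extend="max")
plt.xscale("log")
plt.yscale("log")
plt.xlabel("CMOS node L (um)")
plt.ylabel("VDD (V)")
plt.colorbar(label="number of publications")
plt.show()
```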

Figures 3 and 4 show that the vast majority of experimental ADCs reported in scientific papers use the nominal supply voltage of each CMOS node, even if there is a large spread of actual VDD values used in each node. The variation extends significantly below and above the nominal values. We can also observe that the scaling rate of the nominal supply voltage with CMOS node appears to have leveled out after 130 nm. Projected VDD values for future nodes are found in [1].

In part 2, we will look at the trends for VDD over time.

VDD vs. CMOS node (contour plot)

Figure 3. Distribution of supply voltage used for scientific ADCs vs. CMOS node. Color contours indicate density of publications. The low-voltage state-of-the-art data points are superimposed along with the nominal VDD trajectory.

Figure 4. Distribution of all unique combinations of VDD and L (node) reported for CMOS ADC implementations in scientific papers until Q1-2012. Bin grid is not to scale.

See also …

ADC research trends: CMOS node adoption

ADC research trends: Migration to CMOS

ADC performance evolution: Thermal noise

ADC performance evolution: Relative noise floor

ADC Survey Data

References

[1] International Technology Roadmap for Semiconductors (ITRS), 2011 Edition [Online]. Available: http://www.itrs.net

[2] B. E. Jonsson, “On CMOS scaling and A/D-converter performance,” Proc. of NORCHIP, Tampere, Finland, pp. 1–4, Nov. 2010.

[3] U. Wismar, D. Wisland, and P. Andreani, “A 0.2V 0.44µW 20 kHz analog to digital ∑∆ modulator with 57 fJ/conversion FoM,” Proc. of Eur. Solid-State Circ. Conf. (ESSCIRC), Montreux, Switzerland, pp. 187–190, Sept., 2006.