# DAC Speed and Glitches

This section deals with non-idealities and may be skipped on a first reading.

When a DAC is set to a desired output code, it does not instantaneously jump to the desired output current or potential. It changes to the new output potential (or current) over a finite time (the settling time) and may get there with assorted jumps and signal spikes (**glitches**) rather than transitioning from old to new output values smoothly and monotonically.

There are several causes for such non-idealities:

a) switch capacitance and inductance

b) asynchronous switching

c) finite amplifier speed

For simplicity assume we're using a straight binary, 8 bit DAC. That way, the number representation corresponds to the figures in Ladder Networks and the numbers are small enough to be convenient (0 to 255).

A) Switch Capacitance and Inductance

The transistors that do the switching don't go from off to on (or from switched left/ground to switched right/virtual ground, as in the diagrams in Ladder Networks) instantaneously. Because there are junctions between dissimilar materials inside the transistors, there is charge storage at the junctions. These junctions act as capacitors, charge storage devices. When a digital 0 changes to a 1 or vice versa, the output current or potential response is slowed by this charging and by the resistance of wires or other components in the external circuitry. It takes a time RC (resistance in ohms times capacitance in farads has units of time in seconds) for the response to reach 63% of its final value, and 5 RC to reach 99.3%.
Even if the capacitance is small, a picofarad (10^{-12} F), and the resistance is small, say 100 ohms, RC = 10^{-10} s and the time to complete switching is 0.5 ns. That sounds so small as to be negligible, but in reality settling time for the ladder network involves much larger resistances. R in the network is typically 1 kilohm to 10 kilohm so that the currents are small and the power dissipation reasonable. At 10 kilohm and 10 pF (quite realistic values), RC = 100 ns, and the ladder network settles in 0.5 μs. If one only wishes to make measurements a few thousand times a second, this still seems fast, but for many high-speed measurements in optical spectroscopy or mass spectrometry, one wishes to make measurements at megahertz frequencies. In this case, the RC settling of the DAC limits its speed, and hence its effective accuracy.
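The settling arithmetic above is easy to check with a short sketch (plain Python; the function name is ours, not from the text):

```python
import math

def settling_time(R_ohms, C_farads, fraction_remaining):
    """Time for an exponential RC response to decay to the given
    remaining fraction of the initial error: t = RC * ln(1/fraction)."""
    return R_ohms * C_farads * math.log(1.0 / fraction_remaining)

# 100 ohms and 1 pF: switching essentially complete (5 RC) in 0.5 ns
print(settling_time(100, 1e-12, math.exp(-5)))      # ≈ 5e-10 s

# 10 kilohms and 10 pF: RC = 100 ns, so 5 RC = 0.5 microseconds
print(settling_time(10e3, 10e-12, math.exp(-5)))    # ≈ 5e-07 s
```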

But it's even worse than that. Any wire has inductance, the storage of magnetic energy in a field surrounding the conductor. This can set up oscillations, as energy is stored, alternately, as charge in the capacitor or magnetization in the field. Absent resistance, the frequency in hertz is 1/(2π(LC)^{1/2}). For 10 nH (10 nanohenries of inductance) and 10 pF, the frequency is about 500 MHz. "So fast it couldn't matter." Except there can be little sine waves running around the circuitry, flipping high-speed switches and adding noise! A well-engineered circuit will have low enough L and C that other problems, described below, will limit how rapidly a DAC (or other circuit component) can work. Circuit boards also have R, L, and C, and can degrade the function of even the best-designed components.
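The resonance formula is quick to evaluate; a sketch (the exact figure, a few hundred megahertz, depends on the L and C one assumes):

```python
import math

def lc_resonant_frequency(L_henries, C_farads):
    """Undamped LC resonance in hertz: f = 1 / (2 * pi * sqrt(L * C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L_henries * C_farads))

f = lc_resonant_frequency(10e-9, 10e-12)   # 10 nH, 10 pF
print(f / 1e6)                             # ≈ 503 MHz
```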

B) Asynchronous Switching

Suppose we want to increment the setting of a DAC from 00001111 to 00010000. Five switches have to be thrown: one goes from 0 to 1, while the four least significant go from 1 to 0. What happens if they don't all switch at the same moment (to within a fraction of an attosecond, faster than any currently available transistor)? We get settings that are NEITHER the initial nor the final state! If the switches reset from least significant to most significant, the states will be

00001111 start

00001110

00001100

00001000

00000000

00010000 end

So instead of going smoothly from 15/256 of the reference voltage to 16/256, we pass through 14/256, 12/256, 8/256, and 0 on the way to 16/256. That puts out a downward voltage spike.
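The least-significant-first transition above can be simulated by flipping one differing bit at a time (a sketch in plain Python; the function name is our own):

```python
def transition_codes(start, end, n_bits=8):
    """Intermediate DAC codes if the differing bits switch one at a
    time, least significant first, instead of simultaneously."""
    codes = [start]
    code = start
    for bit in range(n_bits):
        mask = 1 << bit
        if (start ^ end) & mask:                 # this bit has to change
            code = (code & ~mask) | (end & mask)
            codes.append(code)
    return codes

# 00001111 -> 00010000: the output droops all the way to 0 first
print([f"{c:08b}" for c in transition_codes(0b00001111, 0b00010000)])
# ['00001111', '00001110', '00001100', '00001000', '00000000', '00010000']
```

Reversing the loop (most significant bit first) answers the exercise below.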

**Exercise**: if the bits reset from most significant to least significant, what happens?

In reality, the bits don't necessarily reset in any particular order. There is thus a "jumping around" of the output potential for a brief period while the settings are in transition.

C) Finite Amplifier Speed

Suppose you put a water glass under an open faucet. Eventually, the glass overflows. But it doesn't overflow instantaneously -- it has to fill up first. Similarly, when a potential changes on an amplifier input, there is a delay, then a transition time, before the output tracks the input. Thus, even after the glitches are gone, the output amplifier on the ladder network takes some time to respond to the changed setting. How long? If R in the various diagrams is ~ 10 kΩ and the amplifier input capacitance is 1 pF, RC = 10^{-8} s. After 50 ns, the amplifier has settled to within e^{-5} of its final value, or 0.67%. That's 1 part in 148, or just a little better than 1 part in 2^{7} = 1 part in 128. For a 10 bit converter, settling to 1 part in 2^{10} means waiting until enough RC time constants have passed to be within 1 part in 1024, or 0.0977%. That's about 7 RC time constants, or 70 ns.
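The number of time constants needed grows as ln(2^{N}) = N ln 2, so each extra bit of resolution costs about 0.69 RC. A sketch with the values from the text:

```python
import math

def settle_time_to_one_lsb(n_bits, R_ohms=10e3, C_farads=1e-12):
    """Time for an RC-limited output to settle to 1 part in 2**n_bits."""
    tau = R_ohms * C_farads                 # 10 ns with the default values
    return n_bits * math.log(2.0) * tau     # ln(2**n) = n * ln(2)

print(settle_time_to_one_lsb(10) / 1e-9)   # ≈ 69 ns, about 7 time constants
```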

**Exercise**. How long must one wait for a 16 bit DAC to settle? Assume there are no glitches, only RC delays in the amplifier circuit with 10 kΩ and 1 pF.

In many instances, the actual stray capacitance is closer to 10 pF than 1 pF. If the amplifier is driving a coaxial cable, the cable has a capacitance of 4 pF/foot (12 pF/meter). That slows things down even more. Why not use smaller resistance? Then the current goes up together with power dissipation. Why not use smaller capacitance? Any pair of conductors has mutual capacitance, so there's a floor underneath the capacitance of the circuit.

The above argues for the RELATIVE error in jumping from one potential to another. What is the ABSOLUTE error? Let's consider two scenarios, both with a 16 bit straight binary DAC and with the same 10 kΩ resistor and 1 pF capacitance we've used in the previous exercise.

1) Big Jump. In the first scenario, a jump is made from 0 to 2^{16}-1 in a single step. We already know that in 110 ns, the DAC has settled within 1 least-significant bit (LSB) of the final value. But what voltage is that? Assume that the DAC has a range from 0 to 65535/65536 of 10.00000 V. 1 LSB is then 1/65536 of 10.00000 V = 153 μV.

2) Little Jump. Suppose the setting is changed by only 1 count. Then the increment or decrement of the output is only 153 μV altogether. It will still take 110 ns to get within 1/65536 of the amount of the change, but can we detect that small a voltage? 1/65536 * 153 μV = 2.3 nV. In this case, how long it takes for the output to settle depends on the resolution of the measurement. If we can only measure 1 μV, then we, in effect, only have to wait to settle to 1 part in 153. But from above, that takes only 5 time constants or 50 ns. The bigger the jump, the longer it takes for the output to settle within our ability to detect change.
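Both scenarios follow from t = RC ln(step/resolution), the time for an exponential step to come within the stated resolution. A sketch with the values used above:

```python
import math

TAU = 10e3 * 1e-12          # 10 kΩ × 1 pF = 10 ns
FULL_SCALE = 10.0           # volts
LSB = FULL_SCALE / 2**16    # ≈ 153 μV

def settle_within(step_volts, resolution_volts, tau=TAU):
    """Time for an exponential step to settle within resolution_volts."""
    return tau * math.log(step_volts / resolution_volts)

big_jump = settle_within(FULL_SCALE, LSB)   # full scale to within 1 LSB
little_jump = settle_within(LSB, 1e-6)      # 1-LSB step to within 1 μV
print(round(big_jump / 1e-9), round(little_jump / 1e-9))   # ≈ 111 and 50 ns
```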