
Successive Approximations ADC


Why stop at 1 bit? Why not 8, 12, or 16 bits? In fact, we need only one more insight and all of these possibilities snap into focus: we must set the most significant bit first. Let's work that out using a 2-bit converter. Again, use +5 V full scale, and suppose the input we're trying to digitize is +3 V. The values for a 2-bit DAC are -- oh, go ahead. Fill in the table:

DAC bits    DAC output (V)
00          0.00
01          1.25
10          2.50
11          3.75

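To check the table entries (a short Python sketch, not part of the original page): each code's output is the code value times the full-scale voltage divided by 2 raised to the number of bits.

```python
full_scale = 5.0  # volts, as in the example above
n_bits = 2

table = {}
for code in range(2 ** n_bits):
    v_out = code * full_scale / 2 ** n_bits  # each step is 1.25 V
    table[f"{code:02b}"] = v_out
    print(f"{code:02b}  {v_out:.2f} V")
```

Running this prints the four codes 00 through 11 with outputs from 0.00 V to 3.75 V in 1.25 V steps.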
So now we can see how the ADC would arrive at the digitized value for output.

1) Set the DAC to 10. The DAC puts out 2.5 V.

2) The comparator determines that the sampled voltage is at least as great as the voltage produced by code 10. The most significant bit stays set.

3) Set the next DAC bit to 1, for a coding of 11. The DAC outputs 3.75 V.

4) The comparator determines that the DAC output is greater than the sampled voltage. The second bit gets turned off.

5) Final encoding: 10.
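The steps above generalize directly to any number of bits: try each bit from the most significant down, and keep it only if the DAC output does not exceed the sampled voltage. A minimal Python sketch of that loop (the function name and parameters are illustrative, not from the original page):

```python
def sar_adc(v_in, full_scale=5.0, n_bits=2):
    """Successive-approximation ADC: set each bit from the MSB down,
    keeping it only if the trial DAC output does not exceed the input."""
    code = 0
    for bit in range(n_bits - 1, -1, -1):
        trial = code | (1 << bit)                  # tentatively set this bit
        v_dac = trial * full_scale / (1 << n_bits) # DAC output for trial code
        if v_in >= v_dac:                          # comparator: keep the bit
            code = trial
    return code

print(bin(sar_adc(3.0)))  # 2-bit conversion of 3 V: 0b10
```

Note that an n-bit conversion takes exactly n comparator decisions, one per bit, which is why successive approximation scales so well from 2 bits to 16.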

The digitization error in this case is 0.5 V. With only 2 bits, the resolution of the measurement is 5 V / 2² = 1.25 V, so the closest representation of a 3 V input we can have is 2.5 V.
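The arithmetic behind those two numbers, as a brief sketch (variable names are illustrative):

```python
full_scale = 5.0
n_bits = 2

resolution = full_scale / 2 ** n_bits  # 1.25 V per code step
v_in = 3.0
v_encoded = 2.5                        # DAC output for the final code, 10
error = v_in - v_encoded               # 0.5 V digitization error

print(f"resolution = {resolution} V, error = {error} V")
```

Doubling the bit count halves the step size, so a 16-bit converter with the same full scale would resolve about 76 microvolts.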



