
Bits, Noise, and Linearity; the Imperfections of ADCs
The coding scheme matters as well. A 12-bit converter operating in 2's complement binary mode generates 11 bits of magnitude information plus a sign bit. But what if we know the sign of the data in advance? Then coding in straight binary gives the potential for twice the resolution in the measurement. If the ADC codes as offset binary but the computer to which it is interfaced uses 2's complement binary, software is required to convert each incoming datum, slowing down data reduction. If an ADC is to directly drive a digital display, one must know whether the display controller expects straight binary (plus sign), offset binary, or 2's complement binary coding, or the display will be in error even if the ADC is working correctly. Finally, some displays expect decimal input. While we did not discuss ADCs that directly digitize in decimal, they do exist; decimal coding wastes coding space, since binary coding gives the highest resolution per transistor or lead.
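The offset-binary-to-2's-complement conversion mentioned above amounts to subtracting the code for zero (the midpoint of the range). A minimal sketch in Python, assuming a hypothetical 12-bit ADC whose raw codes run from 0 (full negative scale) through 2048 (zero) to 4095 (full positive scale):

```python
def offset_binary_to_twos_complement(raw: int, bits: int = 12) -> int:
    """Convert an offset-binary ADC code to a signed integer.

    In offset binary, code 0 is full negative scale and the midpoint
    code 2**(bits - 1) is zero; subtracting that midpoint recovers the
    signed value a 2's complement computer expects.
    """
    return raw - (1 << (bits - 1))


# For a 12-bit converter: 0 -> -2048, 2048 -> 0, 4095 -> +2047.
print(offset_binary_to_twos_complement(0))      # -2048
print(offset_binary_to_twos_complement(2048))   # 0
print(offset_binary_to_twos_complement(4095))   # 2047
```

In hardware the same conversion is often done for free by inverting the most significant bit, which is why the per-sample software cost matters only when the interface delivers the raw offset-binary code unchanged.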






